DeBaca, Richard C.; Sarkissian, Edwin; Madatyan, Mariyetta; Shepard, Douglas; Gluck, Scott; Apolinski, Mark; McDuffie, James; Tremblay, Dennis
TES L1B Subsystem is a computer program that performs several functions for the Tropospheric Emission Spectrometer (TES). The term "L1B" (an abbreviation of "level 1B") refers to data, specific to TES, on radiometrically calibrated spectral radiances and their corresponding noise-equivalent spectral radiances (NESRs), plus ancillary geolocation, quality, and engineering data. The functions performed by the TES L1B Subsystem include shear analysis, monitoring of signal levels, detection of ice build-up, and phase correction and radiometric and spectral calibration of TES target data. The program also computes NESRs for target spectra, writes scientific TES level-1B data to hierarchical-data-format (HDF) files for public distribution, computes brightness temperatures, and quantifies interpixel signal variability for first-order cloud and heterogeneous-land screening by the level-2 software summarized in the immediately following article. This program uses an in-house-developed algorithm, called "NUSRT," to correct instrument line-shape factors.
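The abstract mentions that the program computes brightness temperatures from the calibrated radiances. As an illustrative sketch (not the TES implementation; the function name, unit choices, and constants below are ours), a brightness temperature can be obtained by inverting the Planck function at a given wavenumber:

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def brightness_temperature(radiance, wavenumber_cm):
    """Invert the Planck function to get brightness temperature in kelvin.

    radiance:      spectral radiance per wavenumber, W / (m^2 sr m^-1)
    wavenumber_cm: wavenumber in cm^-1
    """
    nu = wavenumber_cm * 100.0            # cm^-1 -> m^-1
    c1 = 2.0 * H * C**2 * nu**3           # numerator of the Planck function
    c2 = H * C * nu / KB                  # exponent scale, K
    return c2 / math.log(1.0 + c1 / radiance)
```

Round-tripping a radiance computed from the forward Planck function at a known temperature recovers that temperature, which is a quick sanity check for this kind of routine.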
Perun, Vincent S.; Jarnot, Robert F.; Wagner, Paul A.; Cofield, Richard E., IV; Nguyen, Honghanh T.; Vuu, Christina
This software is an improvement on Version 2, which was described in "EOS MLS Level 1B Data Processing, Version 2.2," NASA Tech Briefs, Vol. 33, No. 5 (May 2009), p. 34. It accepts the EOS MLS Level 0 science/engineering data and the EOS Aura spacecraft ephemeris/attitude data, and produces calibrated instrument radiances and associated engineering and diagnostic data. This version makes the code more robust, improves calibration, provides more diagnostic outputs, defines the Galactic core more finely, and fixes the equator crossing. The Level 1 processing software manages several different tasks. It qualifies each data quantity using instrument configuration and checksum data, as well as data-transmission quality flags. Statistical tests are applied for data quality and reasonableness. The software calibrates the instrument engineering data (e.g., voltages, currents, temperatures, and encoder angles), and the filter-channel space reference measurements are interpolated onto the times of each limb measurement, with the interpolates being differenced from the measurements. Filter-channel calibration-target measurements are likewise interpolated onto the times of each limb measurement and used to compute radiometric gain. The total signal power from each digital autocorrelator spectrometer (DACS) is determined and analyzed during each data integration. The software converts each DACS data integration from an autocorrelation measurement in the time domain into a spectral measurement in the frequency domain, and separately estimates the spectrally smoothly varying and spectrally averaged components of the limb-port signal arising from antenna emission and scattering effects. Limb radiances are also calibrated.
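The interpolation-and-gain scheme described above (space reference and calibration-target measurements interpolated onto the limb-measurement times, then used to compute radiometric gain) can be sketched as a generic two-point radiometer calibration. Everything below, including the names, array shapes, and the assumption of a single known target radiance, is illustrative rather than the MLS code itself:

```python
import numpy as np

def calibrate_limb(limb_t, limb_counts, cal_t,
                   space_counts, target_counts, target_radiance):
    """Generic two-point radiometric calibration sketch.

    Space (zero-radiance) and calibration-target (known-radiance) counts,
    sampled at times cal_t, are interpolated onto the limb measurement
    times limb_t; the gain is the count difference per unit radiance.
    """
    space_i = np.interp(limb_t, cal_t, space_counts)    # cold reference at limb times
    target_i = np.interp(limb_t, cal_t, target_counts)  # hot reference at limb times
    gain = (target_i - space_i) / target_radiance       # counts per radiance unit
    return (limb_counts - space_i) / gain               # calibrated limb radiances
```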
Perun, Vincent; Jarnot, Robert; Pickett, Herbert; Cofield, Richard; Schwartz, Michael; Wagner, Paul
A computer program performs level-1B processing (the term 1B is explained below) of data from observations of the limb of the Earth by the Earth Observing System (EOS) Microwave Limb Sounder (MLS), an instrument aboard the Aura spacecraft. This software accepts, as input, the raw EOS MLS scientific and engineering data and the Aura spacecraft ephemeris and attitude data. Its output consists of calibrated instrument radiances and associated engineering and diagnostic data. [This software is one of several computer programs, denoted product generation executives (PGEs), for processing EOS MLS data. Starting from level 0 (representing the aforementioned raw data), the PGEs and their data products are denoted by alphanumeric labels (e.g., 1B and 2) that signify the successive stages of processing.] At the time of this reporting, this software is at version 2.2 and incorporates improvements over a prior version that make the code more robust, improve calibration, provide more diagnostic outputs, improve the interface with the Level 2 PGE, and effect a 15-percent reduction in file sizes through data compression.
Tøffner-Clausen, Lars; Floberghagen, Rune; Mecozzi, Riccardo; Menard, Yvon
Swarm, a three-satellite constellation to study the dynamics of the Earth's magnetic field and its interactions with the Earth system, was launched in November 2013. The objective of the Swarm mission is to provide the best-ever survey of the geomagnetic field and its temporal evolution, which will bring new insights into the Earth system by improving our understanding of the Earth's interior and environment. The Level 1b Products of the Swarm mission contain time series of the quality-screened, calibrated, corrected, and fully geolocalized measurements of the magnetic field intensity, the magnetic field vector (provided in both instrument and Earth-fixed frames), and the plasma density, temperature, and velocity. Additionally, quality-screened and pre-calibrated measurements of the non-gravitational accelerations are provided. Geolocalization is performed by 24-channel GPS receivers and by means of unique, three-head Advanced Stellar Compasses for high-precision satellite attitude information. The Swarm Level 1b data will be provided in daily products, separately for each of the three Swarm spacecraft. This poster will present detailed lists of the contents of the Swarm Level 1b Products and brief descriptions of the processing algorithms used in the generation of these data.
Scagliola, M.; Fornari, M.; Tagliani, N.; Frommknecht, B.; Bouffard, J.; Parrinello, T.
CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to monitoring precise changes in the thickness of polar ice sheets and floating sea ice over a 3-year period. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to reach a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested, and validated in order to improve the quality of the Level1b products. Towards the release of BaselineC of the CryoSat Level1b SAR/SARin products, which is expected at the end of 2014, several improvements have been identified:
• a datation bias of about -0.5195 ms will be corrected
• a range bias of about 0.6730 m will be corrected
• the range window size will be doubled with respect to BaselineB, so that the waveforms in the Level1b products will be doubled too
• improved processing for 1Hz echoes to yield sharper waveforms
• surface sample stack weighting to filter out the single-look echoes acquired at the highest look angles, which results in a sharpening of the 20Hz waveforms
• additional auxiliary information related to the mispointing angles of the instrument, as well as to the stacks of single-look echoes, will be added
This poster details the main quality improvements that are foreseen to be included in the CryoSat Level1b SAR/SARin products in BaselineC.
Scagliola, Michele; Fornari, Marco; Bouzinac, Catherine; Tagliani, Nicolas; Parrinello, Tommaso
CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to monitoring precise changes in the thickness of polar ice sheets and floating sea ice over a 3-year period. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to reach a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested, and validated in order to improve the quality of the Level1b products. Towards the release of BaselineC of the CryoSat Level1b SAR/SARin products, which is expected during 2014, several improvements have been identified:
• a datation bias of about -0.5195 ms will be corrected
• a range bias of about -0.6730 m will be corrected
• the waveform length in the Level1b product will be doubled with respect to BaselineB
• improved processing for 1Hz echoes to yield sharper waveforms
• surface sample stack weighting to filter out the single-look echoes acquired at the highest look angles, which results in a sharpening of the 20Hz waveforms
This poster details the main improvements that are foreseen to be included in the CryoSat Level1b SAR/SARin products in BaselineC.
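The datation and range biases quoted in the abstract are simple additive corrections. A minimal sketch of how such biases could be removed from a record is shown below; the variable names and the subtract-the-bias sign convention are our assumptions, not the IPF1 specification:

```python
# Bias values quoted in the abstract for BaselineC (SAR/SARin).
DATATION_BIAS_S = -0.5195e-3   # datation bias, seconds
RANGE_BIAS_M = -0.6730         # range bias, metres

def correct_record(time_s, range_m):
    """Remove the known datation and range biases from one L1b record."""
    return time_s - DATATION_BIAS_S, range_m - RANGE_BIAS_M
```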
National Aeronautics and Space Administration — Please note that the machine on which these AVHRR data are processed has reached its life expectancy and will no longer be available as of 02 June 2008 until further...
Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso
CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to reach a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested, and validated in order to improve the quality of the Level1b products. The current IPF, Baseline B, was released into operations in February 2012. A reprocessing campaign followed, in order to reprocess the data acquired since July 2010. After more than 2 years of development, the release into operations of Baseline C is expected in the first half of 2015. BaselineC Level1b products will be distributed in an updated format, including for example the attitude information (roll, pitch, and yaw) and, for SAR/SARIn, a waveform length doubled with respect to Baseline B. Moreover, various algorithm improvements have been identified:
• a datation bias of about -0.5195 ms will be corrected (SAR/SARIn)
• a range bias of about 0.6730 m will be corrected (SAR/SARIn)
• a roll bias of 0.1062 deg and a pitch bias of 0.0520 deg will be corrected
• surface sample stack weighting to filter out the single-look echoes acquired at the highest look angles, which results in a sharpening of the 20Hz waveforms
With the operational release of BaselineC, the second CryoSat reprocessing campaign will be initiated, benefiting from the upgrades implemented in the IPF1 processing chain but also at the IPF2 level. The reprocessing campaign will cover the full CryoSat mission starting on 16th July 2010.
Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso
CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to reach a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested, and validated in order to improve the quality of the Level1b products. The current IPF, Baseline C, was released into operations in April 2015, and the second CryoSat reprocessing campaign was jointly initiated, benefiting from the upgrades implemented in the IPF1 processing chain as well as from some specific configurations for the calibration corrections. In particular, the CryoSat Level1b BaselineC products generated in the framework of the second reprocessing campaign include refined information concerning the mispointing angles and the calibration corrections. This poster will thus detail the evolutions that are currently planned for the CryoSat BaselineD SAR/SARin Level1b products and the corresponding quality improvements that are expected.
Klos, R.A.; Sinclair, J.E.; Torres, C.; Mobbs, S.F.; Galson, D.A.
The probabilistic Systems Assessment Code (PSAC) User Group of the OECD Nuclear Energy Agency has organised a series of code intercomparison studies of relevance to the performance assessment of underground repositories for radioactive wastes, known collectively by the name PSACOIN. The latest of these to be undertaken is designated PSACOIN Level 1b, and the case specification provides a complete assessment model of the behaviour of radionuclides following release into the biosphere. PSACOIN Level 1b differs from other biosphere-oriented intercomparison exercises in that individual dose is the end point of the calculations, as opposed to any other intermediate quantity. The PSACOIN Level 1b case specification describes a simple source term which is used to simulate the release of activity to the biosphere from certain types of near-surface waste repository, the transport of radionuclides through the biosphere, and their eventual uptake by humankind. The biosphere sub-model comprises four compartments representing top and deep soil layers, river water, and river sediment. The transport of radionuclides between the physical compartments is described by ten transfer coefficients, and doses to humankind arise from the simultaneous consumption of water, fish, meat, milk, and grain, as well as from dust inhalation and external γ-irradiation. The parameters of the exposure-pathway sub-model are chosen to be representative of an individual living in a small agrarian community. (13 refs., 3 figs., 2 tabs.)
Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso
CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat is the first altimetry mission operating in SAR mode, and it carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. The current CryoSat IPF (Instrument Processing Facility), Baseline B, was released into operations in February 2012. After more than 2 years of development, the release into operations of Baseline C is expected in the first half of 2015. It is worth recalling here that the CryoSat SAR/SARin IPF1 generates 20Hz waveforms at an approximately equally spaced set of ground locations on the Earth's surface, i.e. surface samples, and that a surface sample gathers a collection of single-look echoes coming from the bursts processed during the time of visibility. Thus, for a given surface sample, the stack can be defined as the collection of all the single-look echoes pointing to the current surface sample, after applying all the necessary range corrections. The L1B product contains the power average of all the single-look echoes in the stack: the multi-looked L1B waveform. This reduces the data volume, while removing some information contained in the single looks that is useful for characterizing the surface and modelling the L1B waveform. To recover such information, a set of parameters has been added to the L1B product: the stack characterization, or beam behaviour parameters. The stack characterization, already included in previous Baselines, has been reviewed and expanded in Baseline C. This poster describes all the stack characterization parameters, detailing what they represent and how they have been computed. In detail, such parameters can be summarized as: - Stack
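The multi-looking step described above, in which the L1B waveform is the power average of the range-aligned single-look echoes in a stack, can be sketched as follows. The optional weights illustrate the surface-sample stack weighting mentioned in the related abstracts; all names and shapes are illustrative assumptions, not the IPF1 code:

```python
import numpy as np

def multilook(stack, weights=None):
    """Power-average the single-look echoes of a stack into one waveform.

    stack:   complex array, shape (n_looks, n_range_bins), range-aligned
    weights: optional per-look weights, e.g. to down-weight echoes
             acquired at the highest look angles
    """
    power = np.abs(stack) ** 2                         # per-look power waveforms
    if weights is None:
        return power.mean(axis=0)                      # plain multi-look average
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * power).sum(axis=0) / w.sum()  # weighted average
```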
Kosarev, Yu G; Gusev, V D
Work is presented on systems for the automation of editing and publishing operations by methods of processing symbolic information and information contained in a training sample (ranking of objectives by promise, a classification algorithm for tones and noise). The book will be of interest to specialists in the automation of textual-information processing, programming, and pattern recognition.
John Sweller; Susan Sweller
Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...
Leuchs, Gerd; Beth, Thomas
[Table-of-contents excerpt] 1.5 Simulation of Hamiltonians ... References ... 2 Quantum Information Processing and Error Correction with Jump Codes (G. Alber, M. Mussinger ...)
Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)]
I will briefly explain the definition and advantages of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets for both qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.
The paper systematizes several theoretical viewpoints on the skill of scientific information processing. It decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction, and document analysis, were used to build up a theoretical framework. Interviews, a survey of professionals being trained, and a case study were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.
Quantum processing and communication is emerging as a challenging technique at the beginning of the new millennium. This is an up-to-date insight into the current research on quantum superposition, entanglement, and the quantum measurement process - the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions - written by leading experts - bring together the latest results in quantum information as well as addressing all the relevant questions.
Briggs, Andrew; Ferry, David; Stoneham, Marshall
Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.
D'Ariano, Giacomo Mauro
I review some recent advances in foundational research at the Pavia QUIT group. The general idea is that there is only Quantum Theory, without quantization rules, and the whole of Physics - including space-time and relativity - is emergent from quantum-information processing. And since Quantum Theory itself is axiomatized solely on informational principles, the whole of Physics must be reformulated in information-theoretical terms: this is the "It from bit" of J. A. Wheeler. The review is divided into four parts: a) the informational axiomatization of Quantum Theory; b) how space-time and relativistic covariance emerge from quantum computation; c) the information-theoretical meaning of inertial mass and of ℏ (= h/2π), and how the quantum field emerges; d) an observational consequence of the new quantum field theory: a mass-dependent refraction index of the vacuum. I will conclude with the research lines that will follow in the immediate future.
Hartman, Carol R.; Burgess, Ann W.
This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)
Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.
Dietel, Harvey M
An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers for the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte
Tahir Shah, K.
There are an estimated 100,000 genes in the human genome, of which 97% is non-coding. On the other hand, bacteria have little or no non-coding DNA. The non-coding region includes introns, ALU sequences, satellite DNA, and other segments not expressed as proteins. Why does it exist? Why has nature kept non-coding DNA throughout the long evolutionary period if it has no role in the development of complex life forms? Is the complexity of a species somehow correlated with the existence of apparently useless sequences? What kind of capability is encoded within such nucleotide sequences that is a necessary, but not a sufficient, condition for the evolution of complex life forms, keeping in mind the C-value paradox and the omnipresence of non-coding segments in higher eukaryotes and also in many archaea and prokaryotes? The physico-chemical description of biological processes is hardware oriented and does not highlight the algorithmic, or information-processing, aspect. However, an algorithm without its hardware implementation is as useless as hardware without the capability to run an algorithm. The nature and type of computation an information-processing hardware can perform depend only on its algorithm and the architecture that reflects the algorithm. Given that enormously difficult tasks such as high-fidelity replication, transcription, editing, and regulation are all achieved within a long linear sequence, it is natural to think that some parts of a genome are involved in these tasks. If some complex algorithms are encoded within these parts, then it is natural to think that non-coding regions contain information-processing algorithms. A comparison between well-known automatic sequences and sequences constructed out of motifs found in all species proves the point: non-coding regions are a sort of ''hardwired'' program, i.e., they are linear representations of information-processing machines. Thus in our model, a non-coding region, e.g., an intron, contains a program (or equivalently, it is
The advantage of the photon's mobility makes optical quantum systems ideally suited for delegated quantum computation. I will present results on the realization of a measurement-based quantum network in a client-server environment, where quantum information is securely communicated and computed. Related to measurement-based quantum computing, I will discuss a recent experiment showing that quantum discord can be used as a resource for remote state preparation, which might shed new light on the requirements for quantum-enhanced information processing. Finally, I will briefly review recent photonic quantum-simulation experiments of four spins with frustrated Heisenberg interactions and present an outlook on feasible simulation experiments with more complex interactions or random-walk structures. As an outlook, I will discuss the current status of new quantum technology for improving the scalability of photonic quantum systems by using superconducting single-photon detectors and tailored light-matter interactions. (author)
Lala, J. H.
Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.
Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.
Garnier, Anne; Trémas, Thierry; Pelon, Jacques; Lee, Kam-Pui; Nobileau, Delphine; Gross-Colzy, Lydwine; Pascal, Nicolas; Ferrage, Pascale; Scott, Noëlle A.
Version 2 of the Level 1b calibrated radiances of the Imaging Infrared Radiometer (IIR) on board the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) satellite has been released recently. This new version incorporates corrections of small but systematic seasonal calibration biases previously revealed in Version 1 data products, mostly north of 30° N. These biases - of different amplitudes in the three IIR channels 8.65 µm (IIR1), 10.6 µm (IIR2), and 12.05 µm (IIR3) - were made apparent by a striping effect in images of IIR inter-channel brightness temperature differences (BTDs) and through seasonal warm biases of nighttime IIR brightness temperatures in the 30-60° N latitude range. The latter were highlighted through observed and simulated comparisons with similar channels of the Moderate Resolution Imaging Spectroradiometer (MODIS) on board the Aqua spacecraft. To characterize the calibration biases affecting Version 1 data, a semi-empirical approach is developed, which is based on the in-depth analysis of the IIR internal calibration procedure in conjunction with observations such as statistical comparisons with similar MODIS/Aqua channels. Two types of calibration biases are revealed: an equalization bias affecting part of the individual IIR images and a global bias affecting the radiometric level of each image. These biases are observed only when the temperature of the instrument increases, and they are found to be functions of elapsed time since night-to-day transition, regardless of the season. Correction coefficients of Version 1 radiances could thus be defined and implemented in the Version 2 code. As a result, the striping effect seen in Version 1 is significantly attenuated in Version 2. Systematic discrepancies between nighttime and daytime IIR-MODIS BTDs in the 30-60° N latitude range in summer are reduced from 0.2 K in Version 1 to 0.1 K in Version 2 for IIR1-MODIS29. For IIR2-MODIS31 and IIR3-MODIS32, they are reduced from 0.4 K
Stewart, L. J.
The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.
National Aeronautics and Space Administration — The Level-1B (L1B) Radiance Product OML1BRVZ (Version-3) from the Aura-OMI is now available (http://disc.gsfc.nasa.gov/Aura/OMI/oml1brvz_v003.shtml) to public from...
National Aeronautics and Space Administration — The Level-1B (L1B) Radiance Product OML1BRUG (Version-3) from the Aura-OMI is now available to public (http://disc.gsfc.nasa.gov/Aura/OMI/oml1brug_v003.shtml) from...
National Aeronautics and Space Administration — The Level-1B (L1B) Radiance Product OML1BRUZ (Version-3) from the Aura-OMI is now available (http://disc.gsfc.nasa.gov/Aura/OMI/oml1bruz_v003.shtml) to public from...
business form in which information is entered by filling in blanks, or circling alternatives. The fields of the form correspond to the various pieces... power. Parallelism, rather than raw speed of the computing elements, seems to be the way that the brain gets such jobs done... all intelligent systems. The purpose of this paper is to characterize the weak methods and to explain how and why they arise in
This paper demonstrates how cross-functional business processes may be aligned with product-specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order...
applications in optical disk memory systems [9]. This device is constructed in a glass/SiO2/Si waveguide. The choice of a Si substrate allows for the... contact mask) were formed in the photoresist deposited on all of the samples; we covered the unwanted gratings on each sample with cover glass slides... processing, let us consider TeO2 (v = 620 m/s) as a potential substrate for applications requiring large time delays. This consideration is despite
The reasons for the current widespread arguments between designers of advanced technological systems, such as nuclear power plants, and opponents from the general public concerning levels of acceptable risk may be found in incompatible definitions of risk, in differences in risk perception and criteria for acceptance, etc. Also important, however, may be the difficulties met in presenting the basis for risk analysis, such as the conceptual system models applied, in an explicit and credible form. Application of modern information technology to the design of control systems and human-machine interfaces, together with the trend towards large centralised industrial installations, has made it increasingly difficult to establish an acceptable model framework, in particular considering the role of human errors in major system failures and accidents. Different aspects of this problem are discussed in the paper, and areas are identified where research is needed in order to improve not only the safety of advanced systems, but also the basis for their acceptance by the general public. (author)
Schmidt, Erik Meineche
BRICS is a research centre and international PhD school in theoretical computer science, based at the University of Aarhus, Denmark. The centre has recently become engaged in quantum information processing in cooperation with the Department of Physics, also University of Aarhus. This extended abstract surveys activities at BRICS with special emphasis on the activities in quantum information processing.
Bartosz Mackowiak; Mirko Wiederholt
Decision-makers often face limited liability and thus know that their loss will be bounded. We study how limited liability affects the behavior of an agent who chooses how much information to acquire and process in order to make a good decision. We find that an agent facing limited liability processes less information than an agent with unlimited liability. The informational gap between the two agents is larger in bad times than in good times, and when information is more costly to process.
Mahoney, John R; Ellison, Christopher J; Crutchfield, James P [Complexity Sciences Center and Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616 (United States)]
We give a systematic expansion of the crypticity, a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite cryptic order, in which the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy, the mutual information between a process's infinite past and infinite future, that is finite and exact for finite-order cryptic processes. (fast track communication)
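For a first-order Markov process, the excess entropy mentioned in this abstract reduces to a simple closed form: the block entropy of a single symbol minus the entropy rate, E = H(1) - h (more generally, E = H(R) - R*h for an order-R Markov process). The sketch below illustrates that identity; the function names are our own and the example chain is illustrative, not drawn from the paper.

```python
import numpy as np

def stationary(T):
    # Stationary distribution: left eigenvector of T (rows of T sum to 1)
    vals, vecs = np.linalg.eig(T.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

def entropy(p):
    # Shannon entropy in bits, ignoring zero-probability entries
    p = np.asarray(p)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def excess_entropy_markov(T):
    """E = H(1) - h for a first-order Markov chain: single-symbol
    block entropy minus the entropy rate h = sum_i pi_i H(T_i)."""
    pi = stationary(T)
    H1 = entropy(pi)
    h = np.sum(pi * np.array([entropy(row) for row in T]))
    return H1 - h

# Symmetric binary chain with flip probability 0.1:
# H(1) = 1 bit, h = H(0.1) ≈ 0.469, so E ≈ 0.531 bits
T = np.array([[0.9, 0.1], [0.1, 0.9]])
print(excess_entropy_markov(T))  # ≈ 0.531
```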
Lutters, Diederick; Wijnker, T.C.; Kals, H.J.J.
A recently proposed reference model indicates the use of structured information as the basis for the control of design and manufacturing processes. The model is used as a basis to describe the integration of design and process planning. A differentiation is made between macro- and micro process
Piccinini, Gualtiero; Scarantino, Andrea
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.
Arnfred, Sidse M H
…of the left somatosensory cortex, and it was suggested to be in accordance with two theories of schizophrenic information processing: the theory of deficiency of corollary discharge and the theory of weakening of the influence of past regularities. No gating deficiency was observed and the imprecision… Rado (1890-1972) suggested that one of two irreducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event-related potentials. Event-related EEG can be analyzed as conventional time… and amplitude attenuation was not a general phenomenon across the entire brain response. Summing up, in support of Rado's hypothesis, schizophrenia spectrum patients demonstrated abnormalities in proprioceptive information processing. Future work needs to extend the findings in larger un-medicated, non…
The ultimate goal of the classicality program is to quantify the amount of quantumness of certain processes. Here, classicality is studied for a restricted type of process: quantum information processing (QIP). Under special conditions, one can force some qubits of a quantum computer into a classical state without affecting the outcome of the computation. The minimal set of conditions is described and its structure is studied. Some implications of this formalism are the increase of noise robustness, a proof of the quantumness of mixed state quantum computing, and a step forward in understanding the very foundation of QIP
Klos, R.A.; Sinclair, J.E.; Torres, C.; Bergstroem, U.; Galson, D.A.
This report focuses in greater detail on the biosphere modelling aspects of the assessment of the radiological impact of the disposal of radioactive waste. Seven exposure pathways are modelled: drinking water, freshwater fish, meat, milk, and grain consumption, as well as external gamma irradiation and contaminated dust inhalation. The accumulation in the upper soil of radionuclides released in groundwaters is also modelled. The objectives of this Level 1b exercise can be summarized as follows: (1) to gain experience in the application of probabilistic systems assessment methodology to transport and radiological exposure sub-models for the biosphere, and hence to methods of estimating the total risk to individuals or groups of individuals; (2) to contribute to the verification of biosphere transport and exposure sub-models; (3) to investigate the effects of parameter uncertainty in the biosphere transport and exposure sub-models on the estimate of mean dose to individuals exposed via several exposure pathways.
Barato, Andre C; Hartich, David; Seifert, Udo
We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial. (paper)
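The "learning" in this abstract is a rate of reduction in conditional Shannon entropy. As a minimal static analogue (deliberately simpler than the paper's dynamical learning rate), one can compute how much a noisy binary sensor reduces uncertainty about a two-valued external ligand concentration: the mutual information I(X;Y) = H(X) - H(X|Y) of a symmetric binary channel. The function names and parameter values below are illustrative assumptions.

```python
import math

def H2(p):
    """Binary Shannon entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def learning(prior=0.5, error=0.2):
    """Entropy reduction H(X) - H(X|Y) = I(X;Y) for a two-valued
    external signal X read through a sensor Y that misreports X
    with symmetric probability `error`."""
    # Marginal probability that the sensor reads "high"
    py1 = prior * (1 - error) + (1 - prior) * error
    # For a symmetric channel, I(X;Y) = H(Y) - H(Y|X) = H(py1) - H2(error)
    return H2(py1) - H2(error)

print(learning())  # ≈ 0.278 bits learned per observation
```

With a completely unreliable sensor (error = 0.5) the learning drops to zero, matching the intuition that a dissipationless, uncoupled readout learns nothing.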
Wickens, Christopher D.; Flach, John M.
Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).
Seelen, Werner v
In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of Neural Information Processing as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely to organize behavior so as to ensure survival, and an understanding of the evolutionary genesis of the brain. Among the principles and strategies developed are the self-organization of neural systems, flexibility, the active interpretation of the world by means of construction and prediction, and the embedding of neural systems in the world, all of which form the framework of the presented description. Since, in brains, partial self-organization, lifelong adaptation, and the use of various methods of processing incoming information are all interconnected, the authors have chosen not only neurobiology and evolution theory as a basis for the elaboration of such a framework, but also syst...
Pitts, Felix L.
Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base that will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10E-9 at 10 hours. A background and description are given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.
Reina Estupinan, John-Henry
Since information has been regarded as a physical entity, the field of quantum information theory has blossomed. This brings novel applications, such as quantum computation. The field has attracted the attention of numerous researchers with backgrounds ranging from computer science, mathematics, and engineering to the physical sciences. Thus, we now have an interdisciplinary field where great efforts are being made to build devices that allow for the processing of information at the quantum level, and to understand the complex structure of some physical processes at a more basic level. This thesis is devoted to the theoretical study of structures at the nanometer scale, 'nanostructures', through physical processes that mainly involve the solid state and quantum optics, in order to propose reliable schemes for the processing of quantum information. Initially, the main results of quantum information theory and quantum computation are briefly reviewed. Next, the state of the art of quantum dot technology is described. In so doing, the theoretical background and the practicalities required for this thesis are introduced. A discussion of the current quantum hardware used for quantum information processing is given, with emphasis on the solid-state proposals to date. A detailed prescription is given, using an optically driven coupled quantum dot system, for reliably preparing and manipulating exciton maximally entangled Bell and Greenberger-Horne-Zeilinger (GHZ) states. Manipulation of the strength and duration of the selective light pulses needed for producing these highly entangled states provides crucial elements for the processing of solid-state-based quantum information. The all-optical generation of states of the so-called Bell basis for a system of two quantum dots (QDs) is exploited for performing quantum teleportation of the excitonic state of a dot in an array of three coupled QDs. Theoretical predictions suggest
Urquhart, Christine; Tbaishat, Dina; Yeoman, Alison
This book adopts a holistic interpretation of information architecture, to offer a variety of methods, tools, and techniques that may be used when designing websites and information systems that support workflows and what people require when 'managing information'.
Quantum Information Processing (QIP) is expected to bring revolutionary enhancements to various technological areas. However, today's QIP applications are far from practical. The problem involves both hardware issues, i.e., quantum devices are imperfect, and software issues, i.e., the functionality of some QIP applications is not fully understood. Aiming to improve the practicality of QIP, in my PhD research I have studied various topics in quantum cryptography and ion trap quantum computation. In quantum cryptography, I first studied the security of position-based quantum cryptography (PBQC). I discovered a wrong assumption in the previous literature, namely that the cheaters are not allowed to share entangled resources. I proposed entanglement attacks that could cheat all known PBQC protocols. I also studied the practicality of continuous-variable (CV) quantum secret sharing (QSS). While the security of CV QSS was considered in the literature only in the limit of infinite squeezing, I found that finitely squeezed CV resources could also provide a finite secret sharing rate. Our work relaxes the stringent resource requirements of implementing QSS. In ion trap quantum computation, I studied the phase error of quantum information induced by the dc Stark effect during ion transportation. I found an optimized ion trajectory for which the phase error is minimal. I also defined a threshold speed, above which ion transportation would induce significant error. In addition, I proposed a new application of ion trap systems as universal bosonic simulators (UBS). I introduced two architectures and discussed their respective strengths and weaknesses. I illustrated the implementation of bosonic state initialization, transformation, and measurement by applying radiation fields or by varying the trap potential. Compared with optical experiments, the ion trap UBS offers higher state-initialization efficiency and higher measurement accuracy. Finally, I
Maggini, Marco; Jain, Lakhmi
This handbook presents some of the most recent topics in neural information processing, covering both theoretical concepts and practical applications. The contributions include: Deep architectures Recurrent, recursive, and graph neural networks Cellular neural networks Bayesian networks Approximation capabilities of neural networks Semi-supervised learning Statistical relational learning Kernel methods for structured data Multiple classifier systems Self organisation and modal learning Applications to ...
Timucin, Dogan Aslan
Low computational accuracy is an important obstacle for optical processors, blocking their way to becoming a practical reality and a serious challenger to classical computing paradigms. This research presents a comprehensive solution approach to the problem of accuracy enhancement in discrete analog optical information processing systems. Statistical analysis of a generic three-plane optical processor is carried out first, taking into account the effects of diffraction, interchannel crosstalk, and background radiation. Noise sources included in the analysis are photon, excitation, and emission fluctuations in the source array; transmission and polarization fluctuations in the modulator; and photoelectron, gain, dark, shot, and thermal noise in the detector array. Means, mutual coherence functions, and probability density functions are derived for both optical and electrical output signals. Next, statistical models for a number of popular optoelectronic devices are studied. Specific devices considered here are light-emitting and laser diode sources; an ideal noiseless modulator and a Gaussian random-amplitude-transmittance modulator; p-i-n and avalanche photodiode detectors followed by electronic postprocessing; and ideal free-space geometrical-optics propagation and single-lens imaging systems. Output signal statistics are determined for various interesting device combinations by inserting these models into the general formalism. Finally, based on these special-case output statistics, results on accuracy limitations and enhancement in optical processors are presented. Here, starting with the formulation of the accuracy enhancement problem (1) as an optimal detection problem and (2) as a parameter estimation problem, the potential accuracy improvements achievable via classical multiple-hypothesis-testing and maximum-likelihood and Bayesian parameter estimation methods are demonstrated. Merits of using proper normalizing transforms which can potentially stabilize
Andersen, Ulrik Lund; Leuchs, G.; Silberhorn, C.
…the continuous degree of freedom of a quantum system for encoding, processing, or detecting information, one enters the field of continuous-variable (CV) quantum information processing. In this paper we review the basic principles of CV quantum information processing, with main focus on recent developments in the field. We address the three main stages of a quantum information system: the preparation stage, where quantum information is encoded into CVs of coherent states and single-photon states; the processing stage, where CV information is manipulated to carry out a specified protocol; and a detection stage, where CV information is measured using homodyne detection or photon counting.
Infochemistry: Information Processing at the Nanoscale, defines a new field of science, and describes the processes, systems and devices at the interface between chemistry and information sciences. The book is devoted to the application of molecular species and nanostructures to advanced information processing. It includes the design and synthesis of suitable materials and nanostructures, their characterization, and finally applications of molecular species and nanostructures for information storage and processing purposes. Divided into twelve chapters; the first three chapters serve as an int
Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.
The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…
Hölscher, Christian; Munk, Matthias
… simultaneously recorded spike trains (Mark Laubach, Nandakumar S. Narayanan, and Eyal Y. Kimchi); Part III: Neuronal population information coding and plasticity in specific brain areas …
Kirby, John R.; Das, J. P.
The simultaneous and successive processing model of cognitive abilities was compared to a traditional primary mental abilities model. Simultaneous processing was found to be primarily related to spatial ability; and to a lesser extent, to memory and inductive reasoning. Subjects were 104 fourth-grade urban males. (Author/GD C)
Harris, R. L., Sr.
The scanning behavior of pilots must be understood so that cockpit displays can be assembled that provide the most information accurately and quickly to the pilot. The results of seven years of collecting and analyzing pilot scanning data are summarized. The data indicate that pilot scanning behavior (1) is subconscious; (2) is situation dependent; and (3) can be disrupted if pilots are forced to make conscious decisions. Testing techniques and scanning analysis techniques have been developed that are sensitive to pilot workload.
Calfee, Robert C.
"This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)
Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin
The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…
Hart, Eric W.
The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…
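The "efficiency (data compression)" theme above lends itself to a classroom-sized illustration. The run-length encoder/decoder below is our own toy example, not drawn from the article; it shows the core compression idea of replacing repetition with counts.

```python
def rle_encode(s: str) -> list[tuple[str, int]]:
    """Run-length encoding: collapse repeated symbols into (symbol, count) pairs."""
    runs: list[tuple[str, int]] = []
    for ch in s:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((ch, 1))              # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Inverse transform: expand each (symbol, count) pair."""
    return "".join(ch * n for ch, n in runs)

data = "AAAABBBCCD"
encoded = rle_encode(data)
print(encoded)  # [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
assert rle_decode(encoded) == data  # lossless round trip
```

The example also touches the article's "accuracy" theme: the decode step verifies that no information is lost.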
Horton, Rebecca; Carroll, Malcolm S.; Tarman, Thomas David
Qubits have been demonstrated using GaAs double quantum dots (DQDs). The qubit basis states are the (1) singlet and (2) triplet stationary states. Long spin decoherence times in silicon spur translation of the GaAs qubit into silicon. In the near term the goals are: (1) develop surface-gate enhancement-mode double quantum dots (MOS and strained-Si/SiGe) to demonstrate few electrons and spin read-out, and examine impurity-doped quantum dots as an alternative architecture; (2) use mobility, C-V, ESR, quantum dot performance, and modeling as feedback to improve processing, including development of atomic-precision fabrication at SNL; (3) examine integrated-electronics approaches to the RF-SET; (4) use combinations of numerical packages for multi-scale simulation of quantum dot systems (NEMO3D, EMT, TCAD, SPICE); and (5) continue micro-architecture evaluation for different device and transport architectures.
Jaeschke, A.; Keller, H.; Orth, H.
On the production management level, a process information system in a nuclear reprocessing plant (NRP) has to fulfill conventional operating functions and functions for nuclear material surveillance (safeguards). Based on today's state of the art of on-line process control technology, progress in hardware and software makes it possible to introduce more process-specific intelligence into process information systems. Using the example of an expert-system-aided laboratory management system as a component of an NRP process information system, the paper demonstrates that these technologies can already be applied. (DG)
Coenen, A.M.L.; Drinkenburg, W.H.I.M.
Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced during sleep compared to wakefulness. The process controlling this reduction is called `sensory' gating and evidence exists that the underlying neurophysiological processes take
This study examined the strategies commonly adopted by Osun state secondary school students in processing career information. It specifically examined the sources of career information available to the students, the uses to which the students put the information collected and how their career decision making skills can be ...
Aberer, Karl; Wombacher, Andreas
Automating information commerce requires languages to represent the typical information commerce processes. Existing languages and standards either cover only very specific types of business models or are too general to capture in a concise way the specific properties of information commerce
Leemans, S.J.J.; Fahland, D.; Van Der Aalst, W.M.P.; Reichert, M.; Reijers, H.A.
Understanding the performance of business processes is an important part of any business process intelligence project. From historical information recorded in event logs, performance can be measured and visualized on a discovered process model. Thereby the accuracy of the measured performance, e.g.,
DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.
Rieffel, Eleanor G.
This survey, aimed at information processing researchers, highlights intriguing but lesser known results, corrects misconceptions, and suggests research areas. Themes include: certainty in quantum algorithms; the "fewer worlds" theory of quantum mechanics; quantum learning; probability theory versus quantum mechanics.
…qubits, the 2^n energy levels of the spin system can be treated as an n-qubit system. … Quantum information processing; qubit; nuclear magnetic resonance quantum computing. … The equilibrium spectrum has theoretical intensities in the ra…
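The point that n coupled spin-1/2 nuclei span a 2^n-dimensional state space can be made concrete with Kronecker products. The sketch below builds an illustrative (not NMR-accurate) Zeeman-type diagonal Hamiltonian; the frequencies and function name are our own assumptions.

```python
import numpy as np

# Single-spin-1/2 (single-qubit) operators
I2 = np.eye(2)
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def zeeman_hamiltonian(omegas):
    """Illustrative n-spin Hamiltonian H = sum_k (omega_k / 2) Z_k.
    Each term acts as Z on spin k and as identity elsewhere, so the
    full operator lives in the 2^n-dimensional tensor-product space."""
    n = len(omegas)
    H = np.zeros((2**n, 2**n))
    for k, w in enumerate(omegas):
        op = np.array([[1.0]])
        for j in range(n):
            op = np.kron(op, Z if j == k else I2)
        H += 0.5 * w * op
    return H

H = zeeman_hamiltonian([1.0, 0.8, 0.5])  # three spins
print(H.shape)  # (8, 8): 2^3 = 8 energy levels, an effective 3-qubit system
```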
Crowe, Sarah; Tully, Mary P; Cantrill, Judith A
The need for effective communication and handling of secondary care information in general practices is paramount. To explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of practice staff both clinical and administrative. Qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using N-Vivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.
Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach to establishing, implementing, operating, monitoring, reviewing, maintaining, and improving an organization's information security. Key elements of the operation of an ISMS are ISMS processes. However, in spite of its importance, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analysis of information security investments regarding single measures protecting information and ISMS processes is not in the focus of current research, which is mostly concerned with economics. This article aims to fill this research gap by proposing such an ISMS process framework as its main contribution, based on a set of agreed-upon ISMS processes in existing standards such as the ISO 27000 series, COBIT, and ITIL. Within the framework, the identified processes are described and their interactions and interfaces are specified. The framework helps to focus on the operation of the ISMS instead of on measures and controls. As a main finding, it thereby strengthens the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS.
The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework considers three models: Henry Mintzberg's Theory of Managerial Roles, the Theory of Information Processing, and John Exner's process model of the response to the Rorschach. The participants were evaluated with the Rorschach method. The results show that these managers are able to collect data, evaluate them, and establish rankings properly. At the same time, they are capable of being objective and accurate in assessing problems. This information processing style permits an interpretation of the surrounding world on the basis of a very personal and characteristic processing manner or cognitive style.
Tan, Wee-Kek; Tan, Chuan-Hoo
Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…
Song, X.M.; Zang, F.; Bij, van der J.D.; Weggeman, M.C.D.P.
Despite the obvious linkage between information technologies (IT) and knowledge processes, and the apparent strategic importance of both, little research has been done to explicitly examine how, if at all, IT and knowledge processes affect firm outcomes. The purpose of this study is to bridge this…
Full Text Available The main laws of technical thermodynamics are universal and can be applied to processes other than thermodynamic ones. The article presents a comparison of the peculiarities of irreversible informational and thermodynamic processes and introduces the new term "infopy". A more precise definition of "infopy" as an energetic charge is also given.
Quantum information processors exploit the quantum features of superposition and entanglement for applications not possible in classical devices, offering the potential for significant improvements in the communication and processing of information. Experimental realization of large-scale quantum information processors remains a long-term vision, as the required nearly pure quantum behaviour is observed only in exotic hardware such as individual laser-cooled atoms and isolated photons. But recent theoretical and experimental advances suggest that cold atoms and individual photons may lead the way towards bigger and better quantum information processors, effectively building mesoscopic versions of Schrödinger's cat from the bottom up. (author)
This book treats the mathematics of many important areas in digital information processing, covering five topics in a unified presentation: data compression, cryptography, sampling (signal theory), error control codes, and data reduction. It is useful for teachers, students, and practitioners in electronic engineering, computer science, and mathematics.
Mikhail O. Kolbanev
Full Text Available Subject of study. The paper describes basic information technologies for automating the information processes of data storage, distribution, and processing in terms of the required physical resources. It is shown that studying these processes only through such traditional concerns of modern computer science as the ability to transfer knowledge, degree of automation, information security, coding, reliability, and others is not enough. The reasons are, on the one hand, the increase in the volume and intensity of information exchange in human activity and, on the other hand, the approach to the efficiency limit of information systems based on semiconductor technologies. The creation of technologies which not only support information interaction but also consume a rational amount of physical resources has become a pressing problem of modern engineering development. Thus, basic information technologies for the storage, distribution, and processing of information supporting interaction between people are the object of study, and the physical temporal, spatial, and energy resources required to implement these technologies are the subject of study. Approaches. An attempt is made to enlarge the possibilities of the traditional cybernetics methodology, which replaces consideration of the material component of information with a search over the states of information objects, by explicitly taking into account the amount of physical resources required for changes in the states of information media. Purpose of study. The paper develops a common approach to the comparison and subsequent selection of basic information technologies for the storage, distribution, and processing of data, taking into account not only the requirements for the quality of information exchange in a particular subject area and the degree of technology application, but also the amounts of physical resources consumed. Main findings. Classification of resources
Mandic, D.; Barbic, B.; Linke, B.; Colak, I.
The original NEK design used several Process Computer Systems (PCS) for both process control and process supervision. The PCS were built by different manufacturers around different hardware and software platforms. Operational experience and new regulatory requirements imposed new technical and functional requirements on the PCS, such as: acquisition of new signals from the technological processes and environment; implementation of new application programs; significant improvement of the MMI (Man-Machine Interface); process data transfer to locations other than the Main Control Room (MCR); and process data archiving with the capability to retrieve the same data for future analysis. These requirements could not be implemented within the old systems. In order to satisfy them, NEK decided to build a new Process Information System (PIS). During the design and construction of PIS Project Phase I, in addition to the main foreign contractor, there was significant participation by local architect-engineering and construction companies. This paper presents the experience of NEK and its local partners. (author)
Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato
Irreversible information processing cannot be carried out without some inevitable thermodynamic work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as energy dissipation increasingly limits the performance of computing devices. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula precisely takes into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
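A hedged sketch of the bound described in this abstract (the paper's exact single-shot statement uses smooth conditional max-entropies; the plain conditional Shannon entropy below is only the asymptotic, many-copy form):

```latex
W \;\ge\; k_B T \ln 2 \,\cdot\, H(E \mid O)
```

Here E denotes the discarded information, O the output of the computation, and H(E|O) is measured in bits. Erasing one uniformly random bit while retaining no correlated output gives H(E|O) = 1 and recovers the familiar Landauer cost of k_B T ln 2 per bit.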
In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three main basic cognitive processes contribute to realize this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the "single voice" of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever is focused by attention appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experiences, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention.
Since the terms Data Warehouse and On-Line Analytical Processing were proposed by Inmon and by Codd, Codd, and Salley, respectively, the traditional ideas of creating information systems in support of management's decisions have again become interesting in theory and practice. Today information warehousing is a strategic market for any database systems vendor. Nevertheless, the theoretical discussions of this topic go back to the early years of the 20th century as far as management science and accounting the…
Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.
Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle costs of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors, including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information that indicates both the diagnostics of current structural integrity and the prognostics necessary for planning and managing the future health of the structure in a cost-effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
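As a minimal sketch of the signal-processing step this abstract describes (hypothetical sensor data and detection threshold, not the Boeing implementation), a spectral feature such as a shift in a structure's dominant resonance can be extracted with a Fourier transform:

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (Hz) with the largest spectral magnitude."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

fs = 1000.0                      # samples per second
t = np.arange(0, 1.0, 1.0 / fs)  # one second of sensor data
rng = np.random.default_rng(0)

# Hypothetical scenario: the healthy structure rings at 50 Hz;
# damage lowers the resonance to 45 Hz. Noise models sensor jitter.
baseline = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.standard_normal(t.size)
damaged = np.sin(2 * np.pi * 45 * t) + 0.3 * rng.standard_normal(t.size)

f0 = dominant_frequency(baseline, fs)
f1 = dominant_frequency(damaged, fs)
damage_flag = abs(f1 - f0) > 2.0  # simple detection threshold in Hz
```

In a real SHM system this feature-extraction stage would feed the statistical detection and decision-making layers the paper surveys.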
Quantification of information leakage is a successful approach for evaluating the security of a system. It models the system to be analyzed as a channel, with the secret as the input and an output observable by the attacker as the output, and applies information theory to quantify the amount of information transmitted through such a channel, thus effectively quantifying how many bits of the secret can be inferred by the attacker by analyzing the system's output. Channels are usually encoded as matrices of conditional probabilities, known as channel matrices. Such matrices grow exponentially… We show how to model … and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool, which automates such analysis and is able to compute the information leakage of programs in an imperative WHILE language. Finally, we show how to use QUAIL to analyze some…
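As a minimal, hand-worked illustration of the channel-matrix view (a hypothetical example, not QUAIL's actual algorithm): a program that reveals the parity of a 2-bit secret has a 4x2 channel matrix, and its leakage is the mutual information between a uniform secret and the observable output:

```python
import math

# Channel matrix: rows = secret values (0..3), columns = observable outputs.
# The program leaks the parity of the secret, so output = secret mod 2.
channel = [
    [1.0, 0.0],  # secret 0 -> output 0
    [0.0, 1.0],  # secret 1 -> output 1
    [1.0, 0.0],  # secret 2 -> output 0
    [0.0, 1.0],  # secret 3 -> output 1
]
prior = [0.25] * 4  # uniform prior over the 2-bit secret

# Output distribution: p(o) = sum_s p(s) * p(o|s)
p_out = [sum(prior[s] * channel[s][o] for s in range(4)) for o in range(2)]

# Leakage as mutual information: I(S;O) = sum p(s) p(o|s) log2(p(o|s)/p(o))
leakage = sum(
    prior[s] * channel[s][o] * math.log2(channel[s][o] / p_out[o])
    for s in range(4) for o in range(2) if channel[s][o] > 0
)
# leakage == 1.0: exactly one of the two secret bits is inferable
```

The exponential growth mentioned in the abstract is visible even here: a k-bit secret needs 2^k rows, which is what motivates the Markovian modeling the paper proposes.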
A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrating cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
Fagg, G.E.; Moore, K. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.]; Geist, A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.]
SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.
National Aeronautics and Space Administration — MERIS is a programmable, medium-spectral resolution, imaging spectrometer operating in the solar reflective spectral range. Fifteen spectral bands can be selected by...
Full Text Available Abstract: We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection, once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out that which is the case from among the alternatives that could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. By following this new approach, a mathematical treatment of some poignant biomathematical problems is allowed. Also, the concepts presented in this treatise may well have relevance and applications within the fields of information processing and the theory of language.
Leach, Janice; Torres, Teresa M.
The Springfield Processing Plant is a hypothetical facility. It has been constructed for use in training workshops. Information is provided about the facility and its surroundings, particularly security-related aspects such as target identification, threat data, entry control, and response force data.
Nijstad, Bernard A.; Oltmanns, Jan
Group decision making has attracted much scientific interest, but few studies have investigated group decisions that do not get made. Based on the Motivated Information Processing in Groups model, this study analysed the effect of epistemic motivation (low vs. high) and social motivation (proself
Cells receive signaling molecules by receptors and relay information via sensory networks so that they can respond properly depending on the type of signal. Recent studies have shown that cells can extract multidimensional information from dynamical concentration patterns of signaling molecules. We herein study how biochemical systems can process multidimensional information embedded in dynamical patterns. We model the decoding networks by linear response functions, and optimize the functions with the calculus of variations to maximize the mutual information between patterns and output. We find that, when the noise intensity is lower, decoders with different linear response functions, i.e., distinct decoders, can extract much information. However, when the noise intensity is higher, distinct decoders do not provide the maximum amount of information. This indicates that, when transmitting information by dynamical patterns, embedding information in multiple patterns is not optimal when the noise intensity is very large. Furthermore, we explore the biochemical implementations of these decoders using control theory and demonstrate that these decoders can be implemented biochemically through the modification of cascade-type networks, which are prevalent in actual signaling pathways.
Fore, Stephanie; Palumbo, Fabrizio; Pelgrims, Robbrecht; Yaksi, Emre
The habenula is a brain region that has gained increasing popularity over recent years due to its role in processing value-related and experience-dependent information, with a strong link to depression, addiction, sleep and social interactions. This small diencephalic nucleus is proposed to act as a multimodal hub or switchboard, where inputs from different brain regions converge. These diverse inputs to the habenula carry information about the sensory world and the animal's internal state, such as reward expectation or mood. However, it is not clear how these diverse habenular inputs interact with each other and how such interactions contribute to the function of habenular circuits in regulating behavioral responses in various tasks and contexts. In this review, we aim to discuss how information processing in habenular circuits can contribute to the specific behavioral programs that are attributed to the habenula.
Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra
One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.
Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge
Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
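A toy version of the capacity measure described above can be sketched with a small echo-state network (illustrative parameters and a handful of delayed-input targets only, not the authors' full benchmark): each target's capacity is one minus the normalized error of the best linear readout, and the theory bounds the summed capacity by the number of state variables:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, washout = 20, 5000, 100

# Random echo-state network: x(t+1) = tanh(W x(t) + w_in u(t))
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius below 1 (fading memory)
w_in = rng.normal(0, 1, N)
u = rng.uniform(-1, 1, T)

x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

def capacity(delay):
    """1 - NMSE of the best linear readout reconstructing u(t - delay)."""
    X = states[washout:]
    y = u[washout - delay : T - delay]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    nmse = np.mean((X @ w - y) ** 2) / np.var(y)
    return max(0.0, 1.0 - nmse)

# Total capacity over a few delayed-input targets; each term lies in [0, 1].
total = sum(capacity(d) for d in range(6))
```

With only six (orthogonal-in-expectation) targets the sum here cannot exceed 6, well inside the theoretical bound of N = 20 linearly independent state variables.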
Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth
In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing, that is, whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.
Li, Ximeng; Nielson, Flemming; Nielson, Hanne Riis
The security validation of practical computer systems calls for the ability to specify and verify information flow policies that are dependent on data content. Such policies play an important role in concurrent, communicating systems: consider a scenario where messages are sent to different processes according to their tagging. We devise a security type system that enforces content-dependent information flow policies in the presence of communication and concurrency. The type system soundly guarantees a compositional noninterference property. All theoretical results have been formally proved…
Zobrist, A. L.; Bryant, N. A.
Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.
Bhattacherjee, Anol; Sanford, Clive Carlton
This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects persist over time. Drawing on the elaboration-likelihood model (ELM), we compared two alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were operationalized using the argument quality and source credibility constructs, respectively, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We further examined how these influence processes were moderated by users' IT expertise and perceived job relevance, and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern…
Huck, Friedrich O. (Editor); Park, Stephen K. (Editor)
This publication is a compilation of the papers presented at the NASA conference on Visual Information Processing for Television and Telerobotics. The conference was held at the Williamsburg Hilton, Williamsburg, Virginia on May 10 to 12, 1989. The conference was sponsored jointly by NASA Offices of Aeronautics and Space Technology (OAST) and Space Science and Applications (OSSA) and the NASA Langley Research Center. The presentations were grouped into three sessions: Image Gathering, Coding, and Advanced Concepts; Systems; and Technologies. The program was organized to provide a forum in which researchers from industry, universities, and government could be brought together to discuss the state of knowledge in image gathering, coding, and processing methods.
Haeffner, H.; Haensel, W.; Rapol, U.; Koerber, T.; Benhelm, J.; Riebe, M.; Chek-al-Kar, D.; Schmidt-Kaler, F.; Becher, C.; Roos, C.; Blatt, R.
Single Ca+ ions and crystals of Ca+ ions are confined in a linear Paul trap and are investigated for quantum information processing. Here we report on recent experimental advancements towards a quantum computer with such a system. Laser-cooled trapped ions are ideally suited systems for the investigation and implementation of quantum information processing, as one can gain almost complete control over their internal and external degrees of freedom. The combination of a Paul-type ion trap with laser cooling leads to unique properties of trapped cold ions, such as control of the motional state down to the zero point of the trapping potential, a high degree of isolation from the environment, and thus a very long time available for manipulations and interactions at the quantum level. The very same properties make single trapped atoms and ions well suited for storing quantum information in long-lived internal states, e.g. by encoding a quantum bit (qubit) of information within the coherent superposition of the S1/2 ground state and the metastable D5/2 excited state of Ca+. Recently we have achieved the implementation of simple algorithms with up to 3 qubits on an ion-trap quantum computer. We will report on methods to implement single-qubit rotations, the realization of a two-qubit universal quantum gate (the Cirac-Zoller CNOT gate), the deterministic generation of multi-particle entangled states (GHZ and W states), their full tomographic reconstruction, the realization of deterministic quantum teleportation, its quantum process tomography, and the encoding of quantum information in decoherence-free subspaces with coherence times exceeding 20 seconds. (author)
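The entangling operations this entry mentions can be illustrated numerically. The sketch below uses the textbook matrix forms of the Hadamard and CNOT gates (an abstraction; the ion-trap realization uses laser-pulse sequences, not these matrices directly) to build a three-qubit GHZ state:

```python
import numpy as np

# Single-qubit Hadamard and identity
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

# CNOT with control on the first qubit of a two-qubit pair
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |000>, apply H to qubit 0, then CNOT(0,1), then CNOT(1,2)
state = np.zeros(8)
state[0] = 1.0
state = np.kron(np.kron(H, I), I) @ state
state = np.kron(CNOT, I) @ state
state = np.kron(I, CNOT) @ state

# state is now (|000> + |111>)/sqrt(2), a 3-qubit GHZ state
```

The resulting vector has amplitude 1/sqrt(2) on |000> and |111> and zero elsewhere, which is the GHZ state referred to in the abstract.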
Barrett, M.D.; Schaetz, T.; Chiaverini, J.; Leibfried, D.; Britton, J.; Itano, W.M.; Jost, J.D.; Langer, C.; Ozeri, R.; Wineland, D.J.; Knill, E.
We summarize two experiments on the creation and manipulation of multi-particle entangled states of trapped atomic ions - quantum dense coding and quantum teleportation. The techniques used in these experiments constitute an important step toward performing large-scale quantum information processing. The techniques also have application in other areas of physics, providing improvement in quantum-limited measurement and fundamental tests of quantum mechanical principles, for example
The Fifth Generation Computer Project in Japan intends to develop a new generation of computers by extensive research in many areas. This paper discusses many research topics which the Japanese are hoping will lead to a radical new knowledge information processing system. Topics discussed include new computer architecture, programming styles, semantics of programming languages, relational databases, linguistics theory, artificial intelligence, functional images, and inference systems.
Yukalov, V. I.; Sornette, D.
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention int...
Full text: I will describe how cold atoms can be manipulated to realize arrays of addressable qubits as prototype quantum registers, focussing on how atom chips can be used in combination with cavity QED techniques to form such an array. I will discuss how the array can be generated and steered using optical lattices and the Mott transition, and describe the sources of noise and how these place limits on the use of such chips in quantum information processing. (author)
Quantum computers are information processing devices which operate by and exploit the laws of quantum mechanics, potentially allowing them to solve problems which are intractable using classical computers. This dissertation considers the practical issues involved in one of the more successful implementations to date, nuclear magnetic resonance (NMR). Techniques for dealing with systematic errors are presented, and a quantum protocol is implemented. Chapter 1 is a brief introduction to quantum computation. The physical basis of its efficiency and issues involved in its implementation are discussed. NMR quantum information processing is reviewed in more detail in Chapter 2. Chapter 3 considers some of the errors that may be introduced in the process of implementing an algorithm, and high-level ways of reducing the impact of these errors by using composite rotations. Novel general expressions for stabilising composite rotations are presented in Chapter 4 and a new class of composite rotations, tailored composite rotations, presented in Chapter 5. Chapter 6 describes some of the advantages and pitfalls of combining composite rotations. Experimental evaluations of the composite rotations are given in each case. An actual implementation of a quantum information protocol, approximate quantum cloning, is presented in Chapter 7. The dissertation ends with appendices which contain expansions of some equations and detailed calculations of certain composite rotation results, as well as spectrometer pulse sequence programs. (author)
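The composite-rotation idea central to this dissertation can be sketched numerically. The example below uses the well-known BB1 sequence of Wimperis as a stand-in (the dissertation's own tailored composite rotations are not reproduced here) and checks that it suppresses a systematic 10% pulse-amplitude error far better than a single naive pulse:

```python
import numpy as np

def pulse(theta, phi):
    """Rotation by angle theta about an axis at angle phi in the x-y plane."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    axis = np.cos(phi) * sx + np.sin(phi) * sy
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * axis

def infidelity(U, V):
    """Gate infidelity up to a global phase."""
    return 1 - abs(np.trace(U.conj().T @ V)) / 2

theta = np.pi / 2   # target: 90-degree rotation about x
eps = 0.1           # 10% systematic pulse-amplitude (pulse-length) error
target = pulse(theta, 0)

# Naive pulse: the error scales the rotation angle directly.
naive = pulse(theta * (1 + eps), 0)

# BB1 composite pulse: pi_phi1, 2pi_3phi1, pi_phi1, then theta_0,
# with phi1 = arccos(-theta / (4 pi)); every pulse shares the same error.
phi1 = np.arccos(-theta / (4 * np.pi))
bb1 = (pulse(theta * (1 + eps), 0)
       @ pulse(np.pi * (1 + eps), phi1)
       @ pulse(2 * np.pi * (1 + eps), 3 * phi1)
       @ pulse(np.pi * (1 + eps), phi1))
```

At eps = 0, the three extra pulses compose to the identity, so the composite reduces to the bare rotation; at eps = 0.1 the residual infidelity of the composite is orders of magnitude below that of the naive pulse.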
Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa
The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.
Aalst, van der W.M.P.; Wah, B.W.
Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process
Gallistel, C R.
The framework provided by Claude Shannon's [Bell Syst. Technol. J. 27 (1948) 623] theory of information leads to a quantitatively oriented reconceptualization of the processes that mediate conditioning. The focus shifts from processes set in motion by individual events to processes sensitive to the information carried by the flow of events. The conception of what properties of the conditioned and unconditioned stimuli are important shifts from the tangible properties to the intangible properties of number, duration, frequency and contingency. In this view, a stimulus becomes a CS if its onset substantially reduces the subject's uncertainty about the time of occurrence of the next US. One way to represent the subject's knowledge of that time of occurrence is by the cumulative probability function, which has two limiting forms: (1) The state of maximal uncertainty (minimal knowledge) is represented by the inverse exponential function for the random rate condition, in which the US is equally likely at any moment. (2) The limit to the subject's attainable certainty is represented by the cumulative normal function, whose momentary expectation is the CS-US latency minus the time elapsed since CS onset. Its standard deviation is the Weber fraction times the CS-US latency.
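The two limiting forms of the cumulative probability function described above can be illustrated numerically. The sketch below is a hedged illustration, not code from the paper; the rate, latency, and Weber-fraction values are assumptions chosen for demonstration:

```python
import math

def random_rate_cdf(t, rate):
    """State of maximal uncertainty: the US is equally likely at any
    moment, so waiting times follow the inverse exponential CDF."""
    return 1.0 - math.exp(-rate * t)

def normal_cdf(x, mu, sigma):
    """Cumulative normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def attainable_certainty_cdf(elapsed, cs_us_latency, weber_fraction):
    """Limit of attainable certainty: a cumulative normal whose momentary
    expectation is the CS-US latency minus the time elapsed since CS onset,
    with standard deviation equal to the Weber fraction times the latency."""
    sigma = weber_fraction * cs_us_latency
    return normal_cdf(elapsed, cs_us_latency, sigma)

# Assumed illustrative parameters: 10 s CS-US latency, Weber fraction 0.15
p_random = random_rate_cdf(10.0, rate=0.1)            # 1 - e^-1, ~0.632
p_timed = attainable_certainty_cdf(10.0, 10.0, 0.15)  # 0.5 at the expected US time
```

At an elapsed time equal to the CS-US latency, the timed-expectation form gives probability 0.5 by construction, whereas the random-rate form depends only on the rate parameter.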
van der Meer, Matthijs; Kurth-Nelson, Zeb; Redish, A David
Decisions result from an interaction between multiple functional systems acting in parallel to process information in very different ways, each with strengths and weaknesses. In this review, the authors address three action-selection components of decision-making: The Pavlovian system releases an action from a limited repertoire of potential actions, such as approaching learned stimuli. Like the Pavlovian system, the habit system is computationally fast but, unlike the Pavlovian system, permits arbitrary stimulus-action pairings. These associations are a "forward" mechanism; when a situation is recognized, the action is released. In contrast, the deliberative system is flexible but takes time to process. The deliberative system uses knowledge of the causal structure of the world to search into the future, planning actions to maximize expected rewards. Deliberation depends on the ability to imagine future possibilities, including novel situations, and it allows decisions to be taken without having previously experienced the options. Various anatomical structures have been identified that carry out the information processing of each of these systems: the hippocampus constitutes a map of the world that can be used for searching/imagining the future; dorsal striatal neurons represent situation-action associations; and the ventral striatum maintains value representations for all three systems. Each system presents vulnerabilities to pathologies that can manifest as psychiatric disorders. Understanding these systems and their relation to neuroanatomy opens up a deeper way to treat the structural problems underlying various disorders.
The recognition heuristic (RH; Goldstein and Gigerenzer, 2002) suggests that, when applicable, probabilistic inferences are based on a noncompensatory examination of whether an object is recognized or not. The overall findings on the processes that underlie this fast and frugal heuristic are somewhat mixed, and many studies have expressed the need to consider a more compensatory integration of recognition information. Regardless of the mechanism involved, it is clear that recognition has a strong influence on choices, and this finding might be explained by the fact that recognition cues arouse affect and thus receive more attention than cognitive cues. To test this assumption, we investigated whether recognition results in a direct affective signal by measuring physiological arousal (i.e., peripheral arterial tone) in the established city-size task. We found that recognition of cities does not directly result in increased physiological arousal. Moreover, the results show that physiological arousal increased with increasing inconsistency between recognition information and additional cue information. These findings support predictions derived from a compensatory Parallel Constraint Satisfaction model rather than predictions of noncompensatory models. Additional results concerning confidence ratings, response times, and choice proportions further demonstrated that recognition information and other cognitive cues are integrated in a compensatory manner.
Andersen, Lasse Mejling
This PhD thesis treats applications of nonlinear optical effects for quantum information processing. The two main applications are four-wave mixing in the form of Bragg scattering (BS) for quantum-state-preserving frequency conversion, and sum-frequency generation (SFG) in second-order nonlinear......-chirping the pumps. In the high-conversion regime without the effects of NPM, exact Green functions for BS are derived. In this limit, separability is possible for conversion efficiencies up to 60 %. However, the system still allows for selective frequency conversion as well as re-shaping of the output. One way...
Quantum wells, alternate thin layers of two different semiconductor materials, show an exceptional electric field dependence of the optical absorption, called the quantum-confined Stark effect (QCSE), for electric fields perpendicular to the layers. This enables electrically controlled optical modulators and optically controlled self-electro-optic-effect devices that can operate at high speed and low energy density. Recent developments in these QCSE devices are summarized, including new device materials and novel device structures. The variety of sophisticated devices now demonstrated is promising for applications to information processing
Burgess, Ann W; Clements, Paul T
Sexual abuse is considered to be a pandemic contemporary public health issue, with significant physical and psychosocial consequences for its victims. However, the incidence of elder sexual assault is difficult to estimate with any degree of confidence. A convenience sample of 284 case records were reviewed for Post-Traumatic Stress Disorder (PTSD) symptoms. The purpose of this paper is to present the limited data noted on record review on four PTSD symptoms of startle, physiological upset, anger, and numbness. A treatment model for information processing of intrapsychic trauma is presented to describe domain disruption within a nursing diagnosis of rape trauma syndrome and provide guidance for sensitive assessment and intervention.
Jackson, Russell E; Calvillo, Dusti P
Visual search of the environment is a fundamental human behavior that is powerfully affected by perceptual load. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and that evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component of the evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.
Scholten, Lotte; van Knippenberg, Daan; Nijstad, Bernard A.; De Dreu, Carsten K. W.
Integrating dual-process models [Chaiken, S., & Trope, Y. (Eds.). (1999). Dual-process theories in social psychology. NewYork: Guilford Press] with work on information sharing and group decision-making [Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: biased
Furusawa, Akira; Takei, Nobuyuki
Quantum teleportation is one of the most important subjects in quantum information science. This is because quantum teleportation can be regarded as not only quantum information transfer but also a building block for universal quantum information processing. Furthermore, deterministic quantum information processing is very important for efficient processing and it can be realized with continuous-variable quantum information processing. In this review, quantum teleportation for continuous variables and related quantum information processing are reviewed from these points of view
Bapat, Shweta S; Patel, Harshali K; Sansgiry, Sujit S
In this study, we evaluate the role of information anxiety and information load on the intention to read information from prescription drug information leaflets (PILs). These PILs were developed based on the principles of information load and consumer information processing. This was an experimental, prospective, repeated-measures study conducted in the United States in which 360 university students (>18 years old) participated (62% response rate). Participants were presented with a scenario followed by exposure to the three drug product information sources used to operationalize information load. The three sources were: (i) current practice; (ii) pre-existing one-page text-only leaflets; and (iii) interventional one-page prototype PILs designed for the study. Information anxiety was measured as the anxiety experienced by the individual when encountering information. The outcome variable, intention to read PILs, was defined as the likelihood that the patient will read the information provided in the leaflets. A survey questionnaire was used to capture the data, and the objectives were analyzed by performing a repeated-measures MANOVA using SAS version 9.3. When compared to current practice and one-page text-only leaflets, one-page PILs had significantly lower scores on information anxiety and information load. Information anxiety and information load significantly impacted intention to read (p < 0.001). Newly developed PILs increased patients' intention to read and can help in improving the counseling services provided by pharmacists.
Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner
Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with, and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport and transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language, or Gillespie's direct method. This powerful feature facilitates reuse, interoperability, and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is
It is difficult not to be amazed by the ability of the human brain to process, to structure, and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics, as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, yet the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally, the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the
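Among the topics the book covers, the Hopfield model lends itself to a compact illustration. The following minimal sketch (an assumption-laden demonstration written for this summary, not material from the book) stores one pattern with Hebbian weights and recovers it from a corrupted cue:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix for +/-1 patterns (one per row); zero self-coupling."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_steps=10):
    """Synchronous sign updates until a fixed point or the step limit."""
    for _ in range(max_steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0          # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

rng = np.random.default_rng(0)
pattern = rng.choice([-1.0, 1.0], size=(1, 50))  # one stored pattern of 50 units
W = train_hopfield(pattern)
noisy = pattern[0].copy()
noisy[:5] *= -1.0                                # corrupt 5 of the 50 units
recovered = recall(W, noisy)                     # converges back to the stored pattern
```

With a single stored pattern the corrupted cue is recovered in one synchronous update; capacity limits and the replica analysis the book discusses concern what happens as many patterns are stored.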
Shin, Hyun Kook; Park, Jeong Seok; Baek, Seung Min; Kim, Young Jin; Joo, Jae Yoon; Lee, Sang Mok; Jeong, Young Woo; Seo, Ho Jun; Kim, Do Youn; Lee, Tae Hoon
The Operational Information Processing Platform (OIPP) is a platform system designed to provide development and operation environments for plant operation and plant monitoring. It is based on the Plant Computer Systems (PCS) of the Yonggwang 3 and 4, Ulchin 3 and 4, and Yonggwang 5 and 6 Nuclear Power Plants (NPP). A UNIX-based workstation, real-time kernel, and graphics design tool were selected and installed after reviewing the functions of the PCS. In order to construct a development environment for an open system architecture and a distributed computer system, an open computer system architecture was adopted both in hardware and software. For verification of the system design and evaluation of technical methodologies, the PCS running under the OIPP is being designed and implemented. In this system, the man-machine interface and system functions are being designed and implemented to evaluate the differences between the UCN 3, 4 PCS and the OIPP. 15 tabs., 32 figs., 11 refs. (Author)
Khoury, Antonio Z. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil)
In this work we discuss several proposals for quantum information processing using the transverse structure of paraxial beams. Different techniques for production and manipulation of optical vortices have been employed and combined with polarization transformations in order to investigate fundamental properties of quantum entanglement as well as to propose new tools for quantum information processing. As an example, we have recently proposed and demonstrated a controlled NOT (CNOT) gate based on a Michelson interferometer in which the photon polarization is the control bit and the first order transverse mode is the target. The device is based on a single lens design for an astigmatic mode converter that transforms the transverse mode of paraxial optical beams. In analogy with Bell's inequality for two-qubit quantum states, we propose an inequality criterion for the non-separability of the spin-orbit degrees of freedom of a laser beam. A definition of separable and non-separable spin-orbit modes is used in consonance with the one presented in Phys. Rev. Lett. 99, 2007. As the usual Bell's inequality can be violated for entangled two-qubit quantum states, we show both theoretically and experimentally that the proposed spin-orbit inequality criterion can be violated for non-separable modes. The inequality is discussed both in the classical and quantum domains. We propose a polarization to orbital angular momentum teleportation scheme using entangled photon pairs generated by spontaneous parametric down conversion. By making a joint detection of the polarization and angular momentum parity of a single photon, we are able to detect all the Bell-states and perform, in principle, perfect teleportation from a discrete to a continuous system using minimal resources. The proposed protocol implementation demands experimental resources that are currently available in quantum optics laboratories. (author)
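The CNOT gate described above, with polarization as the control and the first-order transverse mode as the target, acts like any two-qubit CNOT at the level of state vectors. The sketch below is a generic matrix illustration under assumed basis labels, not the interferometric implementation itself:

```python
import numpy as np

# Two-qubit basis order |control, target>: |00>, |01>, |10>, |11>.
# Here the control is taken to be polarization (H=0, V=1) and the target
# the first-order transverse mode; these labels are assumptions.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def ket(bits):
    """Computational basis state for a two-qubit bit string, e.g. '10'."""
    v = np.zeros(4, dtype=complex)
    v[int(bits, 2)] = 1.0
    return v

# A V-polarized control flips the target transverse mode: |10> -> |11>
out = CNOT @ ket("10")

# A superposed control yields a non-separable output, analogous to the
# spin-orbit non-separability discussed in the abstract
plus = (ket("00") + ket("10")) / np.sqrt(2.0)
bell = CNOT @ plus        # (|00> + |11>)/sqrt(2)
```

The non-separable output state is the vector analogue of the spin-orbit modes whose Bell-type inequality violation is discussed above.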
In the conditions of globalization and worldwide economic relations, the role of information support for business processes is increasing in various branches and fields of activity, and warehouse activity is no exception. Such information support is realized in warehouse logistic systems. In relation to a territorial administrative unit, the warehouse logistic system takes the format of a complex social and economic structure that controls the economic flows covering the intermediary, trade, and transport organizations and the enterprises of other branches and spheres. The spatial movement of inventory items makes new demands on the participants of merchandising. Warehousing (in the sense of storage) is one of the operations of logistic activity, organizing a material flow according to requirements. Therefore, understanding warehousing as "management of the spatial movement of stocks" is justified. Warehousing, in this understanding, tries to shed its perception as mere holding of stocks, a business expense. This aspiration is reflected in logistic systems working by principles such as "just in time" and "economical production". Therefore, the role of warehouses as places of storage is transformed into an understanding of warehousing as an innovative logistic system.
Hoard, James E.
Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides a natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object oriented data base (OODB). OODBs have, apparently, an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system and an inherent capacity to use them in an effective way.
Buonfiglio, Marzia; Di Sabato, Francesco; Mandillo, Silvia; Albini, Mariarita; Di Bonaventura, Carlo; Giallonardo, Annateresa; Avanzini, Giuliano
Learning processing is relevant to the study of epileptogenesis, given the pivotal role that neuroplasticity plays in both mechanisms. Recently, evoked potential analyses showed a link between analytic cognitive style and altered neural excitability in both migraine and healthy subjects, regardless of cognitive impairment or psychological disorders. In this study we evaluated the analytic/global and visual/auditory perceptual dimensions of cognitive style in patients with epilepsy. Twenty-five cryptogenic temporal lobe epilepsy (TLE) patients matched with 25 idiopathic generalized epilepsy (IGE) sufferers and 25 healthy volunteers were recruited and participated in three cognitive style tests: the "Sternberg-Wagner Self-Assessment Inventory", the C. Cornoldi test series called AMOS, and the Mariani Learning Style Questionnaire. Our results demonstrate a significant association between analytic cognitive style and both IGE and TLE, with a predominant auditory and visual analytic style, respectively (ANOVA: p < 0.0001). These findings should encourage further research to investigate information processing style and its neurophysiological correlates in epilepsy.
Business processes and information systems affect each other in non-trivial ways. Frequently, the business process design and the information system design are not well aligned. This means that business processes are designed without taking the information system impact into account, and vice versa. Missing alignment at design time often results in quality problems at runtime, such as large response times of information systems, large process execution times, overloaded information s...
The Secretary of Defense announced the Corporate Information Management initiative on November 16, 1990, to establish a DoD-wide concept for managing computer, communications, and information management functions...
A. A. Malyuk
The features of solving the information protection task in its modern statement, as a complex problem that encompasses all aspects of information technology development, are discussed. Such an interpretation inevitably increases the role of systemic problems, whose solution relies on an advanced scientific and methodological basis, the so-called intensification of information protection processes.
Kim, Tae Whan; Choi, Kwang; Oh, Jeong Hoon; Jeong, Hyun Sook; Keum, Jong Yong
The goal of this project is to establish an integrated environment focused on enhanced information services to researchers through the provision of acquisition information, a key-phrase retrieval function, and journal content information linked with various subsystems already developed. The results of the project are as follows. 1. It is possible to serve information on unreceivable materials among required materials throughout the system. 2. Retrieval efficiency is increased by the addition of a key-phrase retrieval function. 3. The rapidity of the information service is enhanced by the provision of journal contents for each issue received, and the work performance of the contents service becomes higher. 4. It is possible to acquire, store, and serve the technical information needed in R and D synthetically and systematically through the development of a total system linked with the various subsystems required for technical information management and service. 21 refs. (Author)
NEA has for many years been collating and analysing information on laws and regulations governing the peaceful uses of nuclear energy, and this work has resulted in a series of publications. However, as the multiplication of computer-based legal information centres at both national and international level shows, conventional information systems are no longer adequate to deal with the increasing volume of information and with users' needs. In view of the particular aspects of nuclear law and of its own resources, NEA has endeavoured to make the best possible use of existing structures by opting for participation in the IAEA International Nuclear Information System rather than by creating a specialised centre. Before becoming operational, the arrangements concluded between NEA and IAEA required that the INIS rules be altered somewhat to take account of the specific problems raised by the treatment of legal literature and also to improve the quality of information provided to users. (auth.) [fr
The process of designing high-quality software systems is one of the major issues in software engineering research. Over the years, this has resulted in numerous design methods, each with specific qualities and drawbacks. For example, the Rational Unified Process is a comprehensive design process,
S.S. Ficco (Stefano)
Economic agents generally operate in uncertain environments and, prior to making decisions, invest time and resources to collect useful information. Consumers compare the prices charged by different firms before purchasing a product. Politicians gather information from different
This paper presents how a Geographical Information System (GIS) can be incorporated in an intelligent learning software system for environmental matters. The system is called ALGIS and incorporates the GIS in order to present information about the physical and anthropogenic environment of Greece effectively and in a more interactive way. The system…
Information is an important resource for the new product development (NPD) process in subsidiaries. However, we still lack research analyzing the NPD process from an information perspective in the subsidiary context. This exploratory research exploited 8 cases of NPD processes in consumer goods subsidiaries operating in the Indonesian market. Three types of information have been identified and analyzed in the NPD process: global, regional, and local information. The result of this research ...
Meiryani; Muhammad Syaifullah
The purpose of this study was to determine the influence of business processes on the quality of the accounting information system. This study aims to examine the influence of business processes on the quality of accounting information systems. The study was theoretical research which considered the role of business processes in the quality of accounting information systems, using secondary data collection. The results showed that the business processes have a signifi...
Modern digital automation techniques allow the application of demanding types of process control. These types of process control are characterized by their belonging to higher levels in a multilevel model. Functional and technical aspects of the performance of digital automation plants are presented and explained. A modern automation system is described considering special procedures of process control (e.g. real time diagnosis)
Chaffee, Ellen Earle
When a rational decision process is desired, information specialists can contribute information and also contribute to the process in which that information is used, thereby promoting rational decision-making. The contribution of Stanford's information specialists to rational decision-making is described. (MLW)
Nevill, Dorothy D.; And Others
Tested the assumptions that the structural features of vocational schemas affect vocational information processing and career self-efficacy. Results indicated that effective vocational information processing was facilitated by well-integrated systems that processed information along fewer dimensions. The importance of schematic organization on the…
TKACH L. M.
Formulation of the problem. If public relations is examined as a phenomenon of information management, we face questions about the content and nature of PR's relationship with its environment: the ability to manage people's perception of and attitude toward events in the environment, and to ensure the priority of information over other resources. Goal. To investigate the concept of "public relations" as treated by foreign and domestic experts; to consider the typology of publics and the "laws" of public opinion; to define the basic principles according to which relations with the public should be built; and to identify PR activity as a kind of social communication. Conclusions. Public relations built on advanced information and communication technologies create fundamentally new opportunities for information control and influence on public consciousness.
Aa, van der J.H.; Leopold, H.; Mannhardt, F.; Reijers, H.A.; Gaaloul, K.; Schmidt, R.; Nurcan, S.; Guerreiro, S.; Ma, Q.
An organization’s knowledge of its business processes represents valuable corporate knowledge because it can be used to enhance the performance of these processes. In many organizations, documentation of process knowledge is scattered around various process information sources. Such information
40 CFR 68.65, Process safety information (Chemical Accident Prevention Provisions, Program 3 Prevention Program; 40 CFR Protection of Environment, 2010-07-01): requires a compilation of written process safety information before conducting any process hazard analysis required by...
Lee, Ji Ho; Kim, Tae Whan; Kim, Sun Ja; Kim, Young Min; Choi, Kwang; Oh, Joung Hun; Choung, Hyun Suk; Keum, Jong Yong; Yoo, An Na; Harn, Deuck Haing; Choun, Young Chun
The major goal of this project is to develop a more efficient information management system by connecting the KAERI serials database so that users can access it from their own laboratory facilities through KAERI-NET. The importance of this project is to make the serials information of KAERI easily accessible to users as a valuable resource for R and D activities. The results of the project are as follows. 1) Development of the serials database and retrieval system enabled access to the serials holding information through KAERI-NET. 2) The database construction establishes a foundation for the management of 1,600 serials held in KAERI. 3) The system can be applied not only to KAERI but also to similar medium-level libraries. (Author)
Tonfoni, G; Ichalkaranje, N S
The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap
Kim, Jong Hyun; Seong, Poong Hyun
This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e., the problems of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory
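The entropy side of such a quantification can be illustrated with a short, self-contained sketch. The indication stream below is invented for illustration only; the paper itself uses Conant's partitioning of information flows across stages, not this bare Shannon entropy.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits/symbol) of an observed symbol stream."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical stream of control-room indications processed during a task.
stream = ["level_ok"] * 6 + ["level_high"] * 1 + ["pump_trip"] * 1
h = shannon_entropy(stream)
print(round(h, 3))
```

Multiplying such a per-symbol entropy by the number of indications handled per unit time gives one crude estimate of an operator's information load.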
This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.
Gross, Kenneth C.; Morreale, Patricia
A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can include processing data from either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
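The processing chain described (paired signals, difference function, Fourier modes, residual, probability ratio test) can be sketched roughly as follows. The AR(1) surrogate, the number of retained modes, and the SPRT parameters are illustrative assumptions, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real sensor signal: a slow drift plus white noise (illustrative).
t = np.arange(1024)
real = 0.01 * np.sin(2 * np.pi * t / 256) + rng.normal(0, 0.1, t.size)

# Artificial signal for the same variable via a simple AR(1) model
# fitted to the real data (stand-in for the ARMA technique).
phi = np.corrcoef(real[:-1], real[1:])[0, 1]
artificial = np.empty_like(real)
artificial[0] = real[0]
for i in range(1, real.size):
    artificial[i] = phi * artificial[i - 1] + rng.normal(0, 0.1)

# Difference function between the two data sets.
diff = real - artificial

# Composite function: keep only the few largest Fourier modes.
spectrum = np.fft.rfft(diff)
small = np.argsort(np.abs(spectrum))[:-8]  # all but the 8 largest modes
composite_spectrum = spectrum.copy()
composite_spectrum[small] = 0
composite = np.fft.irfft(composite_spectrum, n=diff.size)

# Residual function: difference minus composite, ideally white noise.
residual = diff - composite

# Wald sequential probability ratio test on the residual mean:
# H0: mean 0 vs H1: mean m1, known sigma (illustrative parameters).
def sprt(x, m1=0.05, sigma=0.1, alpha=0.01, beta=0.01):
    upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for i, xi in enumerate(x):
        llr += (m1 / sigma**2) * (xi - m1 / 2)
        if llr >= upper:
            return "fault", i
        if llr <= lower:
            return "valid", i
    return "undecided", len(x) - 1

decision, step = sprt(residual)
print(decision, step)
```

A drifting sensor shifts the residual mean, driving the log-likelihood ratio toward the "fault" threshold well before a fixed-sample test would react.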
Craik, F.I.M., & Lockhart, R.S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11... task at both levels of performance, then one would, in both cases, postulate systems that had the ability to process symbols at the microscopic level... ’60s and early 70s (cf. Atkinson & Shiffrin, 1968; Craik & Lockhart, 1972; Norman, Rumelhart, & LNR, 1975). This architecture is comprised of several
de Dreu, C.K.W.; de Vries, N.K.
In two experiments we studied the prediction that majority support induces stronger convergent processing of a persuasive message than minority support, the more so when recipients are explicitly forced to pay attention to the source's point of view; this in turn affects the amount of attitude change on related issues. Convergent processing is the systematic elaboration on the source's position, but with a stronger focus on verification and justification rather than falsification. In Exp 1 wi...
Brookhuis, Karel Anton
We set out to test the hypotheses generated by Shiffrin & Schneider's model of information processing with our new tool, the ERP. The experiments were devised to test hypotheses that were originally based on performance data alone, i.e., reaction time and errors. Although the overt behaviour was
traditionally called the "span of apprehension" (Külpe, 1904; Wundt, 1899). However, a partial-report procedure demonstrates... Gehrig, P. (1992). On the time course of perceptual information that results from a... Wundt, W. (1899). Zur Kritik tachistoskopischer Versuche [A critique of tachistoscopic experiments]
Kim, Jong Hyun; Seong, Poong Hyun
This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages, which contains information flow. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload
The paper describes a systematic approach to the design of information interfaces for operator support in diagnosing complex system faults. The need to interpret primary measured plant variables within the framework of different system representations, organized into an abstraction hierarchy, is identified from an analysis of the problem of diagnosing complex systems. A formalized approach to the modelling of production systems, called Multilevel Flow Modelling (MFM), is described. An MFM model specifies plant control requirements and the associated need for plant information, and provides a consistent context for the interpretation of real-time plant signals in the diagnosis of malfunctions. The use of MFM models as a basis for the functional design of the plant instrumentation system is outlined, and the use of Knowledge-Based (Expert) Systems for the design of man-machine interfaces is mentioned. Such systems would allow active user participation in diagnosis and thus provide the basis for cooperative problem solving. 14 refs. (author)
Bourdieu, Pierre. 1977. Outline of a Theory of Practice. Richard Nice, trans. Cambridge: Cambridge University Press. Cain, Leo F. and Samuel... hospital posed a unique evacuation problem. When a fire occurs in a hospital, information is typically communicated to doctors, nurses, and other... bore no relation whatsoever to the emergencies they announced, and they differed from institution to institution. Thus doctors, nurses, and other staff
Abstract The purpose of this study was to determine the influence of business processes on the quality of the accounting information system. The study was theoretical research that considered the roles of business processes in the quality of the accounting information system, using secondary data collection. The results showed that business processes have a significant effect on the quality of accounting information systems.
Kim, Du Gyu; Lee, JaeMu
This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…
Sadjadi, Firooz; Sadjadi, Farzad
Polarimetric sensing is an area of active research in a variety of applications. In particular, the use of polarization diversity has been shown to improve performance in automatic target detection and recognition. Within the diverse scope of polarimetric sensing, the field of passive polarimetric sensing is of particular interest. This chapter presents several new methods for gathering information using such passive techniques. One method extracts three-dimensional (3D) information and surface properties using one or more sensors. Another method extracts scene-specific algebraic expressions that remain unchanged under polarization transformations (such as along the transmission path to the sensor).
Jespersen, Kristina Risom
...collection platform to obtain measurements from within the NPD process. 42 large, international companies participated in the data-collecting simulation. Results revealed five different information paths that were not connecting all stages of the NPD process. Moreover, results show that the front-end is not driving the information acquisition through the stages of the NPD process, and that environmental turbulence disconnects stages from the information paths in the NPD process. This implies that information is at the same time a key to success and a key to entrapment in the NPD process.
Vatansever, Deniz; Menon, David K; Stamatakis, Emmanuel A
Concurrent with mental processes that require rigorous computation and control, a series of automated decisions and actions govern our daily lives, providing efficient and adaptive responses to environmental demands. Using a cognitive flexibility task, we show that a set of brain regions collectively known as the default mode network plays a crucial role in such "autopilot" behavior, i.e., when rapidly selecting appropriate responses under predictable behavioral contexts. While applying learned rules, the default mode network shows both greater activity and connectivity. Furthermore, functional interactions between this network and hippocampal and parahippocampal areas as well as primary visual cortex correlate with the speed of accurate responses. These findings indicate a memory-based "autopilot role" for the default mode network, which may have important implications for our current understanding of healthy and adaptive brain processing.
This report documents the impact analysis of a proposed Defense Waste Processing Facility (DWPF) for immobilizing high-level waste currently being stored on an interim basis at the Savannah River Plant (SRP). The DWPF will process the waste into a form suitable for shipment to and disposal in a federal repository. The DWPF will convert the high-level waste into: a leach-resistant form containing above 99.9% of all the radioactivity, and a residue of slightly contaminated salt. The document describes the SRP site and environs, including population, land and water uses; surface and subsurface soils and waters; meteorology; and ecology. A conceptual integrated facility for concurrently producing glass waste and saltcrete is described, and the environmental effects of constructing and operating the facility are presented. Alternative sites and waste disposal options are addressed. Environmental consultations and permits are also discussed.
Razvan Daniel ZOTA
Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.
Based upon quantum-inspired entanglement in quantum-classical hybrids, a simple algorithm for instantaneous transmissions of non-intentional messages (chosen at random) to remote distances is proposed. The idea is to implement instantaneous transmission of conditional information on remote distances via a quantum-classical hybrid that preserves superposition of random solutions, while allowing one to measure its state variables using classical methods. Such a hybrid system reinforces the advantages, and minimizes the limitations, of both quantum and classical characteristics. Consider n observers, and assume that each of them gets a copy of the system and runs it separately. Although they run identical systems, the outcomes of even synchronized runs may be different because the solutions of these systems are random. However, the global constraint must be satisfied. Therefore, if observer #1 (the sender) made a measurement of the acceleration v(sub 1) at t =T, then the receiver, by measuring the corresponding acceleration v(sub 1) at t =T, may get a wrong value because the accelerations are random, and only their ratios are deterministic. Obviously, the transmission of this knowledge is instantaneous as soon as the measurements have been performed. In addition to that, the distance between the observers is irrelevant because the x-coordinate does not enter the governing equations. However, the Shannon information transmitted is zero. None of the senders can control the outcomes of their measurements because they are random. The senders cannot transmit intentional messages. Nevertheless, based on the transmitted knowledge, they can coordinate their actions based on conditional information. If observer #1 knows his own measurements, the measurements of the others can be fully determined. It is important to emphasize that the origin of entanglement of all the observers is the joint probability density that couples their actions. There is no centralized source
Kodym, Oldřich; Unucka, Jakub
The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and information usage in the information systems of logistics firms to support selected processes on the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed. Compliance with existing standards is mentioned. Security of information over the full life cycle is an integral part of the presented system. A design of a fully equipped system based on synthesized functional nodes is presented.
Conforti, R.; Leoni, de M.; La Rosa, M.; Aalst, van der W.M.P.; Salinesi, C.; Norrie, M.C.; Pastor, O.
This paper proposes a technique that supports process participants in making risk-informed decisions, with the aim to reduce the process risks. Risk reduction involves decreasing the likelihood and severity of a process fault from occurring. Given a process exposed to risks, e.g. a financial process
de Keijzer, Ander; van Keulen, Maurice
At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time-consuming job. Automating this process would assist the user in this task. Integration
Centre for Theoretical Studies and Supercomputer Education and Research Centre, ... the parent to the offspring, sensory information conveyed by the sense organ to the ... The task involved in genetic information processing is ASSEMBLY.
relationship between acuity and light sensitivity. Animals have evolved a wide variety of solutions to this problem, such as folded membranes, to have larger receptive surfaces, and lenses, to focus light onto the receptive membranes. On the neural capacity side, complex eyes demand huge processing networks... animals in a wide range of behaviours. It is intuitive that a complex eye is energetically very costly, not only in components but also in neural involvement. The increasing behavioural demand added pressure on design specifications, and eye evolution is considered an optimization of the inverse... fit their need. Visual neuroethology integrates optics, sensory equipment, neural networks, and motor output to explain how animals can perform behaviour in response to a specific visual stimulus. In this doctoral thesis, I will elucidate the individual steps in a visual neuroethological pathway
Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.
This article is devoted to improving the analysis of technological processes with the implementation of information security requirements. The aim of this research is to analyze the increase in competitiveness of aircraft industry enterprises due to information technology implementation, using the example of the tube bending technological process. The article analyzes the kinds of tube bending and current technique. In addition, an analysis of potential risks in the tube bending technological process is carried out in terms of information security.
Farnsworth, Keith D.; Nelson, John; Gershenson, Carlos
We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function: to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, info...
Graph states are multiparticle states which are associated with graphs. Each vertex of the graph corresponds to a single system or particle. The links describe quantum correlations (entanglement) between pairs of connected particles. Graph states were initiated independently by two research groups: On the one hand, graph states were introduced by Briegel and Raussendorf as a resource for a new model of one-way quantum computing, where algorithms are implemented by a sequence of measurements at single particles. On the other hand, graph states were developed by the author of this thesis and Reinhard Werner in Braunschweig, as a tool to build quantum error correcting codes, called graph codes. The connection between the two approaches was fully realized in close cooperation of both research groups. This habilitation thesis provides a survey of the theory of graph codes, focussing mainly, but not exclusively, on the author's own research work. We present the theoretical and mathematical background for the analysis of graph codes. The concept of one-way quantum computing for general graph states is discussed. We explicitly show how to realize the encoding and decoding device of a graph code on a one-way quantum computer. This kind of implementation is to be seen as a mathematical description of a quantum memory device. In addition to that, we investigate interaction processes, which enable the creation of graph states on very large systems. Particular graph states can be created, for instance, by an Ising-type interaction between nearest-neighbor particles which sit at the points of an infinitely extended cubic lattice. Based on the theory of quantum cellular automata, we give a constructive characterization of general interactions which create a translationally invariant graph state. (orig.)
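The stabilizer description underlying graph states can be made concrete: each vertex a contributes a generator K_a = X_a ∏_{b∈N(a)} Z_b, where N(a) is the neighborhood of a. The sketch below (an illustration of this standard definition, not code from the thesis) lists these generators as Pauli strings:

```python
def graph_state_stabilizers(n, edges):
    """For each vertex a, build K_a = X_a * prod_{b in N(a)} Z_b
    as an n-character Pauli string over 'I', 'X', 'Z'."""
    neighbors = {a: set() for a in range(n)}
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    generators = []
    for a in range(n):
        pauli = ["I"] * n
        pauli[a] = "X"          # X on the vertex itself
        for b in neighbors[a]:
            pauli[b] = "Z"      # Z on each neighbor
        generators.append("".join(pauli))
    return generators

# 3-vertex path graph 0-1-2; its graph state is local-Clifford
# equivalent to the 3-qubit GHZ state.
print(graph_state_stabilizers(3, [(0, 1), (1, 2)]))
```

The graph state is the unique joint +1 eigenstate of all n generators, which is why each added edge corresponds directly to pairwise entanglement.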
Physics and information are intimately connected, and the ultimate information processing devices will be those that harness the principles of quantum mechanics. Many physical systems have been identified as candidates for quantum information processing, but none of them are immune from errors. The challenge remains to find a path from the experiments of today to a reliable and scalable quantum computer. Here, we develop an architecture based on a simple module comprising an optical cavity containing a single negatively charged nitrogen vacancy center in diamond. Modules are connected by photons propagating in a fiber-optical network and collectively used to generate a topological cluster state, a robust substrate for quantum information processing. In principle, all processes in the architecture can be deterministic, but current limitations lead to processes that are probabilistic but heralded. We find that the architecture enables large-scale quantum information processing with existing technology.
Weeks, G.E.; Daniel, W.E.; Edwards, R.E.; Jannarone, R.J.; Joshi, S.N.; Palakodety, S.S.; Qian, D.
For several decades, computerized information processing systems and human information processing models have developed with a good deal of mutual influence. Any comprehensive psychology text in this decade uses terms that originated in the computer industry, such as "cache" and "memory", to describe human information processing. Likewise, many engineers today are using "artificial intelligence" and "artificial neural network" computing tools that originated as models of human thought to solve industrial problems. This paper concerns a recently developed human information processing model, called "concurrent information processing" (CIP), and a related set of computing tools for solving industrial problems. The problem of focus is adaptive gauge monitoring; the application is pneumatic pressure repeaters (Holledge gauges) used to measure liquid level and density in the Defense Waste Processing Facility and the Integrated DWPF Melter System.
Aalst, van der W.M.P.; Jensen, K.; Aalst, van der W.M.P.
A Process-Aware Information System (PAIS) is a software system that manages and executes operational processes involving people, applications, and/or information sources on the basis of process models. Example PAISs are workflow management systems, case-handling systems, enterprise information
It is generally accepted that human vision is an extremely powerful information processing system that facilitates our interaction with the surrounding world. However, despite extended and extensive research efforts, which encompass many exploration fields, the underlying fundamentals and operational principles of visual information processing in human brain remain unknown. We still are unable to figure out where and how along the path from eyes to the cortex the sensory input perceived by the retina is converted into a meaningful object representation, which can be consciously manipulated by the brain. Studying the vast literature considering the various aspects of brain information processing, I was surprised to learn that the respected scholarly discussion is totally indifferent to the basic keynote question: "What is information?" in general or "What is visual information?" in particular. In the old days, it was assumed that any scientific research approach has first to define its basic departure points. Why was it overlooked in brain information processing research remains a conundrum. In this paper, I am trying to find a remedy for this bizarre situation. I propose an uncommon definition of "information", which can be derived from Kolmogorov's Complexity Theory and Chaitin's notion of Algorithmic Information. Embracing this new definition leads to an inevitable revision of traditional dogmas that shape the state of the art of brain information processing research. I hope this revision would better serve the challenging goal of human visual information processing modeling.
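The algorithmic notion of information invoked here (Kolmogorov, Chaitin) is uncomputable in the exact sense, but a standard computable illustration is to use a compressor as an upper bound on description length: regular data admits a far shorter description than noise-like data. The sketch below is my illustration of that idea, not code from the paper.

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Compressed size in bytes: a computable upper bound on the
    (uncomputable) Kolmogorov complexity of the data."""
    return len(zlib.compress(data, level=9))

structured = b"ab" * 5000                        # highly regular input
random_like = random.Random(0).randbytes(10000)  # pseudo-random input

# Regular data compresses to a tiny description; noise-like data does not.
print(description_length(structured), description_length(random_like))
```

On this view, "visual information" would be measured not by raw sensory bit counts but by the length of the shortest description the brain would need to reproduce the percept.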
Nijstad, B.A.; de Dreu, C.K.W.
Much of the research into group and team functioning looks at groups that perform cognitive tasks, such as decision making, problem solving, and innovation. The Motivated Information Processing in Groups Model (MIP-G; De Dreu, Nijstad, & Van Knippenberg, 2008) conjectures that information processing
Chung, Chih-Hung; Angnakoon, Putthachat; Li, Jessica; Allen, Jeff
Purpose: The purpose of this study is to provide researchers with a better understanding of the cultural impact on information processing in virtual learning environment. Design/methodology/approach: This study uses a causal loop diagram to depict the cultural impact on information processing in the virtual human resource development (VHRD)…
Leach, Mark M; Stoltenberg, Cal D.
The relationship between mood and information processing, particularly when reviewing the Elaboration Likelihood Model of persuasion, lacks conclusive evidence. This study was designed to investigate the hypothesis that information processing would be greater for mood-topic congruence than non mood-topic congruence. Undergraduate students (N=216)…
Zimmermann, Peter; Iwanski, Alexandra
Attachment theory suggests that internal working models of self and significant others influence adjustment during development by controlling information processing and self-regulation. We provide a conceptual overview on possible mechanisms linking attachment and information processing and review the current literature in middle childhood.…
de Dreu, C.K.W.; Beersma, B.
According to the Motivated Information Processing in Groups (MIP-G) model, groups should perform ambiguous (non-ambiguous) tasks better when they have high (low) epistemic motivation and concomitant tendencies to engage in systematic (heuristic) information processing and exchange. The authors
Tempelmans Plat, H.; Deiman, E.P.; Beheshti, M.R.; Zreik, K.
Adequate decision making in the design process needs information about cost consequences over the life of the designed object. In succeeding stages the types of decisions change; as a consequence, the type of cost information will differ as well. For each stage cost information about realized
Information is defined as the basic resource of enterprise activity. A suggestion is made concerning the selection of information subsystems for strategic, tactical, and operational management. A list of indicators is offered for assessing the information support of an enterprise's functional processes.
de Dreu, C.K.W.; Nijstad, B.A.; van Knippenberg, D.
This article expands the view of groups as information processors into a motivated information processing in groups (MIP-G) model by emphasizing, first, the mixed-motive structure of many group tasks and, second, the idea that individuals engage in more or less deliberate information search and
Hu, Dewen; Liu, Huaping
"Foundations and Practical Applications of Cognitive Systems and Information Processing" presents selected papers from the First International Conference on Cognitive Systems and Information Processing, held in Beijing, China on December 15-17, 2012 (CSIP2012). The aim of this conference is to bring together experts from different fields of expertise to discuss the state-of-the-art in artificial cognitive systems and advanced information processing, and to present new findings and perspectives on future development. This book introduces multidisciplinary perspectives on the subject areas of Cognitive Systems and Information Processing, including cognitive sciences and technology, autonomous vehicles, cognitive psychology, cognitive metrics, information fusion, image/video understanding, brain-computer interfaces, visual cognitive processing, neural computation, bioinformatics, etc. The book will be beneficial for both researchers and practitioners in the fields of Cognitive Science, Computer Science and Cogni...
and improve business processes. As a consequence, there is a growing need to address managerial aspects of the relationships between information technologies and business processes. The aim of this PhD study is to investigate how the practice of conjoint management of business processes and information technologies can be supported and improved. The study is organized into five research papers and this summary. Each paper addresses a different aspect of conjoint management of business processes and information technologies, i.e. problem development and managerial practices on software... and information technologies in a project environment. It states that both elements are intrinsically related and should be designed and considered together. The second case examines the relationships between information technology management and business process management. It discusses the multi-faceted role
Angelarosa Longo; Viviana Ventre
Rational models in decision processes are marked out by many anomalies, caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single or multi agent decision process each mental model is influenced by the presence, the absence or false information about the problem or about other members of the decision making group. The difficulty in modeling these effects increases because behavioral biases influence also the modeler. Behavioral Operational Research (BOR) studies these influences to create efficient models to define choices in similar decision processes.
Reichenbach, Alexandra; Diedrichsen, Jörn
A recent study suggests that reafferent hand-related visual information utilizes a privileged, attention-independent processing channel for motor control. This process was termed visuomotor binding to reflect its proposed function: linking visual reafferences to the corresponding motor control centers. Here, we ask whether the advantage of processing reafferent over exafferent visual information is a specific feature of the motor processing stream or whether the improved processing also benefits the perceptual processing stream. Human participants performed a bimanual reaching task in a cluttered visual display, and one of the visual hand cursors could be displaced laterally during the movement. We measured the rapid feedback responses of the motor system as well as matched perceptual judgments of which cursor was displaced. Perceptual judgments were either made by watching the visual scene without moving or made simultaneously to the reaching tasks, such that the perceptual processing stream could also profit from the specialized processing of reafferent information in the latter case. Our results demonstrate that perceptual judgments in the heavily cluttered visual environment were improved when performed based on reafferent information. Even in this case, however, the filtering capability of the perceptual processing stream suffered more from the increasing complexity of the visual scene than the motor processing stream. These findings suggest partly shared and partly segregated processing of reafferent information for vision for motor control versus vision for perception.
Dykas, Matthew J; Ehrlich, Katherine B; Cassidy, Jude
This chapter describes theory and research on intergenerational connections between parents' attachment and children's social information processing, as well as between parents' social information processing and children's attachment. The chapter begins with a discussion of attachment theorists' early insights into the role that social information processing plays in attachment processes. Next, current theory about the mechanisms through which cross-generational links between attachment and social information processing might emerge is presented. The central proposition is that the quality of attachment and/or the social information processing of the parent contributes to the quality of attachment and/or social information processing in the child, and these links emerge through mediating processes related to social learning, open communication, gate-keeping, emotion regulation, and joint attention. A comprehensive review of the literature is then presented. The chapter ends with the presentation of a current theoretical perspective and suggestions for future empirical and clinical endeavors.
Hamilton, Rachel K B; Newman, Joseph P
Hamilton and colleagues (2015) recently proposed that an integrative deficit in psychopathy restricts simultaneous processing, thereby leaving fewer resources available for information encoding, narrowing the scope of attention, and undermining associative processing. The current study evaluated this parallel processing deficit proposal using the Simultaneous-Sequential paradigm. This investigation marks the first a priori test of Hamilton et al.'s theoretical framework. We predicted that psychopathy would be associated with inferior performance (as indexed by lower accuracy and longer response time) on trials requiring simultaneous processing of visual information relative to trials necessitating sequential processing. Results were consistent with these predictions, supporting the proposal that psychopathy is characterized by a reduced capacity to process multicomponent perceptual information concurrently. We discuss the potential implications of impaired simultaneous processing for the conceptualization of the psychopathic deficit.
Lamp, Sandra A.
There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…
The present article highlights the importance of the motivational construct for the foreign language learning (FLL) process. More specifically, in the present article it is argued that motivation is likely to play a significant role at all three stages of the FLL process as they are discussed within the information processing model of FLL, namely,…
Occelli, Valeria; Spence, Charles; Zampini, Massimiliano
We highlight the results of those studies that have investigated the plastic reorganization processes that occur within the human brain as a consequence of visual deprivation, as well as how these processes give rise to behaviorally observable changes in the perceptual processing of auditory and tactile information. We review the evidence showing…
Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang
This paper proposes a computational model of memory from the view of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; an intra-modular network is then developed with a modeling algorithm that maps nodes and edges, and the bi-modular network is delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. The memory phenomena and the functions of memorization and strengthening are simulated by information-processing algorithms. The theoretical analysis and the simulation results show that the model accords with memory phenomena from the information-processing view.
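The SMIRN model described above is, structurally, a bi-modular graph with intra- and inter-modular edges plus a polynomial-time retrieval procedure. A minimal illustration of such a structure, using a plain adjacency-set representation; the names (`BiModularNetwork`, `edge_kind`, `retrieve`) and the breadth-first retrieval are invented for this sketch and are not taken from the paper:

```python
from collections import deque

class BiModularNetwork:
    """Toy bi-modular graph: every node belongs to module 0 or 1, and
    edges are classified as intra-modular (same module) or
    inter-modular (across modules)."""

    def __init__(self):
        self.module = {}   # node -> 0 or 1
        self.adj = {}      # node -> set of neighbouring nodes

    def add_node(self, node, module):
        self.module[node] = module
        self.adj.setdefault(node, set())

    def add_edge(self, a, b):
        self.adj[a].add(b)
        self.adj[b].add(a)

    def edge_kind(self, a, b):
        return "intra" if self.module[a] == self.module[b] else "inter"

    def retrieve(self, cue, target):
        """Breadth-first search from a cue node to a stored item;
        returns the retrieval path or None.  BFS is linear (hence
        polynomial) in the size of the network."""
        parent = {cue: None}
        queue = deque([cue])
        while queue:
            node = queue.popleft()
            if node == target:
                path = []
                while node is not None:
                    path.append(node)
                    node = parent[node]
                return path[::-1]
            for nxt in self.adj[node]:
                if nxt not in parent:
                    parent[nxt] = node
                    queue.append(nxt)
        return None
```

A retrieval that crosses modules simply traverses an inter-modular edge on its path, which is how the sketch mirrors the intra-/inter-modular distinction drawn in the paper.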
Recent psychophysical evidence indicates that the vertical arrangement of horizontal information is particularly important for encoding facial identity. In this paper we extend this notion to examine the role that information at different (particularly cardinal) orientations might play in a number of established phenomena, each a behavioural "signature" of face processing. In particular we consider (a) the face inversion effect (FIE), (b) the facial identity after-effect, (c) face-matching across viewpoint, and (d) interactive, so-called holistic, processing of face parts. We report that filtering faces to remove all but the horizontal information largely preserves these effects but, conversely, retaining only the vertical information generally diminishes or abolishes them. We conclude that preferential processing of horizontal information is a central feature of human face processing that supports many of the behavioural signatures of this critical visual operation.
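The orientation-filtering manipulation described above can be sketched as a mask in the 2-D Fourier plane. This is a generic illustration rather than the authors' stimulus pipeline; note that horizontal image structure (such as the rows of a face) carries its energy along the vertical frequency axis, which corresponds to `angle_deg=90` under the `arctan2(fy, fx)` convention used here:

```python
import numpy as np

def orientation_filter(image, angle_deg, tol_deg=20.0):
    """Keep only the spatial-frequency components whose orientation lies
    within +/- tol_deg of angle_deg; a crude sketch of the horizontal/
    vertical filtering used in the orientation literature."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    # Orientation of each frequency component, folded into [0, 180).
    theta = np.degrees(np.arctan2(fy, fx)) % 180.0
    diff = np.minimum(np.abs(theta - angle_deg),
                      180.0 - np.abs(theta - angle_deg))
    mask = (diff <= tol_deg).astype(float)
    mask[0, 0] = 1.0                      # always keep the mean (DC)
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * mask))
```

Filtering a horizontal grating with `angle_deg=90` leaves it essentially unchanged, while `angle_deg=0` removes it, mirroring the "remove all but the horizontal information" manipulation.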
How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.
Frey, Seth; Albino, Dominic K; Williams, Paul L
There is a tendency in decision-making research to treat uncertainty only as a problem to be overcome. But it is also a feature that can be leveraged, particularly in social interaction. Comparing the behavior of profitable and unprofitable poker players, we reveal a strategic use of information processing that keeps decision makers unpredictable. To win at poker, a player must exploit public signals from others. But using public inputs makes it easier for an observer to reconstruct that player's strategy and predict his or her behavior. How should players trade off between exploiting profitable opportunities and remaining unexploitable themselves? Using a recent multivariate approach to information theoretic data analysis and 1.75 million hands of online two-player No-Limit Texas Hold'em, we find that the important difference between winning and losing players is not in the amount of information they process, but how they process it. In particular, winning players are better at integrative information processing: creating new information from the interaction between their cards and their opponents' signals. We argue that integrative information processing does not just produce better decisions, it makes decision-making harder for others to reverse engineer, as an expert poker player's cards act like the private key in public-key cryptography. Poker players encrypt their reasoning with the way they process information. The encryption function of integrative information processing makes it possible for players to exploit others while remaining unexploitable. By recognizing the act of information processing as a strategic behavior in its own right, we offer a detailed account of how experts use endemic uncertainty to conceal their intentions in high-stakes competitive environments, and we highlight new opportunities between cognitive science, information theory, and game theory.
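The "integrative information processing" above is measured with multivariate information theory. As a first-principles illustration (not the partial-information-decomposition estimator used in the paper), interaction information captures the flavor: it is positive when two sources, such as a player's cards and an opponent's signals, jointly predict an outcome better than the sum of their individual contributions:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of a list of hashable outcomes."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(samples).values())

def mutual_info(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def interaction_info(xs, ys, zs):
    """I(X,Y;Z) - I(X;Z) - I(Y;Z): positive values indicate that X and
    Y jointly carry information about Z beyond their individual shares,
    a coarse stand-in for the synergistic ('integrative') component
    isolated by partial information decomposition."""
    joint = list(zip(xs, ys))
    return (mutual_info(joint, zs)
            - mutual_info(xs, zs)
            - mutual_info(ys, zs))
```

For the XOR relation each input alone carries zero information about the output, yet together they determine it completely: a purely synergistic, one-bit interaction.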
Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K
The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition, after which they completed a manipulation check and the initial 15-item questionnaire, and completed the questionnaire again two weeks later. The questionnaire was subjected to factor, reliability, and validity analyses at both measurement times for purposes of cross-validation of the results. A two-factor solution was observed, representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities.
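Reliability analyses of the kind reported in Study 1 are commonly summarized with Cronbach's alpha. A from-scratch sketch using population variances; this is a generic illustration of the formula, not the authors' analysis code:

```python
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, all covering
    the same respondents in the same order.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))"""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Perfectly covarying items yield an alpha of 1, while items that do not hang together drive alpha toward (or below) zero, which is why the statistic is read as internal consistency of a subscale.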
Gómez, Jaime; Salazar, Idana; Vargas, Pilar
In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent.
In this paper we examine the domain of information search and propose a "goal-based" approach to studying search strategy. We describe "goal-based information search" using a framework of Knowledge Discovery. We identify two Information Retrieval (IR) goals using the constructs of Knowledge Acquisition (KA) and Knowledge Explanation (KE). We classify these constructs into two specific information problems: an exploration-exploitation problem and an implicit-explicit problem. Our proposed framework is an extension of prior work in this domain, applying an IR Process Model originally developed for Legal-IR and adapted to Medical-IR. The approach in this paper is guided by the recent ACM-SIG Medical Information Retrieval (MedIR) Workshop definition: "methodologies and technologies that seek to improve access to medical information archives via a process of information retrieval."
Yu, Yang; Krishnamachari, Bhaskar
This book presents state-of-the-art cross-layer optimization techniques for energy-efficient information processing and routing in wireless sensor networks. Besides providing a survey of this important research area, the book discusses three specific topics in detail: information processing in a collocated cluster, information transport over a tree substrate, and information routing for computationally intensive applications. The book covers several important system knobs for cross-layer optimization, including voltage scaling, rate adaptation, and tunable compression. By exploring tradeoffs of en
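One of the cross-layer knobs mentioned above, tunable compression, trades CPU energy spent compressing against radio energy saved on transmission. A toy selection routine; the linear cost model and all constants are invented for illustration and are not from the book:

```python
def best_compression_level(raw_bits, levels, e_cpu_per_bit, e_tx_per_bit):
    """Pick the compression level that minimises total node energy.
    Each level is (name, compression_ratio, cpu_cost_factor), with
      compute energy = cpu_cost_factor * e_cpu_per_bit * raw_bits
      radio energy   = e_tx_per_bit * raw_bits / compression_ratio."""
    def energy(level):
        _name, ratio, cpu_factor = level
        return (cpu_factor * e_cpu_per_bit * raw_bits
                + e_tx_per_bit * raw_bits / ratio)
    return min(levels, key=energy)

# Hypothetical levels: no compression, light, and heavy (higher ratio,
# more CPU work per input bit).
LEVELS = [("none", 1.0, 0.0), ("light", 2.0, 1.0), ("heavy", 4.0, 5.0)]
```

When the radio dominates the energy budget, aggressive compression wins; when transmission is cheap, compressing at all only wastes CPU energy, which is the essence of the cross-layer tradeoff.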
Policy makers must consider the work force, technology, cost, and legal implications of their legislative proposals. AHIMA, AAMT, CHIA, and MTIA urge lawmakers to craft regulatory solutions that enforce HIPAA and support advancements in modern health information processing practices that improve the quality and reduce the cost of healthcare. We also urge increased investment in health information work force development and in the implementation of new technologies to advance critical healthcare outcomes: timely, accurate, accessible, and secure information to support patient care. It is essential that state legislatures reinforce the importance of improving information processing solutions for healthcare and not take actions that will produce unintended and detrimental consequences.
Sophia R. Sklan
Phonons, the quanta of mechanical vibration, are important to the transport of heat and sound in solid materials. Recent advances in the fundamental control of phonons (phononics) have brought into prominence the potential role of phonons in information processing. In this review, the many directions of realizing phononic computing and information processing are examined. Given the relative similarity of vibrational transport at different length scales, the related fields of acoustic, phononic, and thermal information processing are all included, as are quantum and classical computer implementations. Connections are made between the fundamental questions in phonon transport and phononic control and the device-level approach to diodes, transistors, memory, and logic.
Bazua Rueda, L.F.
The author worked, first at the National Institute of Nuclear Energy (Mexico) and then at URAMEX (Uranio Mexicano), from 1975 to 1983 on radiometric and magnetometric aerial prospecting projects, in aspects of computerized information processing. During this period the author participated in the development of computing systems, information processing, and the definition of mathematical procedures for the geophysical reduction of the calibration equipment data. Drawing on this experience, this thesis presents aspects concerning the management and operation of computerized information-processing systems. Operation handbooks for most of the modules are presented. Program listings are not included. (Author)
This paper presents and discusses a survey describing how small and medium-sized enterprises (SMEs) implement and use their information systems with respect to their logistic and production processes. The study first describes the rationale of the research, then identifies the characteristics of the companies and their general attitude towards information technology (IT). The paper then presents a set of detailed processes to verify the structure and workflow of the companies and how IT supports their processes. In the last part we study the influence of some company characteristics on the effective use of processes and on the different technological approaches supporting the defined logistic and production processes. The novelty and interest of the study, in academic and institutional contexts as in the real world, lies in the opportunity to verify and understand the different attitudes of SMEs towards information technology in defining, organizing, planning and controlling their processes.
Serghey A. Amelkin
The finite-time approach allows one to optimize regimes of processes in macrosystems when the duration of the processes is restricted. The driving force of the processes is a difference of intensive variables: temperatures in thermodynamics, values in economics, etc. In microeconomic systems, two counterflow fluxes appear due to the single driving force: fluxes of goods and of money. Another possible case is two fluxes with the same direction. Processes of information exchange can be described by this formalism.
Through the application of specialized systems, future-oriented information processing integrates the sciences of processes, control systems, process-control strategies, user behaviour and ergonomics. Improvements in process control can be attained, inter alia, by preparing the information content (e.g. by suppressing the raw flow of signals and replacing it with substance-based signals) and also by an ergonomic representation of the process under study. (orig.)
Université de Genève
Geneva University Physics Department 24, Quai Ernest Ansermet CH-1211 Geneva 4 Monday 11 April 2011 17h00 - Ecole de Physique, Auditoire Stückelberg The optical route to quantum information processing Prof. Terry Rudolph/Imperial College, London Photons are attractive as carriers of quantum information both because they travel, and can thus transmit information, but also because of their good coherence properties and ease in undergoing single-qubit manipulations. The main obstacle to their use in information processing is inducing an effective interaction between them in order to produce entanglement. The most promising approach in photon-based information processing architectures is so-called measurement-based quantum computing. This relies on creating upfront a multi-qubit highly entangled state (the cluster state) which has the remarkable property that, once prepared, it can be used to perform quantum computation by making only single qubit measurements. In this talk I will discuss generically the...
Parker, Jonathan K.
Discusses the benefits of lecturing, when done properly, in high schools. Describes the positive attributes of effective lecturers. Provides a human information-processing model applicable to the task of lecturing to students. (HB)
Bliss-Moreau, Eliza; Moadab, Gilda; Machado, Christopher J
Despite evolutionary claims about the function of facial behaviors across phylogeny, rarely are those hypotheses tested in a comparative context, that is, by evaluating how nonhuman animals process such behaviors. Further, while increasing evidence indicates that humans make meaning of faces by integrating contextual information, including that from the body, the extent to which nonhuman animals process contextual information during affective displays is unknown. In the present study, we evaluated the extent to which rhesus macaques (Macaca mulatta) process dynamic affective displays of conspecifics that included both facial and body behaviors. Contrary to hypotheses that they would preferentially attend to faces during affective displays, monkeys looked for longest, most frequently, and first at conspecifics' bodies rather than their heads. These findings indicate that macaques, like humans, attend to available contextual information during the processing of affective displays, and that the body may also be providing unique information about affective states.
The performance of present-day information technologies has two main properties: the universality of the structures used and the flexibility of the final user's interfaces. The first determines the potential coverage of the informational domain; the second determines the diversity and efficiency of the processing methods for the procedures being automated. These aspects are of great importance in agriculture and ecology, where the processes are complex and the volumes of information used are considerable. For example, meteorological processes are part of ecological ones, as existential conditions of habitats, and are known to pose a complex forecasting problem that requires considerable computational resources to solve the appropriate equations. Likewise, agriculture, as a controlled activity under strong impact from natural conditions, has the same high requirements for diverse structures and flexible information processing.
V. A. Matyushenko
Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction of the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of activities of an institution. The purpose of this paper is the development of a system that automates the preparation of accounting documents. The article describes the problem of preparing the documents of the educational process. It was decided to design and build the information system in the Microsoft Access environment. The result is four types of reports obtained using the developed system, which now makes it possible to automate the process and reduce the effort required to prepare accounting documents. All reports were implemented in the Microsoft Excel software product and can be used for further analysis and processing.
Rawl, Ruth K.; O'Tuel, Frances S.
The basic assumptions of information processing theories in cognitive psychology are reviewed, and the application of this approach to problem solving in gifted education is considered. Specific implications are cited on problem selection and instruction giving. (CL)
Cash, Philip; Kreye, Melanie
suggestions for improvements and support. One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare whether the new knowledge they have gained covers the previous knowledge gap. In engineering design, uncertainty plays a key role, particularly in the early design stages, which has been … the new knowledge is shared between the design team to reduce ambiguity with regard to its meaning and to build a shared understanding, reducing perceived uncertainty. Thus, we propose that Information-Processing Theory is suitable to describe designer activity in the early design stages…
Wollstadt, Patricia; Sellers, Kristin K; Rudelt, Lucas; Priesemann, Viola; Hutt, Axel; Fröhlich, Flavio; Wibral, Michael
The disruption of coupling between brain areas has been suggested as the mechanism underlying loss of consciousness in anesthesia. This hypothesis has been tested previously by measuring the information transfer between brain areas, and by taking reduced information transfer as a proxy for decoupling. Yet, information transfer is a function of the amount of information available in the information source, such that transfer decreases even for unchanged coupling when less source information is available. Therefore, we reconsidered past interpretations of reduced information transfer as a sign of decoupling, and asked whether impaired local information processing leads to a loss of information transfer. An important prediction of this alternative hypothesis is that changes in locally available information (signal entropy) should be at least as pronounced as changes in information transfer. We tested this prediction by recording local field potentials in two ferrets after administration of isoflurane in concentrations of 0.0%, 0.5%, and 1.0%. We found strong decreases in the source entropy under isoflurane in area V1 and the prefrontal cortex (PFC), as predicted by our alternative hypothesis. The decrease in source entropy was stronger in PFC compared to V1. Information transfer between V1 and PFC was reduced bidirectionally, but with a stronger decrease from PFC to V1. This links the stronger decrease in information transfer to the stronger decrease in source entropy, suggesting that reduced source entropy reduces information transfer. This conclusion fits the observation that the synaptic targets of isoflurane are located in local cortical circuits rather than on the synapses formed by interareal axonal projections. Thus, changes in information transfer under isoflurane seem to be a consequence of changes in local processing more than of decoupling between brain areas. We suggest that source entropy changes must be considered whenever interpreting changes in information transfer.
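Both quantities contrasted above, source (signal) entropy and information transfer, can be estimated from discretized recordings. A minimal plug-in estimator of transfer entropy with history length 1, in bits; analyses like the one summarized above use embedding, bias correction, and significance testing, so treat this purely as an illustration of the definitions:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of a sequence of discrete symbols."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(samples).values())

def transfer_entropy(x, y):
    """TE(X -> Y) = H(Y_t | Y_(t-1)) - H(Y_t | Y_(t-1), X_(t-1)),
    estimated from equal-length sequences of discrete symbols
    (e.g. binned field-potential amplitudes)."""
    yt, yp, xp = y[1:], y[:-1], x[:-1]
    h_y_pair = entropy(list(zip(yt, yp)))       # H(Y_t, Y_(t-1))
    h_y_past = entropy(yp)                      # H(Y_(t-1))
    h_triple = entropy(list(zip(yt, yp, xp)))   # H(Y_t, Y_(t-1), X_(t-1))
    h_pasts = entropy(list(zip(yp, xp)))        # H(Y_(t-1), X_(t-1))
    return h_y_pair - h_y_past - h_triple + h_pasts
```

When `y` simply copies `x` with a one-step lag, knowing X_(t-1) fully determines Y_t and the estimated transfer is large; when `x` is constant (low source entropy) it contributes nothing and the estimate is zero, the dependence of transfer on source entropy that the study stresses.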
Kilpinen, R; Saunamäki, T; Jehkonen, M
To provide a comprehensive review of studies on information processing speed in patients with obstructive sleep apnea syndrome (OSAS) as compared to healthy controls and normative data, and to determine whether continuous positive airway pressure (CPAP) treatment improves information processing speed. A systematic review was performed on studies drawn from Medline and PsycINFO (January 1990-December 2011) and identified from lists of references in these studies. After applying inclusion criteria, 159 articles were left for abstract review, and after applying exclusion criteria, 44 articles were fully reviewed. The number of patients in the studies reviewed ranged from 10 to 157, and the study samples consisted mainly of men. Half of the studies reported that patients with OSAS showed reduced information processing speed when compared to healthy controls. Reduced information processing speed was seen more often (75%) when compared to norm-referenced data. Psychomotor speed seemed to be particularly liable to change. CPAP treatment improved processing speed, but the improvement was marginal when compared to placebo or conservative treatment. Patients with OSAS are affected by reduced information processing speed, which may persist despite CPAP treatment. Information processing is usually assessed as part of other cognitive functioning, not as a cognitive domain per se. However, it is important to take account of information processing speed when assessing other aspects of cognitive functioning. This will make it possible to determine whether cognitive decline in patients with OSAS is based on lower-level or higher-level cognitive processes or both. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
The retina translates light into neuronal activity. Thus, it renders visual information of the external environment. The retina can only send a limited amount of information to the brain within a given period. To use this amount optimally, light stimuli are strongly processed in the retina.
Wolleat, Patricia L.
Information processing theory could be made more sensitive to differences in career outcomes for males and females by (1) examining the nature of the career decision; (2) expanding the notion of information; (3) relating the vocational schema to the gender schema; and (4) noting whether variables are general, sex related, or sex specific. (SK)
The positioning process of marketing used by special libraries and information centers involves two key decisions from which other decisions are derived: to which user groups marketing programs and services will be directed; and which information needs will be served. Two cases are discussed and a bibliography is provided. (EJS)
Laudato, Nicholas C.; DeSantis, Dennis J.
The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…
Оксана Николаевна Ромашкова
This work concerns the information model of an educational complex which includes several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is considered, and a matrix management structure is suggested. The basic management information processes of the educational complex were conceptualized.
Successful processing of quantum information is, to a large degree, based on two aspects: a) the implementation of high-fidelity quantum gates, and b) avoiding or suppressing decoherence processes that destroy quantum information. The presented work shows our progress in the field of experimental quantum information processing over the last years: the implementation and characterisation of several quantum operations, amongst others the first realisation of the quantum Toffoli gate in an ion-trap based quantum computer. The creation of entangled states with up to 14 qubits serves as a basis for investigations of decoherence processes. Based on the realised quantum operations as well as the knowledge about dominant noise processes in the employed apparatus, entanglement swapping as well as quantum operations within a decoherence-free subspace are demonstrated. (author)
Detecting, investigating and prosecuting cybercrime? Extremely important, but not really the solution to the problem. Prevention is better! The sectors that have joined the Cybercrime Information Exchange have accepted the challenge of ensuring the effectiveness of the (information) security of process control systems (PCS), including SCADA. This publication makes it clear why it is vital that organizations establish and maintain control over the security of the information and communication...
Weber, Darren L
This review considers theory and evidence for abnormal information processing in post-traumatic stress disorder (PTSD). Cognitive studies have indicated sensitivity in PTSD for traumatic information, more so than general emotional information. These findings were supported by neuroimaging studies that identify increased brain activity during traumatic cognition, especially in affective networks (including the amygdala, orbitofrontal and anterior cingulate cortex). In theory, it is proposed th...
Ross, M. D.
Study of montages, tracings and reconstructions prepared from a series of 570 consecutive ultrathin sections shows that rat maculas are morphologically organized for parallel processing of linear acceleratory information. Type II cells of one terminal field distribute information to neighboring terminals as well. The findings are examined in light of physiological data which indicate that macular receptor fields have a preferred directional vector, and are interpreted by analogy to a computer technology known as an information network.
Yakov Mikhajlovich Dalinger
The organization of service production attributed to airport activity is analyzed. The importance and timeliness of solving the problem of information interaction between production processes, as a problem of organizing modern production, are shown. Possibilities and features of constructing an information interaction system in the form of a multi-level hierarchical structure are shown. The airport is considered as an enterprise aimed at service production, where much information must be analyzed within a limited time-frame. The production schedule often changes under the influence of many factors. This leads to an increased role of computerization and informatization of production processes, which predetermines automation of production, creation of an information environment, and organization of the information interaction needed for the realization of production processes. The integrated organization form is proposed because it is oriented towards the integration of different processes into a universal production system and allows coordinating the local goals of particular processes in the context of the global purpose of improving the effectiveness of the airport's activity. The main conditions needed for organizing information interaction between production processes and technological operations are considered, and a list of the attendant problems is determined. Attention is paid to the necessity of compatibility of the structure and organization of the interaction system with the conditions of the airline, and the necessity of its reflection in the information space of the airline. The usefulness of the integrated organization form of information interaction, based on information exchange between processes and service customers according to the network structure, is explained. The multi-level character of this structure confirms its advantage over other forms, though it also has a series of features, as presented.
This paper analyzes the quality of the administration of information, identifying deficiencies in the information systems used in the negotiation process for concession of bank credit to small and mid-sized companies, from the business managers' perspective. The results make the deficiencies evident and confirm the need for change in the systems of administration of information, in order to allow both an improvement in the bank credit negotiation process and greater economic efficiency of the available resources.
In essence, the process of maintaining equipment is a support process, because it indirectly contributes to the operational ability of the production process necessary for the supply chain of the new value. Taking into account increased levels of automatization and quality, this process becomes more and more significant and, for some branches of industry, even crucial. Due to the fact that the quality of the entire process is more and more dependent on the maintenance process, these processes must be carefully designed and effectively implemented. There are various techniques and approaches at our disposal, such as technical, logistical and intensive application of information-communication technologies. This last approach is presented in this work. It begins with organizational goals, especially quality objectives. Then, maintenance processes and integrated information system structures are defined. Maintenance process quality and improvement processes are defined using a set of performances, with a special emphasis placed on effectiveness and quality economics. At the end of the work, an information system for improving maintenance economics is structured. Besides theoretical analysis, the work also presents results the authors obtained analyzing the food industry, metal processing industry and building materials industry.
The paper presents an overview of some problems of information science which are explicitly portrayed in the literature. It covers the following issues: information explosion, information flood and data deluge, information retrieval and relevance of information, and finally, the problem of scientific communication. The purpose of this paper is to explain why knowledge acquisition can be considered an issue in the information sciences. The existing theoretical foundation within the information sciences, i.e. the DIKW hierarchy and its key concepts - data, information, knowledge and wisdom - is recognized as a symbolic representation as well as the theoretical foundation of the knowledge acquisition process. Moreover, it seems that the relationship between the DIKW hierarchy and the knowledge acquisition process is essential for a stronger foundation of information sciences in the 'body' of the overall human knowledge. In addition, the history of both human and machine knowledge acquisition has been considered, as well as a proposal that the DIKW hierarchy take its place as a symbol of the general knowledge acquisition process, which could equally relate to both human and machine knowledge acquisition. To achieve this goal, it is necessary to modify the existing concept of the DIKW hierarchy. An appropriate modification of the DIKW hierarchy (one of which is presented in this paper) could result in a much more solid theoretical foundation of the knowledge acquisition process and information sciences as a whole. The theoretical assumptions on which the knowledge acquisition process may be established as a problem of information science are presented at the end of the paper. The knowledge acquisition process does not necessarily have to be the subject of epistemology. It may establish a stronger link between the concepts of data and knowledge; furthermore, it can be used in the context of scientific research, but on the more primitive level than conducting
Rose, Susan A.; Feldman, Judith F.; Jankowski, Jeffery J.
This study examined the relation of 3-year core information-processing abilities to lexical growth and development. The core abilities covered four domains--memory, representational competence (cross-modal transfer), processing speed, and attention. Lexical proficiency was assessed at 3 and 13 years with the Peabody Picture Vocabulary Test (PPVT)…
Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander
People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…
Beer, Randall D.; Williams, Paul L.
There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we…
Fulk, Janet; And Others
Presents a model to examine how social influence processes affect individuals' attitudes toward communication media and media use behavior, integrating two research areas: media use patterns as the outcome of objectively rational choices and social information processing theory. Asserts (in a synthesis) that media characteristics and attitudes are…
Romantsov, V. A.; Dyubkin, I. A.; Klyukbin, L. N.
An automated system for primary and scientific analysis of deep water hydrological information is presented. Primary processing of the data in this system is carried out on a drifting station, which also calculates the parameters of vertical stability of the sea layers, as well as their depths and altitudes. Methods of processing the raw data are described.
Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…
Lee, Jee Hoh; Kim, Tae Hwan; Choi, Kwang; Chung, Hyun Sook; Keum, Jong Yong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
This project aims to establish a high-quality information circulation system by developing a serials-control system that improves serials management, from ordering to distribution and availability, for R and D, and to improve the quality of information service needed in R and D through fast retrieval and provision of research information with CD-Net. The results of the project are as follows. 1. The serials management process, from ordering to distribution, achieves higher efficiency through development of the subscription information system. 2. Systematic control of each issue of serials is achieved by development of the serials checking system. 3. Volume and number information of currently received issues can be provided to researchers promptly through improvement of the serials holding information system. 4. Retrieval of research information contained in various CD-ROM databases throughout KAERI-NET is made possible by research on construction methods of CD-Net. 2 figs, 25 refs. (Author).
Oprea, I.; Oprea, M.; Stoica, M.; Badea, E.; Guta, V.
The real-time information and processing system has as its main task to record, collect, process and transmit radiation level and weather data; it is proposed for radiation protection, environmental monitoring around nuclear facilities, and civil defence. Such a system can offer information for mapping, databases, modelling and communication, and can assess the consequences of nuclear accidents. The system incorporates a number of stationary or mobile radiation monitoring units, a weather parameter measuring station, a GIS-based information processing center and the communication network, all running on a real-time operating system. It provides automatic on-line and off-line data collection, remote diagnostics, and advanced presentation techniques, including graphically oriented executive support, which can respond to an emergency by geographical representation of the hazard zones on the map. The system can be integrated into national or international environmental monitoring systems, being based on local intelligent measuring and transmission units, with simultaneous processing and data presentation using a real-time operating system for PC and a geographical information system (GIS). Such an integrated system is composed of independent applications operating on the same computer, capable of improving the protection of the population and supporting decision makers by updating the remote GIS database. All information can be managed directly from the map through multilevel data retrieval and presentation, using on-line dynamic evolution of events, environment information, evacuation optimization, and image and voice processing
The information technology outsourcing relationship is one of the key issues in IT outsourcing success. To explore how to manage and promote the IT outsourcing relationship, it is necessary to understand its evolution process. Firstly, the types of IT outsourcing based on relationship quality and IT outsourcing project level are analyzed; secondly, two evolution process models of the IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT outsourcing relationship evolution process is described; finally, an IT outsourcing relationship evolution process model is developed, and the development of the IT outsourcing relationship from low to high under internal and external forces is explained.
Đurović Aleksandar M.
This paper examines the role of information communication technologies in reengineering processes. A general analysis of a process shows that information communication technologies improve its efficiency. A reengineering model based on the BPMN 2.0 standard is applied to the process of seeking an internship/job by students from the Faculty of Transport and Traffic Engineering. After defining the technical characteristics and required functionalities, a web/mobile application is proposed, enabling better visibility of traffic engineers to companies seeking that education profile.
Ceklic, Tijana; Bastien, Célyne H
Insomnia sufferers (INS) are cortically hyperaroused during sleep, which seems to translate into altered information processing during nighttime. While information processing, as measured by event-related potentials (ERPs), during wake appears to be associated with sleep quality of the preceding night, the existence of such an association during nighttime has never been investigated. This study aims to investigate nighttime information processing among good sleepers (GS) and INS while considering concomitant sleep quality. Following a multistep clinical evaluation, INS and GS participants underwent 4 consecutive nights of PSG recordings in the sleep laboratory. Thirty-nine GS (mean age 34.56±9.02) and twenty-nine INS (mean age 43.03±9.12) were included in the study. ERPs (N1, P2, N350) were recorded all night on Night 4 (oddball paradigm) during NREM sleep. Regardless of sleep quality, INS presented a larger N350 amplitude during SWS (p=0.042) while GS showed a larger N350 amplitude during late-night stage 2 sleep (p=0.004). Regardless of diagnosis, those who slept objectively well showed a smaller N350 amplitude (p=0.020) while those who slept subjectively well showed a smaller P2. Information processing seems to be associated with concomitant subjective and objective sleep quality for both GS and INS. However, INS show an alteration in information processing during sleep, especially for inhibition processes, regardless of their sleep quality. Copyright © 2015 Elsevier B.V. All rights reserved.
Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas
Humans are selective information processors who efficiently filter out goal-inappropriate stimulus information to gain control over their actions. Nonetheless, stimuli which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors") frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases we applied EEG recording in a stimulus identification task, involving successive distractor-target presentation, and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information. Copyright © 2014 the authors.
Aso, Kenji; Hanakawa, Takashi; Aso, Toshihiko; Fukuyama, Hidenao
The neural basis of temporal information processing remains unclear, but it is proposed that the cerebellum plays an important role through its internal clock or feed-forward computation functions. In this study, fMRI was used to investigate the brain networks engaged in perceptual and motor aspects of subsecond temporal processing without accompanying coprocessing of spatial information. Direct comparison between perceptual and motor aspects of time processing was made with a categorical-design analysis. The right lateral cerebellum (lobule VI) was active during a time discrimination task, whereas the left cerebellar lobule VI was activated during a timed movement generation task. These findings were consistent with the idea that the cerebellum contributed to subsecond time processing in both perceptual and motor aspects. The feed-forward computational theory of the cerebellum predicted increased cerebro-cerebellar interactions during time information processing. In fact, a psychophysiological interaction analysis identified the supplementary motor and dorsal premotor areas, which had a significant functional connectivity with the right cerebellar region during a time discrimination task and with the left lateral cerebellum during a timed movement generation task. The involvement of cerebro-cerebellar interactions may provide supportive evidence that temporal information processing relies on the simulation of timing information through feed-forward computation in the cerebellum.
Gao, S.; Mioc, Darka; Yi, X.L.
Background: There is great concern within health surveillance on how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data. For the representation of health information through Web-mapping applications, there still lacks a standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been... [The system] facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution in better health data representation and initial exploration of the Web-based processing of health information. Conclusion: The designed...
Axson, Sydney A; Giordano, Nicholas A; Hermann, Robin M; Ulrich, Connie M
Informed consent is fundamental to the autonomous decision-making of patients, yet much is still unknown about the process in the clinical setting. In an evolving healthcare landscape, nurses must be prepared to address patient understanding and participate in the informed consent process to better fulfill their well-established role as patient advocates. This study examines hospital-based nurses' experiences and understandings of the informed consent process. This qualitative descriptive study utilized a semi-structured interview approach identifying thematic concerns, experiences, and knowledge of informed consent across a selected population of clinically practicing nurses. Participants and research context: In all, 20 baccalaureate prepared registered nurses practicing in various clinical settings (i.e. critical care, oncology, medical/surgical) at a large northeastern academic medical center in the United States completed semi-structured interviews and a demographic survey. The mean age of participants was 36.6 years old, with a mean of 12.2 years of clinical experience. Ethical considerations: Participation in this study involved minimal risk and no invasive measures. This study received Institutional Review Board approval from the University of Pennsylvania. All participants voluntarily consented. The majority of participants (N = 19) believe patient safety is directly linked to patient comprehension of the informed consent process. However, when asked if nurses have a defined role in the informed consent process, nearly half did not agree (N = 9). Through this qualitative approach, three major nursing roles emerged: the nurse as a communicator, the nurse as an advocate, and the clerical role of the nurse. This investigation contributes to the foundation of ethical research that will better prepare nurses for patient engagement, advance current understanding of informed consent, and allow for future development of solutions. Nurses are at the forefront of
A review of the safeguards of information technology, its current developments, and the status of safeguards in Member States is given, concerning especially the role of domestic safeguards in cooperation with IAEA Safeguards. A number of reports deal with declarations provided to the IAEA pursuant to Protocols Additional to safeguards agreements. The Information Section of the IAEA Safeguards Information Technology Division is responsible for the data entry, loading and quality control of State-supplied declarations. A software system is used to process information which should be readily accessible and usable in implementation of the strengthened safeguards system. Experiences in combating illegal trafficking of nuclear materials in a number of countries are included
Critchley, Frank; Dodson, Christopher
This book focuses on the application and development of information geometric methods in the analysis, classification and retrieval of images and signals. It provides introductory chapters to help those new to information geometry and applies the theory to several applications. This area has developed rapidly over recent years, propelled by the major theoretical developments in information geometry, efficient data and image acquisition and the desire to process and interpret large databases of digital information. The book addresses both the transfer of methodology to practitioners involved in database analysis and in its efficient computational implementation.
Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy
Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical, and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk assessment, prevention, identification, communication and mitigation of harm.
Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong
'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.
Bodatsch, Mitja; Klosterkötter, Joachim; Müller, Ralf; Ruhrmann, Stephan
The basic symptoms (BS) approach provides a valid instrument in predicting psychosis onset and moreover represents a significant heuristic framework for research. The term "basic symptoms" denotes subtle changes of cognition and perception in the earliest and prodromal stages of psychosis development. BS are thought to correspond to disturbances of neural information processing. Following the heuristic implications of the BS approach, the present paper aims at exploring disturbances of information processing, revealed by functional magnetic resonance imaging (fMRI) and electroencephalography, as characteristics of the at-risk state of psychosis. Furthermore, since high-risk studies employing ultra-high-risk criteria revealed non-conversion rates commonly exceeding 50%, thus warranting approaches that increase specificity, the potential contribution of neural information processing disturbances to psychosis prediction is reviewed. In summary, the at-risk state seems to be associated with information processing disturbances. Moreover, fMRI investigations suggested that disturbances of language processing domains might be a characteristic of the prodromal state. Neurophysiological studies revealed that disturbances of sensory processing may assist psychosis prediction in allowing for a quantification of risk in terms of magnitude and time. The latter finding represents a significant advancement since an estimation of the time to event has not yet been achieved by clinical approaches. Some evidence suggests a close relationship between self-experienced BS and neural information processing. With regard to future research, the relationship between neural information processing disturbances and different clinical risk concepts warrants further investigations. Thereby, a possible time sequence in the prodromal phase might be of particular interest.
Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near-optimally and quite reliably, even though the sources of signals are highly variable and complex. The work of the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can instead be identified using coarse-graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse-graining approaches identify features that are essential for certain processes performed by the underlying biological networks. We find that long-range connections in the brain allow for global-scale feature detection in a signal. They also suppress noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with different temporal scales. Our observations indicate that the rules of multivariate signal processing are quite different from those of traditional single-unit signal processing.
Dumas, M.; van der Aalst, W.M.P.; ter Hofstede, A.H.M.
A unifying foundation to design and implement process-aware information systems. This publication takes on the formidable task of establishing a unifying foundation and a set of common underlying principles to effectively model, design, and implement process-aware information systems. Authored by
Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D
Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
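The architecture claimed above can be pictured concretely: working memory holds typed "abstractions", and each reasoning module subscribes to one classification type from the ontology. The following is a hypothetical sketch of that dispatch pattern; all class names, individuals, and rules are illustrative, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Abstraction:
    individual: str          # the ontology individual this abstraction wraps
    classification: str      # its classification type in the ontology

@dataclass
class WorkingMemory:
    abstractions: list = field(default_factory=list)  # the semantic graph's nodes

class ReasoningModule:
    """Processes only abstractions of one classification type."""
    def __init__(self, handled_type, rule):
        self.handled_type = handled_type   # type this module is configured for
        self.rule = rule                   # function applied to matching abstractions

    def process(self, memory):
        return [self.rule(a) for a in memory.abstractions
                if a.classification == self.handled_type]

wm = WorkingMemory([Abstraction("alice", "Person"),
                    Abstraction("login_event", "Event")])
person_module = ReasoningModule("Person", lambda a: f"profile({a.individual})")
event_module = ReasoningModule("Event", lambda a: f"correlate({a.individual})")

print(person_module.process(wm))   # each module sees only its own type
print(event_module.process(wm))
```

The point of the pattern is that the first and second modules operate on disjoint classification types of the same shared semantic graph, as the claim describes.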
Miculescu Marius Nicolae
This article deals with business information: managers need to obtain accounting information that is relevant, reliable, clear and accurate, at the lowest cost, in order to optimize decision making. This need derives from the current economic environment. The survival of organizations in a competitive environment, to which they must adapt, is conditioned by obtaining accounting information that is qualitative, opportune and vital, in a short time. This information concerns the patrimony, analytical results, the market (dynamics, dimensions and structure) and relationships with business partners, competitors and suppliers. Enterprises therefore focus more intensely on the quality of accounting information. Defining the quality of accounting information, however, goes beyond the boundaries and features of the accounting communication process and aims to determine "quality criteria" or "qualitative characteristics" in order to develop a measurement tool. Note that a review of the literature found that, across accounting standardization and doctrine, the criteria for defining the quality of accounting information are not identical; their selection and ranking differ. Theory and practice also identify the fact that information in itself is worthless; instead, it becomes valuable once it is used in a decisional process. Thus, the economic value of accounting information depends on the earnings obtained after making a decision, diminished by the cost of the information. More specifically, it depends on the decision table or the implemented decision tree, on the informational cost and on the optimality condition established by the decision maker (given that producing accounting information implies costs, which are often considerable, while profits arise only from actions). There is also the problem of convergence between the content and the interpretation of the information sent to users, and the requirement that the information be intelligible. In this case, those who use it, the users, should have sufficient
Hutchison, Catherine; McCreaddie, May
The aim of this project was to produce audiovisual patient information that was user-friendly and fit for purpose. The purpose of the audiovisual patient information is to inform patients about randomized controlled trials, as a supplement to their trial-specific written information sheet. Audiovisual patient information is known to be an effective way of informing patients about treatment. User involvement is also recognized as being important in the development of service provision. The aim of this paper is (i) to describe and discuss the process of developing the audiovisual patient information and (ii) to highlight the challenges and opportunities, thereby identifying implications for practice. A future study will test the effectiveness of the audiovisual patient information in the cancer clinical trial setting. An advisory group was set up to oversee the project and provide guidance in relation to information content, level and delivery. An expert panel of two patients provided additional guidance, and a dedicated operational team dealt with the logistics of the project, including ethics, finance, scriptwriting, filming, editing and intellectual property rights. Challenges included the limitations of filming in a busy clinical environment, restricted technical and financial resources, ethical needs and issues around copyright. There were, however, substantial opportunities, which included utilizing creative skills, meaningfully involving patients, teamworking and mutual appreciation of clinical, multidisciplinary and technical expertise. Developing audiovisual patient information is an important area for nurses to be involved with. However, this must be performed within the context of the multiprofessional team. Teamworking, including patient involvement, is crucial, as a wide variety of expertise is required. Many aspects of the process are transferable and will provide information and guidance for nurses, regardless of specialty, considering developing this
Hasson, Uri; Chen, Janice; Honey, Christopher J.
Models of working memory commonly focus on how information is encoded into and retrieved from storage at specific moments. However, in the majority of real-life processes, past information is used continuously to process incoming information across multiple timescales. Considering single unit, electrocorticography, and functional imaging data, we argue that (i) virtually all cortical circuits can accumulate information over time, and (ii) the timescales of accumulation vary hierarchically, from early sensory areas with short processing timescales (tens to hundreds of milliseconds) to higher-order areas with long processing timescales (many seconds to minutes). In this hierarchical systems perspective, memory is not restricted to a few localized stores, but is intrinsic to information processing that unfolds throughout the brain on multiple timescales. "The present contains nothing more than the past, and what is found in the effect was already in the cause." (Henri L. Bergson) PMID:25980649
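The hierarchy of accumulation timescales described above can be illustrated with a toy model (not from the paper): a chain of leaky integrators whose time constants grow from "sensory" to "higher-order" stages, so later stages average over longer stretches of the past and fluctuate less.

```python
import numpy as np

def leaky_accumulate(signal, tau, dt=0.001):
    """First-order leaky integration: dy/dt = (-y + x) / tau."""
    y = np.zeros_like(signal)
    for t in range(1, len(signal)):
        y[t] = y[t - 1] + dt * (-y[t - 1] + signal[t]) / tau
    return y

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)        # 5 s of noisy input at an assumed 1 kHz
taus = [0.05, 0.5, 5.0]              # 50 ms "sensory" to 5 s "higher-order"
responses = [leaky_accumulate(x, tau) for tau in taus]

# Longer time constants integrate more of the past, so output variance
# should shrink up the hierarchy.
print([round(float(np.var(r)), 5) for r in responses])
```

This is only a caricature of the argument, but it captures the key claim: the same input is represented at progressively longer processing timescales, with no dedicated memory "store" anywhere in the chain.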
Zhang, M L; Xu, E H; Dong, H Q; Zhang, J W
To study the information processing speed and its influential factors in multiple sclerosis (MS) patients. A total of 36 patients with relapsing-remitting MS (RRMS), 21 patients with secondary progressive MS (SPMS), and 50 healthy control subjects from Xuanwu Hospital of Capital Medical University between April 2010 and April 2012 were included in this cross-sectional study. Neuropsychological tests were conducted after the disease had been stable for 8 weeks, covering information processing speed, memory, executive functions, language and visual perception. Correlations between information processing speed and depression, fatigue, and the Expanded Disability Status Scale (EDSS) were studied. (1) MS patient groups demonstrated cognitive deficits compared to healthy controls. Scores on the Symbol Digit Modalities Test (SDMT) differed significantly across groups (control group 57±12; RRMS group 46±17; SPMS group 35±10). Impairments of information processing speed, verbal memory and executive functioning are seen in MS patients, especially in the SPMS subtype, while visual-spatial function is relatively preserved. Age, white matter change scales, EDSS scores and depression are negatively associated with information processing speed.
Beer, Randall D; Williams, Paul L
There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.
Py, J-Y; Sandid, I; Jbilou, S; Dupuis, M; Adda, R; Narbey, D; Djoudi, R
Postdonation information is information about a donor or a donation that comes to light after the donation and that calls into question the quality or safety of the blood products stemming from this or other donations. The classical hemovigilance sub-processes concerning donor or recipient adverse events do not cover this topic. France is just about to make it official as a fourth sub-process. A less formal management of postdonation information has already been in place for more than ten years. French data for the year 2013 are presented, covering both the regional notification level and the national reporting level. A significant level of heterogeneity is observed, as for other hemovigilance sub-processes. It is mainly due to subjective rather than objective differences in risk appreciation. A real consensual effort on this topic is expected in the future. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
How does the central nervous system process information? Current theories are based on two tenets: (a) information is transmitted by action potentials, the language by which neurons communicate with each other; and (b) homogeneous neuronal assemblies of cortical circuits operate on these neuronal messages, where the operations are characterized by the intrinsic connectivity among neuronal populations. In this view, the size and time course of any spike is stereotypic, and the information is restricted to the temporal sequence of the spikes, namely, the "neural code". However, an increasing amount of novel data points towards an alternative hypothesis: (a) the role of the neural code in information processing is overemphasized. Instead of simply passing messages, action potentials play a role in dynamic coordination at multiple spatial and temporal scales, establishing network interactions across several levels of a hierarchical modular architecture, and modulating and regulating the propagation of neuronal messages. (b) Information is processed at all levels of the neuronal infrastructure, from macromolecules to population dynamics. For example, intra-neuronal factors (changes in protein conformation, concentration and synthesis) and extra-neuronal factors (extracellular proteolysis, substrate patterning, myelin plasticity, microbes, metabolic status) can have a profound effect on neuronal computations. This means molecular message passing may have cognitive connotations. This essay introduces the concept of "supramolecular chemistry", involving the storage of information at the molecular level and its retrieval, transfer and processing at the supramolecular level, through transitory non-covalent molecular processes that are self-organized, self-assembled and dynamic. Finally, we note that the cortex comprises extremely heterogeneous cells, with distinct regional variations, macromolecular assembly, receptor repertoire and intrinsic microcircuitry. This suggests that every neuron (or group of
Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J
Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were recorded, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements, and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively), the best processing techniques were RMS, AMV and VAR in static contractions, and only RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, they can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
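The four windowed features named in the abstract have simple closed forms, sketched below on a synthetic "EMG-like" signal. The 200 ms window follows the reported optimum for static contractions; the 1 kHz sampling rate is an assumption for illustration, not a figure from the paper.

```python
import numpy as np

def features(window):
    """The four EMG features evaluated in the paper, per analysis window."""
    return {
        "AMV": float(np.mean(np.abs(window))),            # absolute mean value
        "RMS": float(np.sqrt(np.mean(window ** 2))),      # root mean square
        "VAR": float(np.var(window)),                     # variance
        "DAMV": float(np.mean(np.abs(np.diff(window)))),  # difference absolute mean value
    }

fs = 1000                                 # assumed sampling rate, Hz
rng = np.random.default_rng(1)
emg = rng.standard_normal(fs) * 0.1       # 1 s of zero-mean noise as a stand-in
win = emg[:fs * 200 // 1000]              # one 200 ms analysis window

for name, value in features(win).items():
    print(f"{name}: {value:.4f}")
```

In the paper's framing, each feature is then scored by how much information it preserves about the underlying contraction; the code above only shows the feature extraction step.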
Gómez, Jaime; Salazar, Idana; Vargas, Pilar
In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieving innovation performance. These results are in line with the open innovation literature because they show that firms that are ...
Fischer, Peter; Fischer, Julia; Weisweiler, Silke; Frey, Dieter
We investigated whether different modes of decision making (deliberate, intuitive, distracted) affect subsequent confirmatory processing of decision-consistent and inconsistent information. Participants showed higher levels of confirmatory information processing when they made a deliberate or an intuitive decision versus a decision under distraction (Studies 1 and 2). As soon as participants have a cognitive (i.e., deliberate cognitive analysis) or affective (i.e., intuitive and gut feeling) reason for their decision, the subjective confidence in the validity of their decision increases, which results in increased levels of confirmatory information processing (Study 2). In contrast, when participants are distracted during decision making, they are less certain about the validity of their decision and thus are subsequently more balanced in the processing of decision-relevant information.
Cao, Yuansheng; Gong, Zongping; Quan, H T
Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012)] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013)], we propose a minimal model of the information pump and the information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information to be encoded in the bit stream, or (partially) erase the information initially encoded in the bit stream by consuming Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and the power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.
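The tradeoff the abstract describes is constrained by a generalized second law that is standard for such bit-stream machines (stated here as background, not quoted from the paper): the chemical work delivered by pumping cannot exceed $k_BT$ times the entropy written into the bit stream,

```latex
\Delta\mu \, n_{\mathrm{pumped}} \;\le\; k_B T \, \Delta H_{\mathrm{stream}},
```

where $\Delta\mu$ is the chemical potential difference, $n_{\mathrm{pumped}}$ the number of molecules moved against it, and $\Delta H_{\mathrm{stream}}$ the increase in Shannon entropy of the outgoing bits. In the eraser regime $\Delta H_{\mathrm{stream}} < 0$, and the inequality shows the erasure must be paid for by consuming Gibbs free energy.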
Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R
Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.
The aim of the research was to establish the meaning of colors and geometric shapes in transmitting information in the work process. A sample of 100 students connected 50 situations that could be associated with regular tasks in the work process with 12 colors and 4 geometric forms in a previously chosen color. Based on the chosen color-geometric shape-situation relation, the idea of the research was to find regularities in the coding of information and to examine whether those regularities can provide meaningful data assigned to each individual code, and to explain which codes are better and more applicable representatives of the examined situations.
Damos, D. L.
This paper describes several experiments examining the source of individual differences in the experience of mental workload. Three sources of such differences were examined: information processing abilities, timesharing abilities, and personality traits/behavior patterns. On the whole, there was little evidence that individual differences in information processing abilities or timesharing abilities are related to perceived differences in mental workload. However, individuals with strong Type A coronary prone behavior patterns differed in both single- and multiple-task performance from individuals who showed little evidence of such a pattern. Additionally, individuals with a strong Type A pattern showed some dissociation between objective performance and the experience of mental workload.
Learning to rank refers to machine learning techniques for training a model in a ranking task. Learning to rank is useful for many applications in information retrieval, natural language processing, and data mining. Intensive studies have been conducted on its problems recently, and significant progress has been made. This lecture gives an introduction to the area, including the fundamental problems, major approaches, theories, applications, and future work. The author begins by showing that various ranking problems in information retrieval and natural language processing can be formalized as tw
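A minimal pairwise learning-to-rank sketch can make the task formulation concrete. This is illustrative code, not from the lecture: it learns a linear scoring function w·x so that documents with higher relevance score higher, via perceptron-style updates on mis-ordered pairs, on synthetic features and labels.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])          # hidden "ideal" scorer
X = rng.standard_normal((200, 3))            # document feature vectors
relevance = X @ true_w                       # graded relevance labels

w = np.zeros(3)
for _ in range(500):                         # sample training pairs
    i, j = rng.integers(0, len(X), size=2)
    if relevance[i] == relevance[j]:
        continue
    better, worse = (i, j) if relevance[i] > relevance[j] else (j, i)
    if X[better] @ w <= X[worse] @ w:        # mis-ordered pair: update
        w += X[better] - X[worse]

# Fraction of random test pairs that the learned scorer orders correctly.
pairs = rng.integers(0, len(X), size=(1000, 2))
scores = X @ w
agreement = float(np.mean((scores[pairs[:, 0]] - scores[pairs[:, 1]])
                          * (relevance[pairs[:, 0]] - relevance[pairs[:, 1]]) > 0))
print(round(agreement, 3))
```

The pairwise reduction shown here is one of the standard approaches covered in learning-to-rank surveys; pointwise and listwise formulations change the loss but keep the same train-a-scorer structure.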
The ability to encode and maintain the exact order of short sequences of stimuli or events is often crucial to our capacity for effective high-order planning. However, it is not yet clear which neural mechanisms underpin this process. Several studies suggest that, in comparison with item recognition, temporal order coding activates prefrontal and parietal brain regions. The results of various studies tend to favour the hypothesis that the order of the stimuli is represented and encoded in several stages, from primacy and recency estimates to the exact position of the item in a sequence. Different brain regions play different roles in this process. The dorsolateral prefrontal cortex has a more general role in attention, while the premotor cortex is more involved in the process of information grouping. The parietal lobe and hippocampus also play a significant role in order processing, as they enable the representation of distance. Moreover, order maintenance is associated with the existence of neural oscillators that operate at different frequencies. Electrophysiological studies revealed that theta and alpha oscillations play an important role in the maintenance of temporal order information. These EEG oscillations are differentially associated with processes that support the maintenance of order information and item recognition. Various studies suggest a link between prefrontal areas and memory for temporal order, implying that EEG neural oscillations in the prefrontal cortex may play a role in the maintenance of information on temporal order.
Samadbeik, Mahnaz; Shahrokhi, Nafiseh; Saremian, Marzieh; Garavand, Ali; Birjandi, Mahdi
In recent years, information technology has been introduced in the nursing departments of many hospitals to support their daily tasks. Nurses are the largest end-user group of Hospital Information Systems (HISs). This study was designed to evaluate data processing in the Nursing Information Systems (NISs) utilized in university hospitals in Iran. This was a cross-sectional study. The population comprised all nurse managers and NIS users of the five training hospitals in Khorramabad city (N = 71). The nursing subset of the HIS-Monitor questionnaire was used to collect the data. Data were analyzed by the descriptive-analytical method and inductive content analysis. The results indicated that the nurses participating in the study did not take desirable advantage of paper-based (2.02) or computerized (2.34) information processing tools to perform nursing tasks. Moreover, the less work experience nurses have, the more they utilize computer tools for processing patient discharge information. The "readability of patient information" and "repetitive and time-consuming documentation" were stated by the participating nurses as, respectively, the most important expectation and problem regarding the HIS. The nurses participating in the present study utilized paper and computerized information processing tools together to perform nursing practices. Therefore, it is recommended that nursing process redesign coincide with NIS implementation in health care centers.
Belew, W.W.; Wood, B.L.; Marle, T.L.; Reinhardt, C.L.
The results of a series of telephone interviews with groups of users of information on solar industrial and agricultural process heat (IAPH) are described. These results, part of a larger study on many different solar technologies, identify types of information each group needed and the best ways to get information to each group. In the current study only high-priority groups were examined. Results from 10 IAPH groups of respondents are analyzed in this report: IPH Researchers; APH Researchers; Representatives of Manufacturers of Concentrating and Nonconcentrating Collectors; Plant, Industrial, and Agricultural Engineers; Educators; Representatives of State Agricultural Offices; and County Extension Agents.
Duncan, Lauren E; Peterson, Bill E
Intolerance of ambiguity and cognitive rigidity are unifying aspects of authoritarianism as defined by Adorno, Frenkel-Brunswik, Levinson, and Sanford (1982/1950), who hypothesized that authoritarians view the world in absolute terms (e.g., good or evil). Past studies have documented the relationship between authoritarianism and intolerance of ambiguity and rigidity. Frenkel-Brunswik (1949) hypothesized that this desire for absolutism was rooted in perceptual processes. We present a study with three samples that directly tests the relationship between right wing authoritarianism (RWA) and the processing of ideologically neutral but ambiguous visual stimuli. As hypothesized, in all three samples we found that RWA was related to the slower processing of visual information that required participants to recategorize objects. In a fourth sample, RWA was unrelated to speed of processing visual information that did not require recategorization. Overall, results suggest a relationship between RWA and rigidity in categorization.
Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows up a previous interview study…
The information technology outsourcing relationship is one of the key issues for IT outsourcing success. To explore how to manage and promote IT outsourcing relationships, it is necessary to understand their evolution process. First, the types of IT outsourcing based on relationship quality and IT outsourcing project level are analyzed; second, two evolution process models of the IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT ou...
Dixit, Abhinav; Goyal, Abhishek; Thawani, Rajat; Vaney, Neelam
Background: Caffeine is a psychostimulant present in various beverages and known to alter alertness and performance by acting on the central nervous system. Its effects on the central nervous system have been studied using EEG, evoked potentials, fMRI, and neuropsychological tests. The Stroop task is a widely used tool in psychophysiology for understanding attention processes and is based on the principle that processing of two different kinds of information (like the word or colour) is parallel ...
Mesároš, P.; Mandičák, T.
The article describes the options for the management of material flows in the construction process. Management and resource planning is one of the key factors influencing the effectiveness of a construction project. It is very difficult to set these flows correctly. The current period offers several options and tools for doing this. Information systems and their modules can be used for the management of materials in the construction process.
Benchmarking has traditionally been viewed as a way to compare data only; however, its utilisation as a more investigative, research-informed process to add rigor to decision-making processes at the institutional level is gaining momentum in the higher education sector. Indeed, with recent changes in the Australian quality environment from the…
In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems under the laws of quantum mechanics is more restrictive than that of classical systems, being identified with a specific form of dynamics: unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest, as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, i.e. positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum design in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on properties of structural physical approximations and their applications toward practical quantum information applications.
Jepma, M.; Wagenmakers, E.-J.; Nieuwenhuis, S.
People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information
de Dreu, C.K.W.; Nijstad, B.A.; Bechtoldt, M.N.; Baas, M.
The authors review the Motivated Information Processing in Groups Model (De Dreu, Nijstad, & Van Knippenberg, 2008) to understand group creativity and innovation. Although distinct phenomena, group creativity and innovation are both considered a function of epistemic motivation (EM; the degree to
Goodman, Jodi S.; Wood, Robert E.; Chen, Zheng
This study examines the effects of feedback specificity on transfer of training and the mechanisms through which feedback can enhance or inhibit transfer. We used concurrent verbal protocol methodology to elicit and operationalize the explicit information processing activities used by 48 trainees performing the Furniture Factory computer…
Whitney, Jane; Joormann, Jutta; Gotlib, Ian H.; Kelley, Ryan G.; Acquaye, Tenah; Howe, Meghan; Chang, Kiki D.; Singh, Manpreet K.
Background: Cognitive models of bipolar I disorder (BD) may aid in identification of children who are especially vulnerable to chronic mood dysregulation. Information-processing biases related to memory and attention likely play a role in the development and persistence of BD among adolescents; however, these biases have not been extensively…
Bier, H.H.; Mostafavi, S.
Hyperbody’s materially informed Design-to-Robotic-Production (D2RP) processes for additive and subtractive manufacturing aim to achieve performative porosity in architecture at various scales. An extended series of D2RP experiments aiming to produce prototypes at 1:1 scale wherein design materiality
Slade, Peter D.; Onion, Carl W. R.
The current emphasis in medical education is on engaging learners in deep processing of information to achieve better understanding of the subject matter. Traditional approaches aimed for memorization of medical facts; however, a good memory for medical facts is still essential in clinical practice. This study demonstrates that deep information…
Leoni, de M.; Marrella, A.; Russo, A.; Cezon, M.; Wolfsthal, Y.
Nowadays, Process-aware Information Systems (PAISs) are widely used in many business scenarios, e.g., by government agencies, by insurance companies, and by banks. Despite this widespread usage, the typical application of such systems is predominantly in the context of business scenarios.
This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.
Lists some of the better-known and more accessible books on the history of computing and information processing, covering: (1) popular general works; (2) more technical general works; (3) microelectronics and computing; (4) artificial intelligence and robotics; (5) works relating to Charles Babbage; (6) other biographical and personal accounts;…
Reardon, Robert C.; Lenz, Janet G.; Sampson, James P., Jr.; Peterson, Gary W.
This article draws upon the authors' experience in developing cognitive information processing theory in order to examine three important questions facing vocational psychology and assessment: (a) Where should new knowledge for vocational psychology come from? (b) How do career theories and research find their way into practice? and (c) What is…
Pedersen, Jesper; Flindt, Christian; Mortensen, Niels Asger
We propose a new physical implementation of spin qubits for quantum information processing, namely defect states in antidot lattices defined in the two-dimensional electron gas at a semiconductor heterostructure. Calculations of the band structure of the periodic antidot lattice are presented...
Tangen, Jodi L.; Borders, L. DiAnne
Although clinical supervision is an educational endeavor (Borders & Brown, 2005), many scholars neglect theories of learning in working with supervisees. The authors describe 1 learning theory--information processing theory (Atkinson & Shiffrin, 1968, 1971; Schunk, 2016)--and the ways its associated interventions may…
A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning, which are not covered by the model, are discussed in relationship to it: student's intellectual development and effects of individual…
Dillon, Ronna F.
Undergraduates were given complex figural analogies items, and eye movements were observed under three types of feedback: (1) elaborate feedback; (2) subjects verbalized their thinking and application of rules; and (3) no feedback. Both feedback conditions enhanced the rule-governed information processing during inductive reasoning. (Author/GDC)
Smoliar, Stephen W.
This review of the book, "Conceptual Structures: Information Processing in Mind and Machine," by John F. Sowa, argues that anyone who plans to get involved with issues of knowledge representation should have at least a passing acquaintance with Sowa's conceptual graphs for a database interface. (Used to model the underlying semantics of…
The aim of this study was to examine whether age-related changes in the speed of information processing are the best predictors of the increase in sensitivity to time throughout childhood. Children aged 5 and 8 years old, as well as adults, were given two temporal bisection tasks, one with short (0.5/1 s) and the other with longer (4/8 s) anchor durations. In addition, the participants' scores on different neuropsychological tests assessing both information processing speed and other dimensions of cognitive control (short-term memory, working memory, selective attention) were calculated. The results showed that the best predictor of individual variance in sensitivity to time was information processing speed, although working memory also accounted for some of the individual differences in time sensitivity, albeit to a lesser extent. In sum, the faster the information processing speed of the participants, the higher their sensitivity to time. These results are discussed in the light of the idea that the development of temporal capacities has its roots in the maturation of the dynamic functioning of the brain.
Tremayne, Mark; Dunwoody, Sharon
Examines the role of interactivity in the presentation of science news on the World Wide Web. Proposes and tests a model of interactive information processing that suggests that characteristics of users and Web sites influence interactivity, which influences knowledge acquisition. Describes use of a think-aloud method to study participants' mental…
Johann M. Schepers
The primary goal of the study was to construct a computerised information-processing test battery to measure choice reaction time for up to and including six bits of information, to measure discrimination reaction time with regard to colour patterns and form patterns, to measure the rate of information processing with regard to perceptual stimuli and conceptual reasoning, and to develop a suitable scoring system for the respective tests. The battery of tests was applied to 58 pilots.
Re-entrant feedback, either within sensory cortex or arising from prefrontal areas, has been strongly linked to the emergence of consciousness, both in theoretical and experimental work. This idea, together with evidence for local micro-consciousness, suggests the generation of qualia could in some way result from local network activity under re-entrant activation. This paper explores the possibility by examining the processing of information by local cortical networks. It highlights the difference between the information structure (how the information is physically embodied), and the information message (what the information is about). It focuses on the network's ability to recognize information structures amongst its inputs under conditions of extensive local feedback, and to then assign information messages to those structures. It is shown that if the re-entrant feedback enables the network to achieve an attractor state, then the message assigned in any given pass of information through the network is a representation of the message assigned in the previous pass-through of information. Based on this ability the paper argues that as information is repeatedly cycled through the network, the information message that is assigned evolves from a recognition of what the input structure is, to what it is like, to how it appears, to how it seems. It could enable individual networks to be the site of qualia generation. The paper goes on to show networks in cortical layers 2/3 and 5a have the connectivity required for the behavior proposed, and reviews some evidence for a link between such local cortical cyclic activity and conscious percepts. It concludes with some predictions based on the theory discussed.
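The attractor behavior described above, in which a local network repeatedly cycles information through re-entrant feedback until it settles on a recognized structure, has a classical toy analogue in a Hopfield network. The sketch below is only that: a loose illustrative analogue under assumptions of the sketch (a single stored pattern, Hebbian weights, sequential sign updates), not the layer-2/3 and layer-5a circuitry the paper proposes.

```python
def train(patterns):
    """Hebbian weights for a Hopfield network storing +/-1 patterns."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, sweeps=10):
    """Repeatedly cycle the state through the network (re-entrant
    feedback) until it settles into an attractor."""
    n = len(state)
    s = state[:]
    for _ in range(sweeps):
        for i in range(n):
            h = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

stored = [1, 1, 1, 1, -1, -1, -1, -1]      # an 'information structure'
w = train([stored])
noisy = stored[:]
noisy[0], noisy[5] = -noisy[0], -noisy[5]  # corrupt two bits
print(recall(w, noisy) == stored)          # -> True: structure recognized
```

Starting from a corrupted input, each sweep re-feeds the network's own output back as input; after a few such re-entrant cycles the state lands exactly on the stored pattern, i.e. the input structure is "recognized".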
Li, Yu; Oberweis, Andreas
Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
Yukalov, V. I.; Yukalova, E. P.; Sornette, D.
We suggest a model of a multi-agent society of decision makers who make decisions based on two criteria: the utility of the prospects and the attractiveness of the considered prospects. The model generalizes quantum decision theory, developed earlier for single decision makers making one-step decisions, in two principal respects. First, several decision makers are considered simultaneously, who interact with each other through information exchange. Second, a multistep procedure is treated, in which the agents exchange information many times. Several decision makers exchanging information and forming their judgments using quantum rules form a kind of quantum information network, where collective decisions develop in time as a result of information exchange. In addition to characterizing collective decisions that arise in human societies, such networks can describe dynamical processes occurring in an artificial quantum intelligence composed of several parts or in a cluster of quantum computers. The practical use of the theory is illustrated by the dynamic disjunction effect, for which three quantitative predictions are made: (i) the probabilistic behavior of decision makers at the initial stage of the process is described; (ii) the decrease of the difference between the initial prospect probabilities and the related utility factors is proved; (iii) the existence of a common consensus after multiple exchanges of information is predicted. The predicted numerical values are in very good agreement with empirical data.
development of the company. Accounting information is the raw material for the decisional process. Accounting decisions lead to elaborating all the other types of decisions that help achieve the main goal of management: continuing the activity. In our opinion, the informational resources of an entity can be part of its assets, economic information being a precious resource. If used efficiently, it can serve to achieve the entity's goals. Any business idea, or the design of a new concept or transaction, has one thing as its basis: information.
Heimbuerger, H.; Kautto, A.; Norros, L.; Ranta, J.
One of the proposed solutions to the man-process interface problem in nuclear power plants is the integration of a system in the control room that can provide the operator with a display of a minimum set of critical plant parameters defining the safety status of the plant. Such a system was experimentally validated using the Loviisa training simulator during the fall of 1982. The project was a joint effort between Combustion Engineering Inc., the Halden Reactor Project, Imatran Voima Oy and VTT. Alarm systems are used in nuclear power plants to tell the control room operators that an unexpected change in the plant operation state has occurred. One difficulty in using the alarms for checking the actions of the operator is that the conventional way of realizing alarm systems implies that several alarms are active also during normal operation. The coding and representation of alarm information are discussed in the paper. An important trend in control room design is the move away from direct, concrete indication of process parameters towards the use of more abstract/logical representation of information as a basis for plant supervision. Recent advances in computer graphics make it possible that, in the future, visual information will be utilized to make the essential dynamics of the process more intelligible. A set of criteria for the use of visual information will be necessary. The paper discusses practical aspects of realising such criteria in the context of a nuclear power plant. The criteria for the decomposition of process information concerning the sub-goals of safety and availability, as well as tentative results of the conceptualization of a PWR process, are also discussed.
Shirley Guimarães Pimenta
The interaction amongst the 'user', 'information', and 'text' is of interest to Information Science, although it has received insufficient attention in the literature. This issue is addressed by this paper, whose main purpose is to contribute to the discussion of the theoretical affinity between the cognitive viewpoint in Information Science and the information processing approach in Cognitive Psychology. Firstly, the interdisciplinary nature of Information Science is discussed and justified as a means to deepen and strengthen its theoretical framework. Such interdisciplinarity helps to avoid stagnation and keep pace with other disciplines. Secondly, the discussion takes into consideration the cognitive paradigm, which originates the cognitive viewpoint approach in Information Science. It is highlighted that the cognitive paradigm represented a change in the Social Sciences due to the shift of focus from the object and the signal to the individual. Besides that, it sheds light on the notion of models of worlds, i.e., the systems of categories and concepts that guide the interaction between the individual and his/her environment. Thirdly, the theoretical assumptions of the cognitive viewpoint approach are discussed, with emphasis on the concept of 'information' as resulting from cognitive processes and as related to the notion of 'text'. This approach points out the relevance of understanding the interaction amongst users, information, and text. However, it lacks further development. Using notions which are common to both approaches, some of the gaps can be filled. Finally, the concept of 'text', its constituents and structures are presented from the perspective of text comprehension models and according to the information processing approach. As a concluding remark, it is suggested that bringing together the cognitive viewpoint and the information processing approach can be enriching and fruitful to both Information
Di Paola, R.; Todd-Pokropek, A.E.; CEA, 91 - Orsay
Processing of scintigraphic data has passed through different stages in the past fifteen years. After a 'euphoric' era when large off-line computer facilities were used to process very low-quality rectilinear scan pictures, a much more critical period followed the introduction of on-line minicomputer systems to acquire, process and visualize scintillation camera data. A selection of some of the available techniques that could improve the extraction of information from scintigraphic examinations in routine use is presented. Tomography has been excluded. As examples, the different techniques of (a) inhomogeneity correction of camera response and (b) respiratory motion correction are used to show one evolutionary process in the use of computer systems. Filtering has long been the major area of research in scintigraphic image processing. Only very simple (usually smoothing) filters are widely distributed. Little use of more 'powerful' filters on clinical data has been made, and very few serious evaluations have been published. Nevertheless, the number of installed minicomputer and microprocessor systems is increasing rapidly, though in general performing tasks other than filtering. The reasons for this (relative) failure are examined. Some 'new' techniques of image processing are presented. The compression of scintigraphic information is important because of the expected need in the near future to handle large numbers of static pictures, as in dynamic and tomographic studies. For dynamic information processing, the present methodology has been narrowed to those techniques that permit the entire 'data space' to be manipulated (as opposed to curve fitting after region-of-interest definition). 'Functional' imaging was the first step in this process. 'Factor analysis' could be the next. The results obtained by various research laboratories are reviewed. (author)
Jun, Liu; Qiong, Wang; Le-Man, Kuang; Hao-Sheng, Zeng
We propose a scheme to engineer a non-local two-qubit phase gate between two remote quantum-dot spins. Along with one-qubit local operations, one can in principle perform various types of distributed quantum information processing. The scheme employs a linearly polarised photon interacting one after the other with two remote quantum-dot spins in cavities. Due to the optical spin selection rule, the photon acquires a Faraday rotation after the interaction process. By measuring the polarisation of the final output photon, a non-local two-qubit phase gate between the two remote quantum-dot spins is constituted. Our scheme may have important applications in distributed quantum information processing.
A review of the safeguards of information technology, its current developments and the status of safeguards in Member States is presented, concerning especially the role of domestic safeguards in cooperation with IAEA safeguards. A number of reports deal with declarations provided to the IAEA pursuant to Protocols Additional to Safeguards agreements. The Information Section of the IAEA Safeguards Information Technology Division is responsible for the data entry, loading and quality control of State-supplied declarations. A software system is used to process information so that it is readily accessible and usable in implementation of the strengthened safeguards system. Experiences in combating illegal trafficking of nuclear materials in a number of countries are included.
Wang, Xin; Li, Weihua; Liu, Longzhao; Pei, Sen; Tang, Shaoting; Zheng, Zhiming
For information diffusion in multiplex networks, the effect of interlayer contagion on spreading dynamics has been explored in different settings. Nevertheless, the impact of interlayer recovery processes, i.e., the transition of nodes to stiflers in all layers after they become stiflers in any layer, still remains unclear. In this paper, we propose a modified ignorant-spreader-stifler model of rumor spreading equipped with an interlayer recovery mechanism. We find that the information diffusion can be effectively promoted for a range of interlayer recovery rates. By combining the mean-field approximation and the Markov chain approach, we derive the evolution equations of the diffusion process in two-layer homogeneous multiplex networks. The optimal interlayer recovery rate that achieves the maximal enhancement can be calculated by solving the equations numerically. In addition, we find that the promoting effect on a certain layer can be strengthened if information spreads more extensively within the counterpart layer. When applying the model to two-layer scale-free multiplex networks, with or without degree correlation, similar promoting effect is also observed in simulations. Our work indicates that the interlayer recovery process is beneficial to information diffusion in multiplex networks, which may have implications for designing efficient spreading strategies.
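The modified ignorant-spreader-stifler model with interlayer recovery can be illustrated with a toy Monte Carlo simulation. The sketch below is not the authors' implementation: the Erdos-Renyi layers, the parameter values, and the synchronous-update scheme are assumptions made for illustration, while the paper itself works with mean-field and Markov-chain equations.

```python
import random

I, S, R = 0, 1, 2  # ignorant, spreader, stifler

def simulate(n, p_edge, lam, delta, gamma, steps, seed=0):
    """Two-layer ignorant-spreader-stifler rumor model (illustrative sketch).

    lam   : spreading rate (spreader turns ignorant neighbours into spreaders)
    delta : intralayer recovery rate (spreader -> stifler)
    gamma : interlayer recovery rate, the mechanism the modified model adds
            (a node that is a stifler in one layer pulls its copy in the
            other layer towards the stifler state)
    """
    rng = random.Random(seed)
    # two independent Erdos-Renyi layers over the same node set (assumption)
    layers = []
    for _ in range(2):
        adj = [[] for _ in range(n)]
        for u in range(n):
            for v in range(u + 1, n):
                if rng.random() < p_edge:
                    adj[u].append(v)
                    adj[v].append(u)
        layers.append(adj)
    state = [[I] * n, [I] * n]
    state[0][0] = state[1][0] = S  # seed one spreader in both layers
    for _ in range(steps):
        new = [row[:] for row in state]
        for l in (0, 1):
            for u in range(n):
                if state[l][u] == S:
                    for v in layers[l][u]:
                        if state[l][v] == I and rng.random() < lam:
                            new[l][v] = S
                    if rng.random() < delta:
                        new[l][u] = R
                # interlayer recovery: a stifler copy in the other layer
                # converts this layer's copy to stifler with rate gamma
                if state[1 - l][u] == R and state[l][u] != R \
                        and rng.random() < gamma:
                    new[l][u] = R
        state = new
    return state

final = simulate(n=200, p_edge=0.05, lam=0.2, delta=0.1, gamma=0.3, steps=50)
reached = sum(s != I for s in final[0])
print(f"layer-0 nodes reached by the rumor: {reached} / 200")
```

Sweeping `gamma` in such a simulation is one simple way to look for the promoting range of interlayer recovery rates the paper reports; the optimal rate itself is obtained there by solving the derived evolution equations numerically.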
Yang, Lixia; Chen, Wenfeng; Ng, Andy H; Fu, Xiaolan
Literature on cross-cultural differences in cognition suggests that categorization, as an information processing and organization strategy, was more often used by Westerners than by East Asians, particularly for older adults. This study examines East-West cultural differences in memory for categorically processed items and sources in young and older Canadians and native Chinese with a conceptual source memory task (Experiment 1) and a reality monitoring task (Experiment 2). In Experiment 1, participants encoded photographic faces of their own ethnicity that were artificially categorized into GOOD or EVIL characters and then completed a source memory task in which they identified faces as old-GOOD, old-EVIL, or new. In Experiment 2, participants viewed a series of words, each followed either by a corresponding image (i.e., SEEN) or by a blank square within which they imagined an image for the word (i.e., IMAGINED). At test, they decided whether the test words were old-SEEN, old-IMAGINED, or new. In general, Canadians outperformed Chinese in memory for categorically processed information, an effect more pronounced for older than for young adults. Extensive exercise of culturally preferred categorization strategy differentially benefits Canadians and reduces their age group differences in memory for categorically processed information.
Li, K.; Wang, S.; Zeng, Z.; Wei, J.; Ren, Z. [China University of Mining and Technology, Xuzhou (China). Dept of Mining Engineering
Based on the concepts of geological statistics, mathematical programming, condition simulation and system engineering, and on the features and duties of each main department in surface mine production, an integrated system for surface mine production information was studied systematically and developed using data warehousing, CAD, object-oriented and system integration technology, which leads to the systematization and automation of information management, data processing, optimization computing and plotting. In this paper, its overall objective, system design, structure and functions, and some key techniques are described.
The thesis studies three important applications of random matrices to information processing. Our main contribution is that we consider probabilistic systems involving more general random matrix ensembles than the classical ensembles with iid entries, i.e. models that account for statistical dependence between the entries. Specifically, the involved matrices are invariant or fulfill a certain asymptotic freeness condition as their dimensions grow to infinity. Informally speaking, all latent variables contribute to the system model in a democratic fashion – there are no preferred latent variables...
A. V. Vilovatykh
The role of information in shaping the geopolitical situation needs greater academic attention, because the influence of external actors in favour of their political preferences forms the vectors of state development. The modern world's turbulence is based on technologies for managing social reality. This requires a principally new methodology for evaluating and forecasting geopolitical processes. Nowadays it makes sense to speak of an information factor in geopolitical dynamics, which shapes a new quality of geopolitical space.
Quantum Information Processing and Quantum Error Correction is a self-contained, tutorial-based introduction to quantum information, quantum computation, and quantum error-correction. Assuming no knowledge of quantum mechanics and written at an intuitive level suitable for the engineer, the book gives all the essential principles needed to design and implement quantum electronic and photonic circuits. Numerous examples from a wide area of application are given to show how the principles can be implemented in practice. This book is ideal for the electronics, photonics and computer engineer
In dynamical systems, local interactions between dynamical units generate correlations which are stored and transmitted throughout the system, generating the macroscopic behavior. However, a framework to quantify exactly how these correlations are stored, transmitted, and combined at the microscopic scale is missing. Here we propose to characterize the notion of "information processing" based on all possible Shannon mutual information quantities between a future state and all possible sets of initial states. We apply it to the 256 elementary cellular automata (ECA), which are the simplest possible dynamical systems exhibiting behaviors ranging from simple to complex. Our main finding is that only a few information features are needed for full predictability of the systemic behavior and that the "information synergy" feature is always the most predictive. Finally, we apply the idea to foreign exchange (FX) and interest-rate swap (IRS) time-series data. We find an effective "slowing down" leading indicator in all three markets for the 2008 financial crisis when applied to the information features, as opposed to using the data itself directly. Our work suggests that the proposed characterization of the local information processing of units may be a promising direction for predicting emergent systemic behaviors.
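As a loose illustration of the kind of computation involved, the sketch below simulates an elementary cellular automaton and estimates, from samples, the Shannon mutual information between one initial cell and a future cell. The rule number, lattice width, and time horizon are arbitrary choices for the demo; the paper's actual "information features" range over all sets of initial states, which this toy does not attempt.

```python
import math
import random

def eca_step(state, rule):
    """One synchronous update of an elementary cellular automaton
    (periodic boundary), rule given as a Wolfram number 0-255."""
    n = len(state)
    table = [(rule >> i) & 1 for i in range(8)]
    return [table[(state[(i - 1) % n] << 2) | (state[i] << 1)
                  | state[(i + 1) % n]]
            for i in range(n)]

def mutual_information(samples):
    """Plug-in estimate of I(X;Y) in bits from (x, y) observations."""
    n = len(samples)
    pxy, px, py = {}, {}, {}
    for x, y in samples:
        pxy[(x, y)] = pxy.get((x, y), 0) + 1
        px[x] = px.get(x, 0) + 1
        py[y] = py.get(y, 0) + 1
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        mi += p * math.log2(c * n / (px[x] * py[y]))
    return mi

rng = random.Random(0)
rule, width, t = 110, 11, 3   # illustrative choices, not from the paper
samples = []
for _ in range(2000):
    init = [rng.randint(0, 1) for _ in range(width)]
    s = init
    for _ in range(t):
        s = eca_step(s, rule)
    # MI between the centre cell's initial value and its value t steps later
    samples.append((init[width // 2], s[width // 2]))
print(f"I(initial centre cell; centre cell after {t} steps) ~= "
      f"{mutual_information(samples):.3f} bits")
```

For binary cells this estimate lies between 0 and 1 bit; repeating it for every subset of initial cells (rather than a single cell) is what turns such quantities into the feature set the paper analyses.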
We are witnessing a need for a quick and intelligent reaction from organizations to the level and speed of change in business processes. New information technologies and systems (IT/IS) are challenging business models and products. One of the great shake-ups comes from online and/or mobile apps and platforms. These are having a tremendous impact on the launch of innovative and competitive services through the combination of digital and physical features. This leads organizations to actively rethink the enterprise information systems portfolio, its management and suitability. One relevant way for enterprises to manage their IT/IS in order to cope with those challenges is enterprise and process architecture. A decision-making culture based on processes helps to understand and define the different elements that shape an organization and how those elements inter-relate inside and outside it. IT/IS portfolio management involves an increasing need to model data and process flows, for better discernment and action in portfolio selection and alignment with business goals. The new generation of enterprise architecture (NGEA) helps to design intelligent processes that answer quickly and creatively to new and challenging trends. It has to be open, agile and context-aware to allow well-designed services that match users' expectations. This study includes two real cases/problems to be solved quickly in companies, and solutions are presented in line with this architectural approach.
Gantz, Stephen D
The Basics of IT Audit: Purposes, Processes, and Practical Information provides you with a thorough, yet concise overview of IT auditing. Packed with specific examples, this book gives insight into the auditing process and explains regulations and standards such as the ISO-27000 series program, CoBIT, ITIL, Sarbanes-Oxley, and HIPAA. IT auditing occurs in some form in virtually every organization, private or public, large or small. The large number and wide variety of laws, regulations, policies, and industry standards that call for IT auditing make it hard for organizations to consistent
Guilherme Guedes Xavier
Recently, as part of the restructuring of the global economy, new technologies and new consumer trends have led to changes in markets around the world. As a result of this restructuring, some markets have faced extremely high degrees of competition, which led to the emergence of new concepts in the productive sector. One of the most important was the spread of JIT systems outside Japan in the early 80s. Currently, another prominent concept, named supply-chain management, is emerging. In this paper, I take an information processing perspective to analyze both and conclude that, although complementary in dealing with uncertainties, they use different information processing strategies. Understanding the concepts and the relationship between them is important to their further development and diffusion among the practitioner and scholarly communities.
Man-machine communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants are, e.g., alarm reduction, disturbance analysis, and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering efforts. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates specification and implementation of the engineering know-how necessary for sophisticated operator aids.
Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on…
Full Text Available The study of radar polarization information acquisition and processing is currently an important part of radar techniques. The development of polarization theory is briefly reviewed first. Subsequently, some key techniques, including polarization measurement, polarization anti-jamming, polarization recognition, imaging, and parameter inversion using radar polarimetry, are analyzed in detail in this paper. The basic theories, present state, and development trends of these key techniques are presented and some meaningful conclusions are drawn.
Spagnolo, Nicolo'; Vitelli, Chiara; Giacomini, Sandro; Sciarrino, Fabio; De Martini, Francesco
We present the realization of an ultrafast shutter for optical fields, based on a self-stabilized interferometer, which preserves a generic polarization state. It exhibits high (or low) transmittivity when turned on (or inactive), while the fidelity of the polarization state remains high. The shutter is realized with two beam-displacing prisms and a longitudinal Pockels cell. It can represent a useful tool for controlling light-atom interfaces in quantum information processing.
Silbering, Ana F.; Okada, Ryuichi; Ito, Kei; Galizia, Cosmas Giovanni
When an animal smells an odor, olfactory sensory neurons generate an activity pattern across olfactory glomeruli of the first sensory neuropil, the insect antennal lobe or the vertebrate olfactory bulb. Here, several networks of local neurons interact with sensory neurons and with output neurons (insect projection neurons, or vertebrate mitral/tufted cells). The extent and form of information processing taking place in these local networks has been a subject of controversy. To investigate the ro…
Jason M Gold
Full Text Available Perception is often influenced by context. A well-known class of perceptual context effects is perceptual contrast illusions, in which proximate stimulus regions interact to alter the perception of various stimulus attributes, such as perceived brightness, color and size. Although the phenomenal reality of contrast effects is well documented, in many cases the connection between these illusions and how information is processed by perceptual systems is not well understood. Here, we use noise as a tool to explore the information processing correlates of one such contrast effect: the Ebbinghaus-Titchener size-contrast illusion. In this illusion, the perceived size of a central dot is significantly altered by the sizes of a set of surrounding dots, such that the presence of larger surrounding dots tends to reduce the perceived size of the central dot (and vice versa). In our experiments, we first replicated previous results that have demonstrated the subjective reality of the Ebbinghaus-Titchener illusion. We then used visual noise in a detection task to probe the manner in which observers processed information when experiencing the illusion. By correlating the noise with observers' classification decisions, we found that the sizes of the surrounding contextual elements had a direct influence on the relative weight observers assigned to regions within and surrounding the central element. Specifically, observers assigned relatively more weight to the surrounding region and less weight to the central region in the presence of smaller surrounding contextual elements. These results offer new insights into the connection between the subjective experience of size-contrast illusions and their associated information processing correlates.
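The noise-classification technique in this abstract can be sketched numerically. The snippet below is an illustrative reconstruction, not the authors' code: a simulated observer decides based on the noise energy inside a hypothetical central-dot region, and correlating the per-trial noise fields with those decisions recovers the spatial weighting map.

```python
import numpy as np

rng = np.random.default_rng(0)

n_trials, size = 2000, 32          # made-up trial count and image size
yy, xx = np.mgrid[:size, :size]
center = (xx - size/2)**2 + (yy - size/2)**2 < 6**2  # central-dot region

# Simulated observer: responds "present" when mean noise in the
# central region exceeds zero (a stand-in for a real observer).
noise = rng.normal(0, 1, (n_trials, size, size))
decisions = noise[:, center].mean(axis=1) > 0

# Classification image: mean noise on "present" trials minus mean
# noise on "absent" trials; large values mark heavily weighted pixels.
ci = noise[decisions].mean(axis=0) - noise[~decisions].mean(axis=0)

print(ci[center].mean() > ci[~center].mean())  # central region dominates
```

With many trials, pixels the simulated observer actually uses stand out in the classification image, which is the logic behind inferring observers' regional weighting from noise.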
Direct thermodynamic measurements of the energetics of information processing: final DURIP report (principal investigator: Michael L. Roukes). These initial data directly demonstrate our ability to drive and detect nanomechanical motion at ultralow…
Nagle, Gail; Alger, Linda; Kemp, Alexander
The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is given, followed by a more detailed description of the network architecture.
Dodin, I.Y.; Fisch, N.J.
A method of producing holograms of three-dimensional optical pulses is proposed. It is shown that both the amplitude and the phase profile of a three-dimensional optical pulse can be stored in dynamic perturbations of a Raman medium, such as plasma. By employing Raman scattering in a nonlinear medium, information carried by a laser pulse can be captured in the form of a slowly propagating low-frequency wave that persists for a time long compared with the pulse duration. If such a hologram is then probed with a short laser pulse, the information stored in the medium can be retrieved in a second scattered electromagnetic wave. The recording and retrieving processes can robustly conserve the pulse shape, thus enabling high-fidelity recording and retrieval of information stored in optical signals. While storing or reading the pulse structure, the optical information can be processed as an analogue or digital signal, which allows simultaneous transformation of three-dimensional continuous images or computing with discrete arrays of binary data. By adjusting the phase fronts of the reference pulses, one can also perform focusing, redirection, and other types of transformation of the output pulses.
Full Text Available Catechols offer diverse properties and are used in biology to perform various functions that range from adhesion (e.g., mussel proteins) to neurotransmission (e.g., dopamine), and mimicking the capabilities of biological catechols has yielded important new materials (e.g., polydopamine). It is well known that catechols are also redox-active, and we have observed that biomimetic catechol-modified chitosan films are redox-active and possess interesting molecular electronic properties. In particular, these films can accept, store and donate electrons, and thus offer redox-capacitor capabilities. We are enlisting these capabilities to bridge communication between biology and electronics. Specifically, we are investigating an interactive redox-probing approach to access redox-based chemical information and convert this information into an electrical modality that facilitates analysis by methods from signal processing. In this review, we describe the broad vision and then cite recent examples in which the catechol–chitosan redox-capacitor can assist in accessing and understanding chemical information. Further, this redox-capacitor can be coupled with synthetic biology to enhance the power of chemical information processing. Potentially, the progress with this biomimetic catechol–chitosan film may even help in understanding how biology uses the redox properties of catechols for redox signaling.
Ismael de Moura Costa
Full Text Available Introduction: This paper presents the evolution of the MAIA Method for Applied Information Architecture, its structure, the results obtained, and three practical applications. Objective: To propose a methodological construct for the treatment of complex information, distinguishing information spaces and revealing the inherent configurations of those spaces. Methodology: The argument is elaborated from theoretical research of an analytical character, using distinction as a way to express concepts. Phenomenology is used as a philosophical position, which considers the correlation between Subject↔Object. The research also considers the notion of interpretation as an integrating element for the definition of concepts. With these postulates, the steps to transform information spaces are formulated. Results: The article shows how the method is structured to process information in its contexts, starting from a succession of evolutionary cycles, divided into moments, which in turn evolve into transformation acts. Conclusions: Besides presenting the method's structure, the article discusses its possible applications as a scientific method, as a configuration tool for information spaces, and as a generator of ontologies. Last but not least, it presents a brief summary of the analysis made by researchers who have already evaluated the method considering the three aspects mentioned.
Stoverinck, F.; Dubois, G.; Amelung, B.
Climate change forces society to adapt. Adaptation strategies are preferably based on the best available climate information. Climate projections, however, often inform adaptation strategies after being interpreted once or several times. This process affects the original message put forward by climate scientists when presenting the basic climate projections, in particular regarding uncertainties. The nature of this effect and its implications for decision-making are as yet poorly understood. This paper explores the nature and consequences of a) the communication tools used by scientists and experts, and b) changes in the communicated information as it travels through the decision-making process. It does so by analysing the interpretative steps taken in a sample of 25 documents, pertaining to the field of public policies for climate change impact assessment and adaptation strategies. Five phases in the provisioning of climate information are distinguished: pre-existing knowledge (i.e. climate models and data), climate-change projection, impact assessment, adaptation strategy, and adaptation plan. Between the phases, climate information is summarized and synthesised in order to be passed on. The results show that in the sample information on uncertainty is under-represented: e.g., studies focus on only one scenario and/or disregard probability distributions. In addition, visualization tools are often used ineffectively, leading to confusion and unintended interpretations. Several recommendations are presented. Better training of climatologists in communication issues, as well as training in climatology for decision-makers, is required, along with more cautious and robust adaptation strategies that account for the uncertainty inherent to climate projections. (authors)
Papadakis, Emmanouil; Anantpinijwatna, Amata; Woodley, John
This article describes the development of a reaction database with the objective to collect data for multiphase reactions involved in small-molecule pharmaceutical processes, with a search engine to retrieve necessary data in investigations of reaction-separation schemes, such as the role of organic…; compounds participating in the reaction; use of organic solvents and their function; information for single-step and multistep reactions; target products; reaction conditions and reaction data. Information for reactor scale-up together with information for the separation and other relevant information…
Full Text Available Objective - The main purpose of this study was to understand the information research process of experienced online information researchers in a variety of disciplines, gather their ideas for improvement and as part of this to validate a proposed research framework for use in future development of Ontario’s Scholars Portal.Methods - This was a qualitative research study in which sixty experienced online information researchers participated in face-to-face workshops that included a collaborative design component. The sessions were conducted and recorded by usability specialists who subsequently analyzed the data and identified patterns and themes.Results - Key themes included the similarities of the information research process across all disciplines, the impact of interdisciplinarity, the social aspect of research and opportunities for process improvement. There were many specific observations regarding current and ideal processes. Implications for portal development and further research included: supporting a common process while accommodating user-defined differences; supporting citation chaining practices with new opportunities for data linkage and granularity; enhancing keyword searching with various types of intervention; exploring trusted social networks; exploring new mental models for data manipulation while retaining traditional objects; improving citation and document management. Conclusion – The majority of researchers in the study had almost no routine in their information research processes, had developed few techniques to assist themselves and had very little awareness of the tools available to help them. There are many opportunities to aid researchers in the research process that can be explored when developing scholarly research portals. That development will be well guided by the framework ‘discover, gather, synthesize, create, share.’
Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information, and automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer, to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into…
Atick, Joseph J
The sensory pathways of animals are well adapted to processing a special class of signals, namely stimuli from the animal's environment. An important fact about natural stimuli is that they are typically very redundant and hence the sampled representation of these signals formed by the array of sensory cells is inefficient. One could argue for some animals and pathways, as we do in this review, that efficiency of information representation in the nervous system has several evolutionary advantages. Consequently, one might expect that much of the processing in the early levels of these sensory pathways could be dedicated towards recoding incoming signals into a more efficient form. In this review, we explore the principle of efficiency of information representation as a design principle for sensory processing. We give a preliminary discussion on how this principle could be applied in general to predict neural processing and then discuss concretely some neural systems where it recently has been shown to be successful. In particular, we examine the fly's LMC coding strategy and the mammalian retinal coding in the spatial, temporal and chromatic domains.
Full Text Available Nestmate recognition is a hallmark of social insects. It is based on the match/mismatch of an identity signal carried by members of the society with that of the perceiving individual. While the behavioral response, amicable or aggressive, is very clear, the neural systems underlying recognition are not fully understood. Here we contrast two alternative hypotheses for the neural mechanisms that are responsible for the perception and information processing in recognition. We focus on recognition via chemical signals, as the common modality in social insects. The first, classical, hypothesis states that upon perception of recognition cues by the sensory system the information is passed as is to the antennal lobes and to higher brain centers where the information is deciphered and compared to a neural template. Match or mismatch information is then transferred to some behavior-generating centers where the appropriate response is elicited. An alternative hypothesis, that of “pre-filter mechanism”, posits that the decision as to whether to pass on the information to the central nervous system takes place in the peripheral sensory system. We suggest that, through sensory adaptation, only alien signals are passed on to the brain, specifically to an “aggressive-behavior-switching center”, where the response is generated if the signal is above a certain threshold.
Gerlach, Christian; Klargaard, Solja K.; Petersen, Anders
There is accumulating evidence suggesting that a central deficit in developmental prosopagnosia (DP), a disorder characterized by profound and lifelong difficulties with face recognition, concerns impaired holistic processing. Some of this evidence comes from studies using Navon's paradigm, where individuals with DP show a greater local or reduced global bias compared with controls. However, it has not been established what gives rise to this altered processing bias. Is it a reduced global precedence effect, changes in susceptibility to interference effects, or both? By analyzing the performance of 10 individuals with DP in Navon's paradigm, we find evidence of a reduced global precedence effect: the DPs are slower than controls to process global but not local shape information. Importantly, and in contrast to previous studies, we demonstrate that the DPs perform normally in a comprehensive test of visual…
Full Text Available Building Information Modeling (BIM) itself contains huge potential for increasing the effectiveness of every project across its whole life cycle: from the initial investment plan, through design and building-up activities, to long-term usage and property maintenance, and finally demolition. Knowledge Management, or better, Knowledge Sharing, covers two sets of tools: managerial and technological. Managers' needs are the real expectations and desires of final users in terms of how they could benefit from managing long-term projects covering the whole life cycle, sparing investment money and other resources. The technology employed can help BIM processes support and deliver these benefits to users: how to use this technology for data and metadata collection, storage, and sharing, and which processes these new technologies may support. We also touch on a proposal for optimized processes to better and more smoothly support knowledge sharing within the project time-scale, covering its whole life cycle.
Sarah E. Marzen
Full Text Available Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use the resulting formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model, but with an infinite-state ϵ-machine presentation. All in all, the results lay the groundwork for analyzing more complex processes with infinite statistical complexity and infinite excess entropy.
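As a minimal numerical companion to this abstract (not the paper's derivation), a discrete-time renewal process can be simulated by drawing interevent counts from a chosen probability law, and the entropy of a single measurement estimated from the empirical symbol frequency. The geometric gap law and its parameter below are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete-time renewal process: a "1" marks an event, interevent gaps
# drawn i.i.d. from a given count distribution (here geometric, p=0.3).
p = 0.3
gaps = rng.geometric(p, size=200_000)          # gaps >= 1
seq = np.zeros(gaps.sum(), dtype=int)
seq[np.cumsum(gaps) - 1] = 1                   # place the events

# Single-measurement entropy H(X) from the empirical symbol frequency;
# for a geometric gap law the event rate is exactly p.
rate = seq.mean()
H = -(rate * np.log2(rate) + (1 - rate) * np.log2(1 - rate))
print(round(rate, 2), round(H, 2))
```

For the geometric case the mean gap is 1/p, so the event rate converges to p and H to the binary entropy of p; richer gap laws (like the heavy-tailed ones the paper studies) plug into the same scaffold.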
Golightly, Andrew; Wilkinson, Darren J
In this paper we consider the problem of parameter inference for Markov jump process (MJP) representations of stochastic kinetic models. Since transition probabilities are intractable for most processes of interest yet forward simulation is straightforward, Bayesian inference typically proceeds through computationally intensive methods such as (particle) MCMC. Such methods ostensibly require the ability to simulate trajectories from the conditioned jump process. When observations are highly informative, use of the forward simulator is likely to be inefficient and may even preclude an exact (simulation based) analysis. We therefore propose three methods for improving the efficiency of simulating conditioned jump processes. A conditioned hazard is derived based on an approximation to the jump process, and used to generate end-point conditioned trajectories for use inside an importance sampling algorithm. We also adapt a recently proposed sequential Monte Carlo scheme to our problem. Essentially, trajectories are reweighted at a set of intermediate time points, with more weight assigned to trajectories that are consistent with the next observation. We consider two implementations of this approach, based on two continuous approximations of the MJP. We compare these constructs for a simple tractable jump process before using them to perform inference for a Lotka-Volterra system. The best performing construct is used to infer the parameters governing a simple model of motility regulation in Bacillus subtilis.
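The forward-simulation step this abstract builds on can be sketched with a toy example. The snippet below is an illustrative stand-in, not the paper's proposed constructs: it runs a Gillespie simulation of a simple immigration-death MJP with made-up rates, then applies plain importance sampling, weighting each trajectory by a Gaussian observation density at the end point.

```python
import numpy as np

rng = np.random.default_rng(2)

lam, mu, T = 10.0, 1.0, 2.0     # immigration/death rates (illustrative)
y_obs, sd = 12, 1.0             # made-up noisy end-point observation

def gillespie(x0=0):
    """Forward-simulate the immigration-death MJP on [0, T]."""
    x, t = x0, 0.0
    while True:
        total = lam + mu * x            # total event rate in state x
        t += rng.exponential(1.0 / total)
        if t > T:
            return x
        x += 1 if rng.random() < lam / total else -1

# Importance sampling: weight each forward trajectory by the Gaussian
# observation density at its end point, then form the weighted mean.
ends = np.array([gillespie() for _ in range(5000)])
w = np.exp(-0.5 * ((ends - y_obs) / sd) ** 2)
posterior_mean = np.sum(w * ends) / np.sum(w)
print(posterior_mean)
```

When the observation is far out in the prior's tail, most weights collapse, which is exactly the inefficiency of the plain forward simulator that motivates the conditioned-hazard and reweighted sequential Monte Carlo constructs described above.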
Defense Documentation Center, Alexandria, VA.
This unclassified-unlimited bibliography contains 183 references, with abstracts, dealing specifically with optical or graphic information processing. Citations are grouped under three headings: display devices and theory, character recognition, and pattern recognition. Within each group, they are arranged in accession number (AD-number) sequence.
The main objective of this tutorial article is first to review the main inference tools using the Bayesian approach, entropy, information theory, and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal, and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information, we study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation, and finally, general linear inverse problems.
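The basic quantities this tutorial reviews can be illustrated with a short sketch; the two-model prior and likelihood values below are invented purely for demonstration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Bayes rule on a discrete parameter: posterior ∝ likelihood × prior.
prior = np.array([0.5, 0.5])            # two candidate models
like  = np.array([0.8, 0.3])            # made-up P(data | model)
post  = like * prior / np.sum(like * prior)

print(entropy(prior))          # 1.0 bit: maximal uncertainty
print(round(entropy(post), 3))
print(round(kl(post, prior), 3))
```

Note that for a uniform prior over two models, D(posterior || prior) equals one bit minus the posterior entropy, so the KL divergence directly measures the information the data provided.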
Environmental information is presented relating to a staged version of the proposed Defense Waste Processing Facility (DWPF) at the Savannah River Plant. The information is intended to provide the basis for an Environmental Impact Statement. In either the integral or the staged design, the DWPF will convert the high-level waste currently stored in tanks into: a leach-resistant form containing about 99.9% of all the radioactivity, and a residual, slightly contaminated salt, which is disposed of as saltcrete. In the first stage of the staged version, the insoluble sludge portion of the waste and the long lived radionuclides contained therein will be vitrified. The waste glass will be sealed in canisters and stored onsite until shipped to a Federal repository. In the second stage, the supernate portion of the waste will be decontaminated by ion exchange. The recovered radionuclides will be transferred to the Stage 1 facility, and mixed with the sludge feed before vitrification. The residual, slightly contaminated salt solution will be mixed with Portland cement to form a concrete product (saltcrete) which will be buried onsite in an engineered landfill. This document describes the conceptual facilities and processes for producing glass waste and decontaminated salt. The environmental effects of facility construction, normal operations, and accidents are then presented. Descriptions of site and environs, alternative sites and waste disposal options, and environmental consultations and permits are given in the base Environmental Information Document
... Information Collection: Technical Processing Requirements for Multifamily Project Mortgage Insurance AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice. SUMMARY: HUD has submitted the proposed... Information Collection Title of Information Collection: Technical Processing Requirements for Multifamily...
...-01] Announcing Approval of Federal Information Processing Standard (FIPS) Publication 201-2, Personal... Commerce's approval of Federal Information Processing Standard (FIPS) Publication 201-2, Personal Identity... Information Processing Standards (FIPS). Homeland Security Presidential Directive (HSPD) 12, entitled ``Policy...
Schultze, Thomas; Pfeiffer, Felix; Schulz-Hardt, Stefan
Escalation of commitment denotes decision makers' increased reinvestment of resources in a losing course of action. Despite the relevance of this topic, little is known about how information is processed in escalation situations, that is, whether decision makers who receive negative outcome feedback on their initial decision search for and/or process information in a biased manner, and whether these biases contribute to escalating commitment. Contrary to a widely cited study by E. J. Conlon and J. M. Parks (1987), in 3 experiments, the authors found that biases do not occur on the level of information search. Neither in a direct replication and extension of the original study with largely increased test power (Experiment 1) nor under methodologically improved conditions (Experiments 2 and 3) did decision makers responsible for failure differ from nonresponsible decision makers with regard to information search, and no selective search for information supporting the initial decision or voting for further reinvestment was observed. However, Experiments 3 and 4 show that the evaluation of the previously sought information is biased among participants who were responsible for initiating the course of action. Mediation analyses show that this evaluation bias in favor of reinvestment partially mediated the responsibility effect on escalation of commitment.
All natural processes are accompanied by fluctuations, characterized as noise or chaos. Biological systems, which have evolved during billions of years, are likely to have adapted, not only to cope with such fluctuations, but also to make use of them. We investigate how the complex dynamics of the brain, including oscillations, chaos and noise, can affect the efficiency of neural information processing. In particular, we consider the amplification and functional role of internal fluctuations. Using computer simulations of a neural network model of the olfactory cortex and hippocampus, we demonstrate how microscopic fluctuations can result in global effects at the network level. We show that the rate of information processing in associative memory tasks can be maximized for optimal noise levels, analogous to stochastic resonance phenomena. Noise can also induce transitions between different dynamical states, which could be of significance for learning and memory. A chaotic-like behavior, induced by noise or by an increase in neuronal excitability, can enhance system performance if it is transient and converges to a limit cycle memory state. We speculate whether this dynamical behavior perhaps could be related to (creative) thinking.
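The noise optimum described in this abstract is analogous to stochastic resonance, which can be demonstrated with a minimal toy model rather than the authors' cortical network: a subthreshold sinusoid plus noise passed through a hard threshold, where the signal-output correlation peaks at an intermediate noise level. All parameter values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

t = np.arange(2000)
signal = 0.5 * np.sin(2 * np.pi * t / 100)   # subthreshold periodic input
threshold = 1.0                              # never crossed without noise

def coherence(noise_sd, trials=100):
    """Correlation between the input signal and the thresholded output,
    averaged over noisy trials (a simple stochastic-resonance measure)."""
    c = 0.0
    for _ in range(trials):
        out = (signal + rng.normal(0, noise_sd, t.size)) > threshold
        c += np.corrcoef(signal, out.astype(float))[0, 1]
    return c / trials

low, mid, high = coherence(0.3), coherence(0.5), coherence(5.0)
print(low < mid > high)   # peak at an intermediate noise level
```

Too little noise and the threshold is rarely crossed; too much and crossings become indiscriminate; at intermediate levels the noise lifts the signal over threshold preferentially near its peaks, mirroring the optimal-noise effect reported for the associative memory tasks above.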
Dixit, Abhinav; Goyal, Abhishek; Thawani, Rajat; Vaney, Neelam
Caffeine is a psychostimulant present in various beverages and known to alter alertness and performance by acting on the central nervous system. Its effects on the central nervous system have been studied using EEG, evoked potentials, fMRI, and neuropsychological tests. The Stroop task is a widely used tool in psychophysiology to understand attention processes and is based on the principle that processing of two different kinds of information (like the word or the colour) is parallel and at different speeds with a common response channel. Our aim was to study the effect of caffeine on the classical colour-word Stroop task. This study was conducted on 30 male undergraduate students by performing the test before and 40 minutes after consumption of 3 mg/kg caffeine and evaluating the effect of caffeine on Stroop interference and facilitation. The results revealed that practice has no effect on performance in a Stroop task. However, there was a reduction in Stroop interference and an increase in facilitation after consumption of caffeine, as was evident from changes in the reaction times in response to neutral, incongruent, and congruent stimuli. We hypothesize that caffeine led to faster processing of relevant information.
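Stroop interference and facilitation are simple contrasts of mean reaction times across conditions; the sketch below uses invented RT values, not the study's data, purely to show the arithmetic.

```python
# Stroop interference and facilitation from mean reaction times (ms).
# The RT values below are made up for illustration only.
rt = {"neutral": 720.0, "congruent": 690.0, "incongruent": 810.0}

interference = rt["incongruent"] - rt["neutral"]   # cost of conflict
facilitation = rt["neutral"] - rt["congruent"]     # benefit of agreement

print(interference, facilitation)   # 90.0 30.0
```

A caffeine effect like the one reported would appear as a smaller interference value and a larger facilitation value in the post-consumption session.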
Sidorova, Alexandra M.
The paper deals with the impact of informal institutions on the direction and development of integration processes in the countries of the Customs Union and Ukraine. The degree to which different economic schools have studied this phenomenon is assessed. Economic mentality is a basic informal institution that determines the effectiveness of integration processes. The paper examines the nature, characteristics and effects of economic mentality on people's economic activities. An ethnometric method makes it possible to quantify economic mentality, enabling a deeper understanding and analysis of the formation and functioning of political and economic systems, especially business and management, and of the establishment of contacts with other cultures. The modern Belarusian economic mentality was measured using Hofstede's international methodology and compared with the economic mentalities of Russia, Ukraine and Kazakhstan. With the help of cluster analysis, the congruence of the economic mentalities of the Customs Union countries and Ukraine was determined. The economic mentality of these countries was also compared with that of other countries in order to identify the main types of economic culture.
The scientific work presented in this paper can be described as a novel, systemic approach to the organization of CLIC documentation, that is, to processing the various sets of archived data found on various CERN archiving services into a friendlier and more organized form. From a physics standpoint, this is equivalent to starting from an initial system characterized by high entropy which, after some transformation of energy and matter, yields a final system of reduced entropy. Such a reduction in entropy is possible only for open systems, which are sub-systems of larger isolated systems whose total entropy always increases. Thus, using elements from information theory, systems theory and thermodynamics as a basis, the unorganized data awaiting organization into a higher form is modeled as an initial open sub-system with high entropy which, after the processing of information, produces a final system with decreased entropy. This systemic approach to the ...
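The entropy framing can be made concrete with Shannon entropy over document labels; the labels below are invented for illustration:

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """H = -sum p(x) * log2 p(x) over the empirical label distribution."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# Unorganized archive: every item carries its own ad-hoc label
unorganized = ["a1", "b7", "x3", "q9", "m2", "k5", "r8", "t4"]
# Organized archive: the same items sorted into a few categories
organized = ["specs", "specs", "minutes", "minutes", "minutes",
             "drawings", "drawings", "drawings"]

print(shannon_entropy(unorganized))  # 3.0 bits: maximal disorder
print(shannon_entropy(organized))    # lower: organizing reduced the entropy
```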
Cresswell, Pip; Gilmour, Jean
Clinical trials are carried out with human participants to answer questions about the best way to diagnose, treat and prevent illness. Participants must give informed consent to take part in clinical trials, which requires an understanding of how clinical trials work and what their purpose is. Randomised controlled trials provide strong evidence, but their complex design is difficult for both clinicians and participants to understand. Increasingly, ensuring informed consent in randomised controlled trials has become part of the clinical research nurse role. The aim of this study was to explore in depth the clinical research nurse role in the informed consent process, using a qualitative descriptive approach. Three clinical research nurses were interviewed and the data analysed using a thematic analysis approach. Three themes were identified to describe the process of ensuring informed consent. The first theme, Preparatory partnerships, canvassed the relationships required prior to initiation of the informed consent process. The second theme, Partnering the participant, emphasises the need to ensure voluntariness and understanding, along with patient advocacy. The third theme, Partnership with the project, highlights the clinical research nurse's contribution to the capacity of the trial to answer the research question through appropriate recruiting and follow-up of participants. Gaining informed consent in randomised controlled trials was complex and required multiple partnerships. A wide variety of skills was used to protect the safety of trial participants and promote quality research. The information from this study contributes to a greater understanding of the clinical research nurse role, and suggests the informed consent process in trials can be nurse-led. In order to gain collegial, employer and industry recognition, it is important that this aspect of the nursing role is acknowledged.
Zhou, Chao; Wang, Ziying; Guo, Jing; Guo, Yajuan; Huang, Wei
Informatization has penetrated the whole process of production and operation in electric power enterprises. While it improves the level of lean management and quality of service, it also faces severe security risks. Internal network terminals form the outermost layer, and the most vulnerable nodes, of the internal network boundary. They are widely distributed, deeply deployed and large in number, and the technical skill and security awareness of users and of operation and maintenance personnel are uneven, which makes the internal network terminal the weakest link in information security. Through the implementation of managerial, technical and physical security measures, an internal network terminal security protection system should be established so as to fully protect the information security of internal network terminals.
Ponulak, Filip; Kasinski, Andrzej
The concept that neural information is encoded in the firing rate of neurons has been the dominant paradigm in neurobiology for many years. This paradigm has also been adopted by the theory of artificial neural networks. Recent physiological experiments demonstrate, however, that in many parts of the nervous system, neural code is founded on the timing of individual action potentials. This finding has given rise to the emergence of a new class of neural models, called spiking neural networks. In this paper we summarize basic properties of spiking neurons and spiking networks. Our focus is, specifically, on models of spike-based information coding, synaptic plasticity and learning. We also survey real-life applications of spiking models. The paper is meant to be an introduction to spiking neural networks for scientists from various disciplines interested in spike-based neural processing.
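A minimal example of the spike-timing paradigm is the leaky integrate-and-fire (LIF) neuron, one of the simplest spiking models covered by such surveys; the parameters below are illustrative, not taken from the paper:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: information is carried
# by spike times rather than by a rate variable. Parameters are illustrative.
def lif_spike_times(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                    v_thresh=1.0, v_reset=0.0):
    v, spikes = v_rest, []
    for step, i_in in enumerate(input_current):
        # Euler step of  dv/dt = (-(v - v_rest) + i_in) / tau
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset            # reset the membrane after the spike
    return spikes

# A constant supra-threshold input produces a regular spike train;
# a zero input produces none.
times = lif_spike_times([1.5] * 200)
print(times)
```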
Burns, DeWitt; Finckenor, Miria; Henrie, Ben
Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights, along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are access control, browsing, searching, reports, and record comparison. The search capability searches within any searchable file, so data can still be retrieved even if the desired metadata has not been associated. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.
Bergou, J.A.; Steinberg, A.M.; Mohseni, M.
Photons are ideal systems for carrying quantum information. Although performing large-scale quantum computation on optical systems is extremely demanding, non-scalable linear-optics quantum information processing may prove essential as part of quantum communication networks; in addition, the efficient (scalable) linear-optical quantum computation proposal relies on the same optical elements. Here, by constructing multirail optical networks, we experimentally study two central problems in quantum information science: optimal discrimination between nonorthogonal quantum states, and controlling decoherence in quantum systems. Quantum mechanics forbids deterministic discrimination between nonorthogonal states. This is one of the central features of quantum cryptography, which leads to secure communications. Quantum state discrimination is an important primitive in quantum information processing, since it determines the limitations of a potential eavesdropper, and it has applications in quantum cloning and entanglement concentration. In this work, we experimentally implement generalized measurements in an optical system and demonstrate the first optimal unambiguous discrimination between three non-orthogonal states, with a success rate of 55%, to be compared with the 25% maximum achievable using projective measurements. Furthermore, we present the first realization of unambiguous discrimination between a pure state and a nonorthogonal mixed state. In a separate experiment, we demonstrate how decoherence-free subspaces (DFSs) may be incorporated into a prototype optical quantum algorithm. Specifically, we present an optical realization of the two-qubit Deutsch-Jozsa algorithm in the presence of random noise. By introducing localized turbulent airflow we produce a collective optical dephasing, leading to large error rates, and demonstrate that with DFS encoding the error rate in the presence of decoherence can be reduced from 35% to essentially its pre…
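The gap between projective and generalized (POVM) measurements quoted above has a well-known two-state analogue, the Ivanovic-Dieks-Peres limit, given here as standard background rather than as a result of this work:

```latex
% Unambiguous discrimination of two equiprobable pure states
% |\psi_1\rangle, |\psi_2\rangle: the optimal success probability is
P_{\mathrm{succ}}^{\max} \;=\; 1 - \left|\langle \psi_1 | \psi_2 \rangle \right|
% (Ivanovic-Dieks-Peres limit). Reaching this bound requires a
% generalized (POVM) measurement with an inconclusive outcome;
% projective measurements fall short of it, which is the same effect
% that separates the 25% and 55% three-state rates in the experiment.
```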
This paper deals with innovation of the urban regeneration process in the context of transformations generated by information-communication technologies. On the one hand, Serbia has exceptional human potential: 13,000 graduates each year, and a 42% share of the population speaking English, the largest among all Eastern and Central European countries. This forms a basis for formulating strategies for information society development in Serbia, for knowledge-based economic adjustment, and for tracing the way to a future knowledge society, i.e. eEurope 2020. On the other hand, we are witnessing intensive development of huge complexes of mega- and hypermarkets as the currently dominant mode of regenerating our city spaces. At the same time, experience from other locations points to the deterioration of cities' urban identity as a consequence of the infiltration of global capital and of the development of huge multi-national company complexes within the urban tissue. Aiming to overcome the mistakes portrayed by international experience, as well as potential oversights that may occur because of routine and mismatches between certain phases of the sustainable development process, this paper emphasizes the importance of an integral evaluation of information society development trends and the spatial aspects of urban regeneration. It is essential to adjust devastated urban spaces, as artifacts of one technological era, to the present information era and the coming digital knowledge era, i.e. to plan, design and develop according to new technological requirements and to the possibilities for new working places and a new quality of living.
Quantin-Nataf, C.; Lozac'h, L.; Thollot, P.; Loizeau, D.; Bultel, B.; Fernando, J.; Allemand, P.; Dubuffet, F.; Poulet, F.; Ody, A.; Clenet, H.; Leyrat, C.; Harrisson, S.
MarsSI (an acronym for Mars System of Information; https://emars.univ-lyon1.fr/MarsSI/) is a web Geographic Information System application which helps manage and process martian orbital data. The MarsSI facility is part of the web portal called PSUP (Planetary SUrface Portal) developed by the Observatories of Paris Sud (OSUPS) and Lyon (OSUL) to provide users with efficient and easy access to data products dedicated to the martian surface. The portal offers 1) management and processing of data through MarsSI and 2) visualization and merging of high-level (imagery, spectral, and topographic) products and catalogs via a web-based user interface (MarsVisu). The portal PSUP and the facility MarsVisu are detailed in a companion paper (Poulet et al., 2018). The purpose of this paper is to describe the MarsSI facility. From this application, users are able to easily and rapidly select observations, process raw data via automatic pipelines, and retrieve final products which can be visualized in Geographic Information Systems. Moreover, MarsSI also contains an automatic stereo-restitution pipeline to produce Digital Terrain Models (DTM) on demand from HiRISE (High Resolution Imaging Science Experiment) or CTX (Context Camera) image pairs. This application is funded by the European Union's Seventh Framework Programme (FP7/2007-2013) (ERC project eMars, No. 280168) and has been developed in the scope of Mars, but the design is applicable to any other planetary body of the solar system.
Lee, Ciarán M.; Hoban, Matty J.
The violation of certain Bell inequalities allows for device-independent information processing secure against nonsignaling eavesdroppers. However, this only holds for the Bell network, in which two or more agents perform local measurements on a single shared source of entanglement. To overcome the practical constraints that entangled systems can only be transmitted over relatively short distances, large-scale multisource networks have been employed. Do there exist analogs of Bell inequalities for such networks, whose violation is a resource for device independence? In this Letter, the violation of recently derived polynomial Bell inequalities will be shown to allow for device independence on multisource networks, secure against nonsignaling eavesdroppers.
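For contrast with the polynomial inequalities of multisource networks, the single-source Bell-network case referenced here is the familiar CHSH inequality, stated below as standard background rather than as a result of this Letter:

```latex
% CHSH inequality (single-source Bell network), with correlators E(x,y)
% for measurement settings a, a' (Alice) and b, b' (Bob):
S \;=\; E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad |S| \le 2 .
% Local models obey |S| <= 2; quantum mechanics allows violations up to
% |S| = 2\sqrt{2} (Tsirelson's bound). In networks with several
% independent sources, the analogous classical bounds are no longer
% linear in the observed correlators, hence "polynomial" Bell
% inequalities.
```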
We propose an intelligent and efficient query processing approach for the semantic mediation of information systems, together with a generic multi-agent architecture that supports it. Our approach focuses on the exploitation of intelligent agents for query reformulation and on the use of a new technology for semantic representation. The algorithm adapts itself to changes in the environment, offers broad applicability, and resolves the various data conflicts dynamically; it also reformulates the query using schema mediation for the discovered systems and context mediation for the other systems.
A computer-based data management system has been developed to handle data associated with target fabrication processes, including glass microballoon characterization, gas filling, materials coating, and storage locations. The system provides automatic data storage and computation, flexible data entry procedures, fast access, automated report generation, and secure data transfer. It resides on a CDC CYBER 175 computer and is compatible with the CDC database language Query Update, but is based on custom Fortran software interacting directly with the CYBER's file management system. The described database maintains detailed, accurate, and readily available records of fusion target information.
Siomau, Michael; Jiang, Ning
Precise and elegant coordination of a prosthesis across many degrees of freedom represents a significant challenge to efficient rehabilitation of people with limb deficiency. Processing the electrical neural signals collected from the surface of the remnant muscles of the stump is a common way to initiate and control the different movements available to the artificial limb. Based on the assumption that there are distinguishable and repeatable signal patterns among different types of muscular activation, the problem of prosthesis control reduces to one of pattern recognition. Widely accepted classical methods for pattern recognition, however, cannot provide simultaneous and proportional control of the artificial limb. Here we show that, in principle, quantum information processing of the neural signals allows us to overcome the above-mentioned difficulties, suggesting a very simple scheme for myoelectric control of an artificial limb with advanced functionalities.
Buot, Anne; Welter, Marie-Laure; Karachi, Carine; Pochon, Jean-Baptiste; Bardinet, Eric; Yelnik, Jérôme; Mallet, Luc
The subthalamic nucleus (STN) is an efficient target for treating patients with Parkinson's disease, as well as patients with obsessive-compulsive disorder (OCD), using high frequency stimulation (HFS). In both Parkinson's disease and OCD patients, STN-HFS can trigger abnormal behaviours such as hypomania and impulsivity. To investigate whether this structure processes emotional information, and whether this depends on motor demands, we recorded subthalamic local field potentials in 16 patients with Parkinson's disease using deep brain stimulation electrodes. Recordings were made with and without dopaminergic treatment while patients performed an emotional categorisation paradigm in which the response varied according to stimulus valence (pleasant, unpleasant and neutral) and to the instruction given (motor, non-motor and passive). Pleasant, unpleasant and neutral stimuli evoked an event-related potential (ERP). Without dopamine medication, ERP amplitudes were significantly larger for unpleasant compared with neutral pictures, whatever the response triggered by the stimuli, and the magnitude of this effect was maximal in the ventral part of the STN. No significant difference in ERP amplitude was observed for pleasant pictures. With dopamine medication, ERP amplitudes were significantly increased for pleasant compared with neutral pictures whatever the response triggered by the stimuli, while ERP amplitudes to unpleasant pictures were not modified. These results demonstrate that the ventral part of the STN processes the emotional valence of stimuli independently of the motor context, and that dopamine enhances processing of pleasant information. These findings confirm the specific involvement of the STN in emotional processes in humans, which may underlie the behavioural changes observed in patients with deep brain stimulation.
By combining artificial intelligence concepts with the human information processing model of Rasmussen, a conceptual framework was developed for real-time artificial intelligence systems which provides a foundation for system organization, control and validation. The approach is based on describing system processing in terms of an abstraction hierarchy of states of knowledge. The states of knowledge are organized along one dimension corresponding to the extent to which the concepts are expressed in terms of the system inputs or in terms of the system response. Thus organized, the useful states form a generally triangular shape, with the sensors and effectors forming the lower two vertices and the fully evaluated set of courses of action the apex. Within the triangle boundaries are numerous processing paths which shortcut the detailed processing by connecting incomplete levels of analysis to partially defined responses. Shortcuts at different levels of abstraction include reflexes, sensory-motor control, rule-based behavior, and satisficing. This approach was used in the design of a real-time tactical decision aiding system, and in defining an intelligent aiding system for transport pilots.
Mental images are mental representations of people, objects and situations that are not present, formed by using the imagination. Many studies have addressed this psychological ability, its typology and its involvement in the academic environment. Along these lines, the aim of our study was to assess the information processing style (verbal, object and spatial scales) and the mental rotation ability commonly used by students from different specialties of Compulsory Secondary Education. To that end, two tests, the Mental Rotation Test (MRT) and the Object-Spatial Imagery and Verbal Questionnaire (OSIVQ), were administered to a sample of 126 Compulsory Secondary Education students. The MRT assessed differences in the ability to mentally rotate images depending on gender and specialty. Significant differences were found by specialty, showing that science students had a better ability to mentally rotate images than humanities students. Significant differences were also found by gender and specialty in the OSIVQ. Men showed a stronger spatial and verbal processing style than women, and humanities students excelled in object processing (in comparison to science students) and in verbal processing (in comparison to science and art students).
Basta, Claudia [DIRC Sustainable Urban Areas, Section of Material Science and Sustainable Construction, Delft University of Technology, Stevinweg 1, 2600 GA, Delft (Netherlands)]. E-mail: email@example.com; Neuvel, Jeroen M.M. [Land Use Planning, Wageningen University, Droevendaalsesteeg 3, Postbus 47, 6700 AA Wageningen (Netherlands)]. E-mail: firstname.lastname@example.org; Zlatanova, Sisi [Section GISt, OTB Research Institute for Housing, Urban and Mobility Studies, Delft University of Technology, Jaffalaan 9, P.O. Box 5030, 2600 GA, Delft (Netherlands)]. E-mail: email@example.com; Ale, Ben [Safety Science Group, TBM Faculty, Delft University of Technology, Jaffalaan 5, 2600 GA, Delft (Netherlands)
The definition of safety distances as required by Art. 12 of the Seveso II Directive on dangerous substances (96/82/EC) is necessary to minimize the consequences of potential major accidents. As they affect the land-use destinations of the areas involved, safety distances can be considered risk tolerability criteria with a territorial reflection. Recent studies have explored the suitability of Geographical Information System technologies to support their elaboration and visual rendering. In particular, the elaboration of GIS 'risk maps' has been recognized as serving two objectives: connecting spatial planners and safety experts during decision-making processes, and communicating risk to non-expert audiences. In order to elaborate on these findings and to verify their reflection in European practices, the article presents the results of a comparative study of recent developments in the United Kingdom and the Netherlands. Their land-use planning practices for areas falling under Seveso II requirements are explored. The role of GIS risk maps within decisional processes is analyzed, and their effect on the transparency and accessibility of risk information is commented on. Recommendations for further developments are given.
Prokushkin, Sergey F.; Galil, Erez
Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable by manual calibration; thus an automated approach is a must. We discuss an information-theory-based metric for evaluating a filter's adaptive characteristics (an "adaptivity criterion"), using noise reduction algorithms as an example. The method finds an "orthogonal decomposition" of the filter parameter space into "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it measures physical "information restoration" rather than perceived image quality, it helps reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).
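The idea of scoring a filter by information restoration rather than perceived quality can be sketched with a one-parameter toy calibration (the pipelines discussed above have on the order of 100 parameters; the signal, noise model and filter here are illustrative):

```python
import numpy as np

# Grid-search a single "filter strength" parameter of a moving-average
# denoiser, scoring each setting by fidelity to the known clean signal
# (MSE) rather than by subjective quality.
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 8 * np.pi, 512))
noisy = clean + rng.normal(0.0, 0.4, clean.size)

def box_filter(x, width):
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def mse(a, b):
    return float(np.mean((a - b) ** 2))

# Too little filtering leaves the noise in; too much destroys the signal,
# so the restoration score is best at an intermediate strength.
scores = {w: mse(box_filter(noisy, w), clean) for w in (1, 3, 5, 9, 17, 33, 65)}
best_width = min(scores, key=scores.get)
print(best_width, scores[best_width])
```

An automatic calibrator does the same search over many coupled parameters, which is why an objective cost function matters.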
van Reemst, Lisa; Fischer, Tamar F C; Zwirs, Barbara W C
The aim of the current literature review, which is based on 64 empirical studies, was to assess to what extent mechanisms of the Social Information Processing (SIP) model of Crick and Dodge (1994) are related to victimization. The reviewed studies have provided support for the relation between victimization and several social information processing mechanisms, especially the interpretation of cues and self-efficacy (as part of the response decision). The relationship between victimization and other mechanisms, such as the response generation, was only studied in a few articles. Until now research has often focused on just one step of the model, instead of attempting to measure the associations between multiple mechanisms and victimization in multivariate analyses. Such analyses would be interesting to gain more insight into the SIP model and its relationship with victimization. The few available longitudinal studies show that mechanisms both predict victimization (internal locus of control, negative self-evaluations and less assertive response selection) and are predicted by victimization (hostile attribution of intent and negative evaluations of others). Associations between victimization and SIP mechanisms vary across different types and severity of victimization (stronger in personal and severe victimization), and different populations (stronger among young victims). Practice could focus on these stronger associations and the interpretation of cues. More research is needed however, to investigate whether intervention programs that address SIP mechanisms are suitable for victimization and all relevant populations.
Hamstra, D A; de Kloet, E R; van Hemert, A M; de Rijk, R H; Van der Does, A J W
Oral contraceptives (OCs) affect mood in some women and may have more subtle effects on emotional information processing in many more users. Female carriers of mineralocorticoid receptor (MR) haplotype 2 have been shown to be more optimistic and less vulnerable to depression. To investigate the effects of oral contraceptives on emotional information processing and a possible moderating effect of MR haplotype. Cross-sectional study in 85 healthy premenopausal women of West-European descent. We found significant main effects of oral contraceptives on facial expression recognition, emotional memory and decision-making. Furthermore, carriers of MR haplotype 1 or 3 were sensitive to the impact of OCs on the recognition of sad and fearful faces and on emotional memory, whereas MR haplotype 2 carriers were not. Different compounds of OCs were included. No hormonal measures were taken. Most naturally cycling participants were assessed in the luteal phase of their menstrual cycle. Carriers of MR haplotype 2 may be less sensitive to depressogenic side-effects of OCs.
This book deals with modelling the management processes of information technology and IT projects. Its core is a model of information technology management and its component (contextual, local) models describing initial processing and the maturity capsule, as well as a decision-making system represented by a multi-level sequential model of IT technology selection, which is given a fuzzy rule-based implementation in this work. In terms of applicability, this work may also be useful for diagnosing the applicability of IT standards in the evaluation of IT organizations. The results of this diagnosis might prove valuable to those preparing new standards so that, apart from their own visions, they could take into account, to an even greater extent, the capabilities and needs of the leaders of project and manufacturing teams. The book is intended for IT professionals using the ITIL, COBIT and TOGAF standards in their work. Students of computer science and management who are interested in the issue of IT...
Harré, Michael S.
Iterated games have been an important framework of economic theory and application at least since Axelrod's computational tournaments of the early 1980s. Recent theoretical results have shown that games (the economic context) and game theory (the decision-making process) are both formally equivalent to computational logic gates. Here these results are extended to behavioural data obtained from an experiment in which rhesus monkeys sequentially played thousands of rounds of the “matching pennies” game, an empirical analogue of Axelrod's tournaments in which algorithms played against one another. The results show that the monkeys exhibit a rich variety of behaviours, both between and within subjects, when playing opponents of varying complexity. Despite earlier suggestions, there is no clear evidence that the win-stay, lose-switch strategy is used; however, there is evidence of non-linear strategy-based interactions between the predictors of future choices. It is also shown that there is consistent evidence, across protocols and across individuals, that the monkeys extract non-Markovian information, i.e., information from more than just the most recent state of the game. This work shows that the use of information theory in game theory can test important hypotheses that would otherwise be more difficult to extract using traditional statistical methods.
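Detecting non-Markovian play can be illustrated with mutual information on synthetic data (not the monkey data): compare how much the next choice is predicted by the previous two outcomes versus only the most recent one, for a deliberately history-dependent toy player.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * np.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# Toy order-2 rule: repeat the choice only after two identical outcomes
outcomes = rng.integers(0, 2, 3000)
choices = [int(outcomes[t - 1] == outcomes[t - 2]) for t in range(2, len(outcomes))]

mi_order1 = mutual_information(outcomes[1:-1].tolist(), choices)
mi_order2 = mutual_information(
    [(int(a), int(b)) for a, b in zip(outcomes[:-2], outcomes[1:-1])], choices)
print(mi_order1, mi_order2)  # order-2 history is far more informative
```

Because the toy rule depends on the pair of past outcomes and not on either alone, only the order-2 statistic carries appreciable information.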
An immediate need of information technology is the design of fast, small and low-loss devices. One way to design such devices is to use bosonic quasiparticles, such as magnons, for information transfer and processing; this is the main idea behind the field of magnonics. When a magnon propagates through a magnetic medium, no electrical charge is transported, and therefore no energy is lost to Joule heating. This is the most important advantage of using magnons for information transfer. Moreover, the mutual conversion between magnons and other carriers, e.g. electrons, photons and plasmons, opens new opportunities for realizing tunable multifunctional devices. Magnons cover a very wide range of frequencies, from sub-gigahertz up to a few hundred terahertz. The magnon frequency has an important impact on the performance of magnon-based devices: the larger the excitation frequency, the faster the magnons. This means that the use of high-frequency (terahertz) magnons would provide a great opportunity for the design of ultrafast devices. However, up to now the focus in magnonics has been on low-frequency gigahertz magnons. Here we discuss the feasibility of using terahertz magnons in magnonic devices and bring the concept of terahertz magnonics into the discussion. We discuss how recently discovered phenomena in the field of terahertz magnons may inspire ideas for designing new magnonic devices, and we introduce methods to tune the fundamental properties of terahertz magnons, e.g. their eigenfrequency and lifetime.
Lara, Antonio; Robledo Moreno, Javier; Guslienko, Konstantin Y; Aliev, Farkhad G
Low-dissipation data processing with spins is one of the promising directions for future information and communication technologies. Despite significant progress, the available magnonic devices are not yet broadband and have restricted capabilities for redirecting spin waves. Here we propose a breakthrough approach to spin wave manipulation in patterned magnetic nanostructures with unmatched characteristics, which exploits a spin wave analogue of the edge waves propagating along a water-wall boundary. Using theory, micromagnetic simulations and experiment, we investigate spin waves propagating along the edges of magnetic structures under an in-plane DC magnetic field inclined with respect to the edge. The proposed edge spin waves overcome important challenges faced by previous technologies, such as the manipulation of the spin wave propagation direction, and they substantially improve the capability of transmitting information at frequencies exceeding 10 GHz. The concept of edge spin waves makes it possible to design a broad range of logic devices, such as splitters, interferometers, or edge spin wave transistors, with unprecedented characteristics and a potentially strong impact on information technologies.
This article describes the development of a reaction database that collects data on multiphase reactions involved in small-molecule pharmaceutical processes, with a search engine to retrieve the data needed in investigations of reaction-separation schemes, such as the role of organic solvents in improving reaction performance. The focus of the database is to provide a data-rich environment with process information available to assist early-stage synthesis of pharmaceutical products. The database is structured in terms of classification of reaction types; compounds participating in the reaction; use of organic solvents and their function; information for single-step and multistep reactions; target products; reaction conditions; and reaction data. Information for reactor scale-up, together with information on the separation, other relevant details for each reaction, and references, is also available in the database. Additionally, the information retrieved from the database can be evaluated in terms of sustainability using well-known “green” metrics published in the scientific literature. The application of the database is illustrated through the synthesis of ibuprofen, for which data on different reaction pathways have been retrieved from the database and compared using “green” chemistry metrics.
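A structure of this kind can be sketched as a set of typed records with a simple field-matching search. The sketch below is illustrative only: the field names and the two ibuprofen-route entries are hypothetical simplifications, not the database's actual schema.

```python
from dataclasses import dataclass

@dataclass
class ReactionRecord:
    # Hypothetical fields mirroring the structure described above:
    # reaction type, participating compounds, solvent and its function,
    # target product, and a reaction condition.
    reaction_type: str
    compounds: list
    solvent: str
    solvent_function: str
    target_product: str
    temperature_c: float

def search(records, **criteria):
    """Return records whose fields match all of the given criteria."""
    return [r for r in records
            if all(getattr(r, k) == v for k, v in criteria.items())]

# Two illustrative steps of an ibuprofen-type route (simplified).
db = [
    ReactionRecord("Friedel-Crafts acylation",
                   ["isobutylbenzene", "acetic anhydride"],
                   "HF", "catalyst/solvent",
                   "4-isobutylacetophenone", 60.0),
    ReactionRecord("hydrogenation",
                   ["4-isobutylacetophenone"],
                   "methanol", "reaction medium",
                   "1-(4-isobutylphenyl)ethanol", 25.0),
]

hits = search(db, reaction_type="hydrogenation")
```

A query on any combination of fields (reaction type, solvent, product) then retrieves the matching records for comparison of alternative pathways.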
Mosier, K. L.; Hart, S. G.
State-of-the-art flight technology has restructured the task of human operators: it has decreased the need for physical and sensory resources while increasing, and qualitatively changing, the cognitive effort required. Recent technological advances have the greatest potential to affect pilots in two areas: performance and mental workload. In an environment in which timing is critical, additional cognitive processing can degrade performance and increase a pilot's perception of the mental workload involved. The effects of stimulus-processing demands on motor-response performance and subjective mental workload are examined using different combinations of response-selection and target-acquisition tasks. The information-processing demands of response selection were varied (e.g., Sternberg memory-set tasks, math equations, pattern matching), as was the difficulty of response execution. Response latency and subjective workload ratings varied with the cognitive complexity of the task, and movement times varied with the difficulty of the response-execution task. Implications for real-world flight situations are discussed.
Modjaev, A. D.; Leonova, N. M.
In recent years a new scientific branch concerned with the management of the social sphere, called "social cybernetics," has been developing intensively. Within this branch, the theory and methods of social-sphere management are being formed, with considerable attention paid to management directly in real time. The solution of such management tasks, however, is largely constrained by the absence, or insufficient development, of the relevant sections of the theory and methods of management. The article discusses the use of cybernetic principles in solving control problems in social systems. Applied to educational activities, a model of composite interrelated objects representing the behaviour of students at various stages of the educational process is introduced, and statistical processing of experimental data obtained during the actual learning process is carried out. When the number of features used is increased, additionally taking into account the degree and nature of variability in students' current progress during various types of studies, new grouping properties of the students are discovered. L-clusters were identified that reflect the behaviour of learners with similar characteristics during lectures. It was established that the characteristics of the clusters contain information about the dynamics of learners' behaviour, allowing them to be used in additional lessons. Approaches to adaptive control based on the identified dynamic characteristics of the learners are outlined.
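The grouping of learners by current-progress features can be illustrated with a minimal one-dimensional k-means sketch. The scores and the choice of k are hypothetical; the article's L-clusters are based on richer feature sets.

```python
def kmeans_1d(values, k, iters=20):
    """Group scalar features (e.g., a student's mean current-progress
    score) into k clusters; centers start at evenly spaced quantiles
    so the run is deterministic (assumes k >= 2)."""
    srt = sorted(values)
    centers = [srt[(len(srt) - 1) * i // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[nearest].append(v)
        # Recompute each center as its cluster mean (keep it if empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical per-student mean scores on current-progress checks.
scores = [52, 55, 58, 71, 74, 76, 90, 92, 95]
centers, clusters = kmeans_1d(scores, k=3)
```

Each resulting cluster groups learners with similar progress dynamics, which is the kind of grouping the article uses to plan additional lessons.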
Tavakolian, Kouhyar; Vaseghi, Ali; Kaminska, Bozena
In this paper a novel methodology for processing the ballistocardiogram (BCG) is proposed in which the respiration signal is used to improve the averaging of the BCG signal and, ultimately, its annotation and interpretation. Previous research filtered out the respiration signal; the novelty of the current work is that, rather than removing the respiration effect, we use the respiration information to improve the averaging, and thus the analysis and interpretation, of the BCG signal in the diagnosis of cardiac malfunction. The methodology is based on our observation that BCG cycles corresponding to the inspiration and expiration phases of the respiratory cycle differ in morphology. BCG cycles corresponding to the expiration phase proved to be more closely related to each other than cycles corresponding to inspiration, and expiration cycles are therefore better candidates for calculating the averaged BCG signal. The new BCG average calculated with this methodology is then taken as the representative template of the BCG signal for further processing. This template can be considered the output of a clinical BCG instrument with higher reliability and accuracy than previous processing methods.
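A minimal sketch of the selection-and-averaging step, assuming each heartbeat-aligned BCG cycle has already been segmented and labelled with its respiration phase (the labels and sample values below are illustrative, not data from the paper):

```python
def average_expiration_cycles(cycles, phases):
    """Average only the BCG cycles that fall in the expiration phase.

    cycles: list of equal-length sample lists, one per heartbeat.
    phases: parallel list of 'insp' / 'exp' labels, assumed to come
    from a simultaneously recorded respiration signal.
    """
    selected = [c for c, p in zip(cycles, phases) if p == "exp"]
    n = len(selected)
    # Point-wise mean across the selected (expiration) cycles.
    return [sum(samples) / n for samples in zip(*selected)]

cycles = [[1.0, 2.0, 1.0],   # inspiration cycle (different morphology)
          [0.8, 1.6, 0.8],   # expiration cycle
          [0.6, 1.4, 0.6]]   # expiration cycle
template = average_expiration_cycles(cycles, ["insp", "exp", "exp"])
```

Restricting the average to morphologically similar (expiration) cycles is what keeps the resulting template sharp enough to serve as the representative waveform for annotation.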
Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H
Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design.
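One of the basic quantities involved, the mutual information between a signalling input and its response, can be computed directly from a joint distribution. A minimal sketch (the toy channel and its 10% error rate are illustrative, not taken from the paper):

```python
from math import log2

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, from a joint distribution
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p   # marginal over X
        py[y] = py.get(y, 0.0) + p   # marginal over Y
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Toy signalling channel: binary ligand level X, binary response Y,
# uniform input and a 10% readout error (numbers are illustrative).
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
bits = mutual_information(joint)   # about 0.53 bits for this channel
```

Quantities like this underlie the "information processing efficiency" characterisations discussed in the paper: a noiseless binary channel carries 1 bit, and noise in the readout eats into that budget.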
Fitzpatrick, Casey A.
Quantum information processing (QIP) is an interdisciplinary field concerned with the development of computers and information-processing systems that use quantum mechanical properties of nature to carry out their function. QIP systems have become vastly more practical since the turn of the century. Today, QIP applications span imaging, cryptographic security, computation, and simulation (quantum systems that mimic other quantum systems). Many important strategies improve quantum versions of classical information-system hardware, such as single-photon detectors and quantum repeaters. Another, more abstract, strategy engineers high-dimensional quantum state spaces, so that each successful event carries more information than traditional two-level systems allow. Photonic states in particular bring the added advantages of weak environmental coupling and data transmission near the speed of light, allowing for simpler control and lower system-design complexity. In this dissertation, several novel, scalable designs for practical high-dimensional linear-optical QIP systems are presented. First, a correlated-photon imaging scheme is reported that uses orbital angular momentum (OAM) states to detect rotational symmetries in objects through measurements and to build images out of those interactions. Then, a statistical detection method using chains of OAM superpositions distributed according to the Fibonacci sequence is established and expanded upon. The approach is shown to give rise to schemes for sorting, detecting, and generating the recursively defined high-dimensional states on which some quantum cryptographic protocols depend. Finally, an ongoing study based on a generalization of the standard optical multiport for applications in quantum computation and simulation is reported. The architecture allows photons to reverse momentum inside the device. This in turn enables realistic implementation of controllable linear-optical scattering vertices for
Hamstra, Danielle A; de Kloet, E Ronald; Quataert, Ina; Jansen, Myrthe; Van der Does, Willem
Carriers of MR-haplotype 1 and 3 (GA/CG; rs5522 and rs2070951) are more sensitive to the influence of oral contraceptives (OC) and menstrual-cycle phase on emotional information processing than MR-haplotype 2 (CA) carriers. We investigated whether this effect is associated with estradiol (E2) and/or progesterone (P4) levels. Healthy MR-genotyped premenopausal women were tested twice in a counterbalanced design. Naturally cycling (NC) women were tested in the early-follicular and mid-luteal phases, and OC users during OC intake and in the pill-free week. At both sessions E2 and P4 were assessed in saliva. Tests included implicit and explicit positive and negative affect, attentional blink accuracy, emotional memory, emotion recognition, and risky decision-making (gambling). MR-haplotype 2 homozygotes had higher implicit happiness scores than MR-haplotype 2 heterozygotes (p=0.031) and MR-haplotype 1/3 carriers (p < …), and differed from MR-haplotype 1/3 on the emotion recognition test (p=0.001). Practice effects were observed for most measures. The pattern of correlations between information processing and P4 or E2 differed between sessions, as did the moderating effects of the MR genotype. In the first session the MR genotype moderated the influence of P4 on implicit anxiety (sr=-0.30; p=0.005): higher P4 was associated with reduced implicit anxiety, but only in MR-haplotype 2 homozygotes (sr=-0.61; p=0.012). In the second session the MR genotype moderated the influence of E2 on the recognition of facial expressions of happiness (sr=-0.21; p=0.035): only in MR-haplotype 1/3 was higher E2 correlated with happiness recognition (sr=0.29; p=0.005). In the second session higher E2 and P4 were negatively correlated with accuracy on lag-2 trials of the attentional blink task (p < …), consistent with a moderating role of the MR genotype in hormonal effects on emotional information processing. This moderating effect may depend on the novelty of the situation.
It is difficult not to be amazed by the ability of the human brain to process, structure and memorize information. Even by the toughest standards, the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information-processing system is a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics, as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information-processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, yet the presentation is concise and the book well arranged. To stress the breadth of the book, let me just mention a few keywords: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally, the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction to the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the
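As a taste of the statistical-mechanics material, the Hopfield model mentioned above can be sketched in a few lines: Hebbian storage of a pattern and recall from a corrupted cue. This is a toy illustration, not code from the book.

```python
def train_hopfield(patterns):
    """Hebbian weight matrix for a Hopfield network (zero diagonal);
    patterns are lists of +1/-1 values."""
    n = len(patterns[0])
    return [[0 if i == j else sum(p[i] * p[j] for p in patterns) / n
             for j in range(n)] for i in range(n)]

def recall(w, state, steps=5):
    """Synchronous sign updates; for a stored pattern with small
    corruption this converges to the nearest attractor."""
    n = len(state)
    for _ in range(n and steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0
                 else -1 for i in range(n)]
    return state

stored = [1, 1, -1, -1, 1, -1]
w = train_hopfield([stored])
noisy = [1, -1, -1, -1, 1, -1]   # one bit flipped
recovered = recall(w, noisy)     # flows back to the stored pattern
```

The replica analysis covered in the book asks how many such patterns can be stored before recall like this breaks down.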
Mei Feng; Yu Yafei; Feng Mang; Zhang Zhiming
We present a scheme for scalable quantum information processing with atomic ensembles and flying photons. Using the Rydberg blockade, we encode the qubits in collective atomic states, which can be manipulated quickly and easily owing to the enhanced interaction compared with the single-atom case. We demonstrate that the proposed gates can be applied to the generation of two-dimensional cluster states for measurement-based quantum computation. Moreover, the atomic ensembles also function as quantum repeaters useful for long-distance quantum state transfer. We show that our scheme can work in the bad-cavity or weak-coupling regime, which greatly relaxes the experimental requirements. The efficient coherent operations on the ensemble qubits enable our scheme to switch between quantum computation and quantum communication using atomic ensembles.
Leibfried, D.; Wineland, D. J.; Blakestad, R. B.; Bollinger, J. J.; Britton, J.; Chiaverini, J.; Epstein, R. J.; Itano, W. M.; Jost, J. D.; Knill, E.; Langer, C.; Ozeri, R.; Reichle, R.; Seidelin, S.; Shiga, N.; Wesenberg, J. H.
Recent theoretical advances have identified several computational algorithms that can be implemented utilizing quantum information processing (QIP), which gives an exponential speedup over the corresponding (known) algorithms on conventional computers. QIP makes use of the counter-intuitive properties of quantum mechanics, such as entanglement and the superposition principle. Unfortunately it has so far been impossible to build a practical QIP system that outperforms conventional computers. Atomic ions confined in an array of interconnected traps represent a potentially scalable approach to QIP. All basic requirements have been experimentally demonstrated in one and two qubit experiments. The remaining task is to scale the system to many qubits while minimizing and correcting errors in the system. While this requires extremely challenging technological improvements, no fundamental roadblocks are currently foreseen.
Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.
The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.
Casado, Alba; Palma, Alfonso; Paolieri, Daniela
Three different tasks (word repetition, lexical decision, and gender decision) were designed to explore the impact of sex cues (sex of the speaker, sex of the addressee) and the type of gender (semantic, arbitrary) on the processing of isolated Spanish gendered words. The findings showed that the grammatical gender feature was accessed when no mandatory attentional focus was required. In addition, the results indicated that participants organize information according to their own sex role, which gives more salience to words whose grammatical gender matches their own sex-role representation, even when the gender assignment is arbitrary. Finally, the sex of the speaker biased lexical access and grammatical gender selection, serving as a semantic prime when the two dimensions were congruent. Furthermore, the masculine form serves as the generic gender, representing both male and female figures.
Ortín, Silvia; San-Martín, Daniel; Pesquera, Luis; Gutiérrez, José Manuel
An electro-optical system with a delay loop based on semiconductor lasers is investigated for information processing by performing numerical simulations. This system can replace a complex network of many nonlinear elements for the implementation of Reservoir Computing. We show that a single nonlinear-delay dynamical system has the basic properties to perform as reservoir: short-term memory and separation property. The computing performance of this system is evaluated for two prediction tasks: Lorenz chaotic time series and nonlinear auto-regressive moving average (NARMA) model. We sweep the parameters of the system to find the best performance. The results achieved for the Lorenz and the NARMA-10 tasks are comparable to those obtained by other machine learning methods.
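The NARMA-10 benchmark used here is a standard recurrence (in the common Atiya-Parlos form); a minimal generator can be sketched as follows. The series length and seed below are arbitrary choices.

```python
import random

def narma10(T, seed=1):
    """Generate the NARMA-10 benchmark series:
    y[t+1] = 0.3*y[t] + 0.05*y[t]*sum(y[t-9..t]) + 1.5*u[t-9]*u[t] + 0.1,
    with inputs u drawn uniformly from [0, 0.5]."""
    rng = random.Random(seed)
    u = [rng.uniform(0.0, 0.5) for _ in range(T)]
    y = [0.0] * T
    for t in range(9, T - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * sum(y[t - 9:t + 1])
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    return u, y

u, y = narma10(500)
```

A reservoir (here, the delayed electro-optical system) is driven with u and a linear readout is trained to predict y; the task is hard because each output depends non-linearly on the last ten inputs and outputs.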
Dr. Filiz Gürder
Nowadays, organizations are required to develop quick and accurate responses to internal and external changes that are gaining momentum. In this context, knowledge-management activities become more important to all organizations. At the same time, Geographic Information Systems (GIS) are becoming more and more common. GIS, which address a broad spectrum of users such as public agencies, local communities, civil-society organizations, the private sector, academia, and personal users, aim to solve location-based problems. GIS are important for acquiring, combining, analyzing and transferring spatial data. The common use of PCs for personal needs, digital geography and improvements in software technologies, as well as the need to make socially acceptable business decisions, have facilitated the development and widespread use of GIS applications. The main purpose of this paper is to discuss the application areas of GIS and their potential contributions to enterprise-wide knowledge-management processes.
During the last ten years, superconducting circuits have passed from being interesting physical devices to becoming contenders for near-future useful and scalable quantum information processing (QIP). Advanced quantum-simulation experiments have been shown with up to nine qubits, while a demonstration of quantum supremacy with fifty qubits is anticipated in just a few years. Quantum supremacy means that the quantum system can no longer be simulated by the most powerful classical supercomputers. Integrated classical-quantum computing systems are already emerging that can be used for software development and experimentation, even via web interfaces. The time is therefore ripe for describing some of the recent development of superconducting devices, systems and applications. The discussion of superconducting qubits and circuits is limited to devices that have proven useful for current or near-future applications; consequently, the centre of interest is practical applications of QIP, such as computation and simulation in physics and chemistry.
Orbán, Levente L; Chartier, Sylvain
Untrained, "flower-naïve" bumblebees display behavioural preferences when presented with visual properties such as colour, symmetry, spatial frequency and others. Two unsupervised neural networks were implemented to understand the extent to which these models capture elements of bumblebees' unlearned visual preferences towards flower-like visual properties. The computational models, which are variants of Independent Component Analysis and Feature-Extracting Bidirectional Associative Memory, use images of test-patterns that are identical to ones used in behavioural studies. Each model works by decomposing images of floral patterns into meaningful underlying factors. We reconstruct the original floral image using the components and compare the quality of the reconstructed image to the original image. Independent Component Analysis matches behavioural results substantially better across several visual properties. These results are interpreted to support a hypothesis that the temporal and energetic costs of information processing by pollinators served as a selective pressure on floral displays: flowers adapted to pollinators' cognitive constraints.
Ben Nanfeng Luo
We propose a compensatory-misfits theory which holds that an “over-fitting” organization structure can compensate for an “under-fitting” structure, thereby reducing the total misfit. In organizations, over-fit occurs when structural features misfit the core contingencies because the structural level is too high to fit the contingencies; under-fit occurs when the structural level is too low. When an under-fit is compensated by an over-fit, the combination can produce performance outcomes that approximate those from fit. Compensation is possible because information processing is a higher-level factor that cuts across the contingencies and structural features that are misfitted to each other. We identify the specific conditions that must be fulfilled for compensation to occur, and we discuss implications for organization-design theory and practice.
Serikawa, Takahiro; Shiozawa, Yu; Ogawa, Hisashi; Takanashi, Naoto; Takeda, Shuntaro; Yoshikawa, Jun-ichi; Furusawa, Akira
We exploit quantum information processing on a traveling wave of light, expecting emancipation from thermal noise, easy coupling to fiber communication, and potentially high operation speed. Although optical memories are technically challenging, there is an alternative approach to applying multi-step operations on traveling light: continuous-variable one-way computation. So far our achievements include the generation of a one-million-mode entangled chain in the time domain, mode engineering of nonlinear resource states, and real-time nonlinear feedforward. Although these are implemented with free-space optics, we are also investigating photonic integration and have performed quantum teleportation with a passive linear waveguide chip as a demonstration of entangling, measurement, and feedforward. We also suggest a loop-based architecture as another model of continuous-variable computing.
Sen, Aditi; Sen, Ujjwal; Ahufinger, Veronica; Briegel, Hans J.; Sanpera, Anna; Lewenstein, Maciej
We study quantum information processing in complex disordered many-body systems that can be implemented using lattices of ultracold atomic gases and trapped ions. We demonstrate, first in the short-range case, the generation of entanglement and the local realization of quantum gates in a disordered magnetic model describing a quantum spin glass. We show that in this case it is possible to achieve quantum-gate fidelities higher than in the classical case. Complex systems with long-range interactions, such as ion chains or dipolar atomic gases, can be used to model neural-network Hamiltonians. For such systems, where both long-range interactions and disorder appear, it is possible to generate long-range bipartite entanglement. We provide an efficient analytical method to calculate the time evolution of a given initial state, which in turn allows us to calculate its quantum correlations.
Cohn, Elizabeth; Larson, Elaine
To critically analyze studies published within the past decade about participants' comprehension of informed consent in clinical research and to identify promising intervention strategies. Integrative review of literature. The Cumulative Index of Nursing and Allied Health Literature (CINAHL), PubMed, and the Cochrane Database of Systematic Reviews and Cochrane Central Register of Controlled Trials were searched. Inclusion criteria included studies (a) published between January 1, 1996 and January 1, 2007, (b) designed as descriptive or interventional studies of comprehension of informed consent for clinical research, (c) conducted in nonpsychiatric adult populations who were either patients or volunteer participants, (d) written in English, and (e) published in peer-reviewed journals. Of the 980 studies identified, 319 abstracts were screened, 154 studies were reviewed, and 23 met the inclusion criteria. Thirteen studies (57%) were descriptive, and 10 (43%) were interventional. Interventions tested included simplified written consent documents, multimedia approaches, and the use of a trained professional (consent educator) to assist in the consent process. Collectively, no single intervention strategy was consistently associated with improved comprehension. Studies also varied in regard to the definition of comprehension and the tools used to measure it. Despite increasing regulatory scrutiny, deficiencies still exist in participant comprehension of the research in which they participate, as well as differences in how comprehension is measured and assessed. No single intervention was identified as consistently successful for improving participant comprehension, and results indicated that any successful consent process should at a minimum include various communication modes and is likely to require one-to-one interaction with someone knowledgeable about the study.
Brickman Soderberg, Kathy-Anne; Gemelke, Nathan; Chin Cheng
In this paper, we describe a novel scheme to implement scalable quantum information processing using Li-Cs molecular states to entangle ⁶Li and ¹³³Cs ultracold atoms held in independent optical lattices. The ⁶Li atoms will act as quantum bits to store information, and ¹³³Cs atoms will serve as messenger bits that aid in quantum gate operations and mediate entanglement between distant qubit atoms. Each atomic species is held in a separate optical lattice, and the atoms can be overlapped by translating the lattices with respect to each other. When the messenger and qubit atoms are overlapped, targeted single-spin operations and entangling operations can be performed by coupling the atomic states to a molecular state with radio-frequency pulses. By controlling the frequency and duration of the radio-frequency pulses, entanglement can be either created or swapped between a qubit-messenger pair. We estimate operation fidelities for entangling two distant qubits and discuss the scalability of this scheme and constraints on the optical-lattice lasers. Finally, we demonstrate experimental control of the optical potentials sufficient to translate atoms in the lattice.
Gaebler, John; Tan, Ting; Lin, Yiheng; Bowler, Ryan; Jost, John; Meier, Adam; Knill, Emanuel; Leibfried, Dietrich; Wineland, David; Ion Storage Team
We report recent results on qubit-manipulation techniques for trapped ions towards scalable quantum information processing (QIP). We demonstrate a platform-independent benchmarking protocol for evaluating the performance of Clifford gates, which form a basis for fault-tolerant QIP. We report a demonstration of an entangling-gate scheme proposed by Bermudez et al. [Phys. Rev. A 85, 040302 (2012)] and achieve a fidelity of 0.974(4). This scheme takes advantage of dynamical decoupling, which protects the qubit against dephasing errors. It can be applied directly to magnetic-field-insensitive states, and provides a number of simplifications in experimental implementation compared to some other entangling gates with trapped ions. We also report preliminary results on dissipative creation of entanglement with trapped ions. Creation of an entangled pair does not require discrete logic gates and thus could reduce the level of quantum-coherent control needed for large-scale QIP. Supported by IARPA, ARO contract No. EAO139840, ONR, and the NIST Quantum Information Program.
Flachot, Alban; Gegenfurtner, Karl R
Deep convolutional neural networks are a class of machine-learning algorithms capable of solving non-trivial tasks, such as object recognition, with human-like performance. Little is known about the exact computations that deep neural networks learn, and to what extent these computations are similar to the ones performed by the primate brain. Here, we investigate how color information is processed in the different layers of the AlexNet deep neural network, originally trained on object classification of over 1.2M images of objects in their natural contexts. We found that the color-responsive units in the first layer of AlexNet learned linear features and were broadly tuned to two directions in color space, analogously to what is known of color responsive cells in the primate thalamus. Moreover, these directions are decorrelated and lead to statistically efficient representations, similar to the cardinal directions of the second-stage color mechanisms in primates. We also found, in analogy to the early stages of the primate visual system, that chromatic and achromatic information were segregated in the early layers of the network. Units in the higher layers of AlexNet exhibit on average a lower responsivity for color than units at earlier stages.
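The decorrelated colour directions mentioned above are, in the simplest reading, principal axes of the colour distribution. A toy sketch follows (hand-picked pixel values, power iteration for the leading axis; this is not the paper's analysis of AlexNet weights):

```python
def covariance(samples):
    """3x3 covariance matrix of RGB samples (lists of [r, g, b])."""
    n = len(samples)
    mean = [sum(s[i] for s in samples) / n for i in range(3)]
    return [[sum((s[i] - mean[i]) * (s[j] - mean[j]) for s in samples) / n
             for j in range(3)] for i in range(3)]

def principal_direction(cov, iters=100):
    """Leading eigenvector by power iteration: the direction along
    which the colour signal varies most."""
    v = [1.0, 0.0, 0.0]
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy pixels whose strongest variation is achromatic (r = g = b moves
# together), with a small red-green-like residual.
pixels = [[0.1, 0.1, 0.1], [0.9, 0.9, 0.9],
          [0.5, 0.6, 0.4], [0.5, 0.4, 0.6]]
axis = principal_direction(covariance(pixels))
```

For natural images the same computation yields one achromatic axis and two roughly decorrelated chromatic axes, which is the statistical-efficiency argument behind the cardinal directions discussed above.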
... Information Collection: Comment Request; Technical Processing Requirements for Multifamily Project Mortgage... information: Title of Proposal: Technical Processing Requirements for Multifamily Project Mortgage Insurance... information collection requirement described below will be submitted to the Office of Management and Budget...
... Information Collection: Technical Processing Requirements for Multifamily Project Mortgage Insurance AGENCY: Office of the Chief Information Officer, HUD. ACTION: Correction, notice. SUMMARY: On October 25, 2013 at... Collection Title of Information Collection: Technical Processing Requirements for Multifamily Project...
B.R.J. George (Bert); S. Desmidt (Sebastian)
This study draws on information processing theory to investigate predictors of strategic-decision quality in public organizations. Information processing theory argues that (a) rational planning practices contribute to strategic-decision quality by injecting information into decision
One of the main roles of Information Technology in Knowledge Management programs is to accelerate the speed of knowledge transfer and creation. Knowledge Management tools are intended to help the processes of collecting and organizing the knowledge of groups of individuals in order to make this knowledge available in a shared base. Owing to the breadth of the concept of knowledge, the software market for Knowledge Management seems quite confusing: technology vendors are developing different implementations of Knowledge Management concepts in their software products. Because of the variety and quantity of Knowledge Management tools available on the market, a typology may be a valuable aid to organizations that are looking for answers to specific needs. The objective of this article is to present guidelines that help to design such a typology. Knowledge Management solutions such as intranet systems, Electronic Document Management (EDM), groupware, workflow, artificial-intelligence-based systems, Business Intelligence (BI), knowledge-map systems, innovation support, competitive-intelligence tools and knowledge portals are discussed in terms of their potential contributions to the processes of creating, registering and sharing knowledge. A number of Knowledge Management tools (Lotus Notes, Microsoft Exchange, Business Objects, Aris Toolset, File Net, Gingo, Vigipro, Sopheon) have been examined. The potential of each category of solutions to support the transfer of tacit and/or explicit knowledge and to facilitate the knowledge-conversion spiral in the sense of Nonaka and Takeuchi (1995) is discussed.
Selbmann, H K
The paradigm of problem-oriented assurance of the professional quality of medical care is a kind of "control loop system" consisting of the following 5 steps: routine observation, identification of the problem, analysis of the problem, translation of problem solutions into daily practice, and control as to whether the problem has been solved or eliminated. Medical data processing, which involves documentation, electronic data processing, and statistics, can make substantial contributions, especially to the steps of observation, identification of the problem, and follow-up control. Perinatal data collection, which has already been introduced in 6 Länder of the Federal Republic of Germany, has supplied ample proof of this. These operations were conducted under the heading "internal clinical quality assurance with external aid". The clinics that participated in this programme were given the necessary aid for self-observation (questionnaires, clinical statistics), and they were also given comparative data to help them identify problems (clinical profiles, etc.). It is left entirely to the responsibility of the clinics themselves -- voluntary cooperation and a guarantee of anonymity being a matter of course -- to draw their own conclusions from the collected data and to translate these into everyday clinical practice.
The framework of the safeguards activities of the IAEA is described from the viewpoint of information management. The major sources of information are member-state-supplied information, information obtained by the Agency through its verification activities, and open-source information. Software tools are provided to retrieve and to filter information for storage. The organizational structure of the Agency's information activities and the changing roles of the inspectors are also described. (Yamamoto, A.)
Charles F. Mactutus
One clue regarding the basis of cocaine-induced deficits in attentional processing is provided by clinical findings of changes in infants' startle response, observations buttressed by neurophysiological evidence of alterations in brainstem transmission time. Using the IV route of administration and doses that mimic the peak arterial levels of cocaine use in humans, the present study examined the effects of prenatal cocaine on auditory information processing via tests of the acoustic startle response (ASR), habituation, and prepulse inhibition (PPI) in the offspring. Nulliparous Long-Evans female rats, implanted with an IV access port prior to breeding, were administered saline or 0.5, 1.0, or 3.0 mg/kg/injection of cocaine HCl (COC) from gestation day (GD) 8-20 (1x/day on GD8-14, 2x/day on GD15-20). COC had no significant effects on maternal/litter parameters or growth of the offspring. At 18-20 days of age, one male and one female randomly selected from each litter displayed an increased ASR (>30% for males at 1.0 mg/kg and >30% for females at 3.0 mg/kg). When the animals were reassessed in adulthood (D90-100), a linear dose-response increase was noted in response amplitude. At both test ages, within-session habituation was retarded by prenatal cocaine treatment. Testing the females in diestrus vs. estrus did not alter the results. Prenatal cocaine altered the PPI response function across interstimulus intervals (ISIs) and induced significant sex-dependent changes in response latency. Idazoxan, an alpha2-adrenergic receptor antagonist, significantly enhanced the ASR, but less enhancement was noted with increasing doses of prenatal cocaine. Thus, in utero exposure to cocaine, when delivered via a protocol designed to capture prominent features of recreational usage, causes persistent, if not permanent, alterations in auditory information processing and suggests dysfunction of the central noradrenergic circuitry modulating, if not mediating, these responses.
Many companies use mergers to achieve their growth goals or a targeted technology position. To realise the synergies that justify the merger transaction, an integration of the merged companies is often necessary. Such integration takes place across company business areas (such as finance or sales) and across the layers of management consideration, which are strategy, human resources, organisation, processes, and information technology. In merger integration techniques, there is a significant gap ...
Social Security Administration — The data store houses detail information pertaining to visitors' wait times, visits, calls, and other customer relationship information relating to VIPR and CHIP....
..., Advertising, Consumer information, Marketing agreements, Processed Raspberries, Promotion, Reporting and...-705-FR] RIN 0581-AC79 Processed Raspberry Promotion, Research, and Information Order; Referendum... referendum to determine whether the issuance of the proposed Processed Raspberry Promotion, Research, and...
...: Alternatives Process in Hydropower Licensing AGENCY: Office of the Secretary, Office of Environmental Policy... approval for the collection of information for Alternatives Process in Hydropower Licensing. This... comments should reference Alternatives Process in Hydropower Licensing. FOR FURTHER INFORMATION CONTACT: To...
Paula Regina Dal' Evedove
Introduction: Studies of human cognition represent a relevant perspective in information science, considering the subjective actions of information professionals and the dialogic process that should permeate the activity of subjects dealing with the organization and representation of information. Objective: To explore the cognitive perspective in information science and its new configurations under contemporary information needs, in order to reflect on the knowing process of the information professional through the social reality that permeates information contexts. Methodology: Reflection on theoretical aspects of cognitive development, in order to discuss the implications of the cognitive approach in information science and its evolution in the scope of the representation and processing of information. Results: Research in information science must consider issues of a cognitive and social order that underlie information processing and the knowing process of the information professional, since knowledge structures must be explained from the social context of knowing subjects. Conclusions: There is a need to investigate the knowing process of the information professional from a socio-cognitive approach, seeking new elements for understanding the relationship with information (cognitive manifestations) and its implications in the social dimension.
..., consumer information, advertising, sales promotion, producer information, market development, and product... Raspberry Promotion, Research and Information Program; Request for Extension and Revision of a Currently... National Processed Raspberry Promotion, Research, and Information Program. DATES: Comments on this document...
Chandrasekaran, Bharath; Chan, Alice H. D.; Wong, Patrick C. M.
Human speech is composed of two types of information, related to content (lexical information, i.e., "what" is being said [e.g., words]) and to the speaker (indexical information, i.e., "who" is talking [e.g., voices]). The extent to which lexical versus indexical information is represented separately or integrally in the brain is unresolved. In…
Glazebrook, Cheryl M; Welsh, Timothy N; Tremblay, Luc
Presenting target and non-target information in different modalities influences target localization if the non-target is within the spatiotemporal limits of perceptual integration. When using auditory and visual stimuli, the influence of a visual non-target on auditory target localization is greater than the reverse. It is not known, however, whether or how such perceptual effects extend to goal-directed behaviours. To gain insight into how audio-visual stimuli are integrated for motor tasks, the kinematics of reaching movements towards visual or auditory targets with or without a non-target in the other modality were examined. When present, the simultaneously presented non-target could be spatially coincident with, to the left of, or to the right of the target. Results revealed that auditory non-targets did not influence reaching trajectories towards a visual target, whereas visual non-targets influenced trajectories towards an auditory target. Interestingly, the biases induced by visual non-targets were present early in the trajectory and persisted until movement end. Subsequent experimentation indicated that the magnitude of the biases was equivalent whether participants performed a perceptual or motor task, whereas variability was greater for the motor versus the perceptual tasks. We propose that visually induced trajectory biases were driven by the perceived mislocation of the auditory target, which in turn affected both the movement plan and subsequent control of the movement. Such findings provide further evidence of the dominant role visual information processing plays in encoding spatial locations as well as in planning and executing reaching actions, even when reaching towards auditory targets.
After many decades of flourishing computer science, it is now rather evident that in a world dominated by different kinds of digital information, both applications and people are forced to seek new, innovative structures and forms of data management and organization. Following this blunt observation, researchers in informatics have striven over recent years to tackle the non-unique and still-evolving notion of context, which aids significantly in the data disambiguation process. Motivated by this environment, this work attempts to summarize and organize, in a researcher-friendly tabular manner, important or pioneering related research works deriving from diverse computational intelligence domains. Initially, we discuss the influence of context with respect to traditional low-level multimedia content analysis, search, and retrieval tasks, and then we advance to the fields of overall computational context-awareness and so-called human-generated contextual elements. In an effort to provide meaningful information to fellow researchers, this brief survey focuses on the impact of context in modern and popular computing undertakings of our era. More specifically, we present a short review of visual context modeling methods, followed by a depiction of context-awareness in modern computing. Works dealing with the interpretation of context through human-generated interactions are also discussed herein, as this domain gains an ever-increasing proportion of related research nowadays. We conclude the paper with a short discussion of (i) the motivation behind the included context-type categorization into three main pillars; (ii) the findings and conclusions of the survey for each context category; and (iii) a few brief pieces of advice derived from the survey for both interested developers and fellow researchers.
July 19, 2013. The Office of Inspector General plans to begin preliminary research on the U.S. Environmental Protection Agency’s process for deciding to release information requested under the Freedom of Information Act.
Sondersorg, Anna Christina; Busse, Daniela; Kyereme, Jessica; Rothermel, Markus; Neufang, Gitta; Gisselmann, Günter; Hatt, Hanns; Conrad, Heike
Trigeminal fibers terminate within the facial mucosa and skin and transmit tactile, proprioceptive, chemical, and nociceptive sensations. Trigeminal sensations can arise from the direct stimulation of intraepithelial free nerve endings or indirectly through information transmission from adjacent cells at the peripheral innervation area. For mechanical and thermal cues, communication processes between skin cells and somatosensory neurons have already been suggested. High concentrations of most odors typically provoke trigeminal sensations in vivo but surprisingly fail to activate trigeminal neuron monocultures. This fact favors the hypothesis that epithelial cells may participate in chemodetection and subsequently transmit signals to neighboring trigeminal fibers. Keratinocytes, the major cell type of the epidermis, express various receptors that enable reactions to multiple environmental stimuli. Here, using a co-culture approach, we show for the first time that exposure to odorant chemicals induces chemical communication between human HaCaT keratinocytes and mouse trigeminal neurons. Moreover, a supernatant analysis of stimulated keratinocytes and subsequent blocking experiments with pyridoxalphosphate-6-azophenyl-2′,4′-disulfonate revealed that ATP serves as the mediating transmitter molecule released from skin cells after odor stimulation. We show that the ATP release resulting from Javanol® stimulation of keratinocytes was mediated by pannexins. Consequently, keratinocytes act as chemosensors linking the environment and the trigeminal system via ATP signaling. PMID:24790106
Hybrid approaches to quantum information processing (QIP) aim to capitalize on the strengths of disparate quantum technologies to realize a system whose capabilities exceed those of any single experimental platform. At the University of Wisconsin, we are working toward integration of a fast superconducting quantum processor with a stable, long-lived quantum memory based on trapped neutral atoms. Here we describe the development of a quantum interface between superconducting thin-film cavity circuits and trapped Rydberg atoms, the key technological obstacle to realization of superconductor-atom hybrid QIP. Specific accomplishments to date include development of a theoretical protocol for high-fidelity state transfer between the atom and the cavity; fabrication and characterization of high-Q superconducting cavities with integrated trapping electrodes to enhance zero-point microwave fields at a location remote from the chip surface; and trapping and Rydberg excitation of single atoms within 1 mm of the cavity. We discuss the status of experiments to probe the strong coherent coupling of single Rydberg atoms and the superconducting cavity. Supported by ARO under contract W911NF-16-1-0133.
Levente L Orbán
Untrained, "flower-naïve" bumblebees display behavioural preferences when presented with visual properties such as colour, symmetry, spatial frequency, and others. Two unsupervised neural networks were implemented to understand the extent to which these models capture elements of bumblebees' unlearned visual preferences for flower-like visual properties. The computational models, which are variants of Independent Component Analysis and Feature-Extracting Bidirectional Associative Memory, use images of test patterns identical to those used in behavioural studies. Each model works by decomposing images of floral patterns into meaningful underlying factors. We reconstruct the original floral image using the components and compare the quality of the reconstructed image to the original image. Independent Component Analysis matches behavioural results substantially better across several visual properties. These results are interpreted as supporting the hypothesis that the temporal and energetic costs of information processing by pollinators served as a selective pressure on floral displays: flowers adapted to pollinators' cognitive constraints.
Detecting, investigating and prosecuting cybercrime? Extremely important, but not really the solution for the problem. Prevention is better! The sectors that have joined the Cybercrime Information Exchange have accepted the challenge of ensuring the effectiveness of the (information) security of
Rinderle-Ma, S.; Aalst, van der W.M.P.
Process mining has been proposed as a tool for analyzing business processes based on event logs. Today, most information systems log events and thus provide detailed information about the processes they are supporting. This information can be used for two forms of process
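To make the idea concrete, here is a minimal, self-contained sketch (not from the paper; the log contents and names are invented) of the first step of many process-mining algorithms: extracting the directly-follows relation from an event log.

```python
from collections import Counter

# Toy event log: (case_id, activity) events, assumed already ordered by time.
# The log contents are invented for illustration.
log = [
    ("c1", "register"), ("c1", "check"), ("c1", "approve"),
    ("c2", "register"), ("c2", "check"), ("c2", "reject"),
    ("c3", "register"), ("c3", "approve"),
]

# Group events into per-case traces.
traces = {}
for case, activity in log:
    traces.setdefault(case, []).append(activity)

# Count the directly-follows relation a -> b, the basis of algorithms
# such as the alpha miner and of directly-follows graphs.
df = Counter()
for trace in traces.values():
    for a, b in zip(trace, trace[1:]):
        df[(a, b)] += 1

print(df[("register", "check")])  # 2
```

Real process-mining toolkits build discovery, conformance checking, and enhancement on top of exactly this kind of relation.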
Schrauben, Julie E.
LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…
Zhang Lin; Wu Jun-De; Fei Shao-Ming
In this paper, we characterize the saturation of four universal inequalities in quantum information theory, including a variant version of strong subadditivity inequality for von Neumann entropy, the coherent information inequality, the Holevo quantity, and average entropy inequalities. These results shed new light on quantum information inequalities. (paper)
Over the past three decades, the promises of super-fast quantum computing and secure quantum cryptography have spurred a world-wide interest in quantum information, generating fascinating quantum technologies for coherent manipulation of individual quantum systems. However, the distance of fiber-based quantum communication is limited by intrinsic fiber loss and the degradation of entanglement quality. Moreover, probabilistic single-photon and entanglement sources demand exponentially increasing overheads for scalable quantum information processing. To overcome these problems, we are taking two paths in parallel: quantum repeaters and satellites. We used the decoy-state QKD protocol to close the loophole of imperfect photon sources, and the measurement-device-independent QKD protocol to close the loophole of imperfect photon detectors -- the two main loopholes in quantum cryptography. Based on these techniques, we are now building the world's largest quantum-secure communication backbone, from Beijing to Shanghai, with a distance exceeding 2000 km. Meanwhile, we are developing practically useful quantum repeaters that combine entanglement swapping, entanglement purification, and quantum memory for ultra-long-distance quantum communication. The second line is satellite-based global quantum communication, which takes advantage of the negligible photon loss and decoherence in the atmosphere. We realized teleportation and entanglement distribution over 100 km, and later on a rapidly moving platform. We are also making efforts toward the generation of multiphoton entanglement and its use in the teleportation of multiple properties of a single quantum particle, topological error correction, and quantum algorithms for solving systems of linear equations and machine learning. Finally, I will talk about our recent experiments on quantum simulation with ultracold atoms. On the one hand, by applying an optical Raman lattice technique, we realized a two-dimensional spin-orbit (SO
Weisang, C. (Asea Brown Boveri AG, Heidelberg (Germany). Konzernforschungszentrum)
Through the application of specialized systems, future-oriented information processing integrates knowledge of processes, control systems, process-control strategies, user behaviour, and ergonomics. Improvements in process control can be attained, inter alia, by preparing the information content (e.g., by suppressing the raw flow of signals and replacing it with signals based on substance) and by an ergonomic representation of the process display. (orig.)
Agrawal, Prathima; Hyden, Eoin; Krzyzanowsji, Paul; Srivastava, Mani B.; Trotter, John
Anytime anywhere wireless access to databases, such as medical and inventory records, can simplify workflow management in a business, and reduce or even eliminate the cost of moving paper documents. Moreover, continual progress in wireless access technology promises to provide per-user bandwidths of the order of a few Mbps, at least in indoor environments. When combined with the emerging high-speed integrated-service wired networks, it enables ubiquitous and tetherless access to and processing of multimedia information by mobile users. To leverage this synergy, an indoor wireless network based on room-sized cells and multimedia mobile end-points is being developed at AT&T Bell Laboratories. This research network, called SWAN (Seamless Wireless ATM Networking), allows users carrying multimedia end-points such as PDAs, laptops, and portable multimedia terminals to seamlessly roam while accessing multimedia data streams from the wired backbone network. A distinguishing feature of the SWAN network is its use of end-to-end ATM connectivity as opposed to the connectionless mobile-IP connectivity used by present-day wireless data LANs. This choice allows the wireless resource in a cell to be intelligently allocated amongst various ATM virtual circuits according to their quality-of-service requirements. But an efficient implementation of ATM in a wireless environment requires a proper mobile network architecture. In particular, the wireless link and medium-access layers need to be cognizant of the ATM traffic, while the ATM layers need to be cognizant of the mobility enabled by the wireless layers. This paper presents an overview of SWAN's network architecture, briefly discusses the issues in making ATM mobile and wireless, and describes initial multimedia applications for SWAN.
Kasuga, Shoko; Hirashima, Masaya; Nozaki, Daichi
The proper association between planned and executed movements is crucial for motor learning because the discrepancies between them drive such learning. Our study explored how this association was determined when a single action caused the movements of multiple visual objects. Participants reached toward a target by moving a cursor, which represented the right hand's position. Once every five to six normal trials, we interleaved either of two kinds of visual perturbation trials: rotation of the cursor by a certain amount (±15°, ±30°, and ±45°) around the starting position (single-cursor condition) or rotation of two cursors by different angles (+15° and -45°, 0° and 30°, etc.) that were presented simultaneously (double-cursor condition). We evaluated the aftereffects of each condition in the subsequent trial. The error sensitivity (ratio of the aftereffect to the imposed visual rotation) in the single-cursor trials decayed with the amount of rotation, indicating that the motor learning system relied to a greater extent on smaller errors. In the double-cursor trials, we obtained a coefficient that represented the degree to which each of the visual rotations contributed to the aftereffects based on the assumption that the observed aftereffects were a result of the weighted summation of the influences of the imposed visual rotations. The decaying pattern according to the amount of rotation was maintained in the coefficient of each imposed visual rotation in the double-cursor trials, but the value was reduced to approximately 40% of the corresponding error sensitivity in the single-cursor trials. We also found a further reduction of the coefficients when three distinct cursors were presented (e.g., -15°, 15°, and 30°). These results indicated that the motor learning system utilized multiple sources of visual error information simultaneously to correct subsequent movement and that a certain averaging mechanism might be at work in the utilization process.
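The weighted-summation account described above can be sketched in a few lines. This is an illustrative model only: the sensitivity values, the decay thresholds, and the roughly 40% scaling are invented stand-ins for the paper's fitted quantities.

```python
# Sketch of the weighted-summation account of visuomotor aftereffects.
# All numeric values here are illustrative assumptions, not fitted data.

def sensitivity(rotation_deg):
    """Error sensitivity that decays with the size of the visual rotation."""
    mag = abs(rotation_deg)
    return 0.30 if mag <= 15 else 0.20 if mag <= 30 else 0.10

def aftereffect_single(rotation_deg):
    """Single-cursor trial: aftereffect = sensitivity * imposed rotation."""
    return sensitivity(rotation_deg) * rotation_deg

def aftereffect_double(rot1, rot2, scale=0.4):
    """Double-cursor trial: weighted sum of both rotations, with each
    contribution scaled down (the abstract reports roughly 40% of the
    corresponding single-cursor sensitivity)."""
    return scale * (sensitivity(rot1) * rot1 + sensitivity(rot2) * rot2)

print(aftereffect_single(30))       # 0.20 * 30 = 6.0
print(aftereffect_double(15, -45))  # 0.4 * (0.30*15 + 0.10*(-45)) = 0.0
```

Under these assumed numbers, a +15°/-45° double-cursor trial produces almost no net aftereffect, because the two weighted error signals cancel.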
Reimsbach, D.; Hahn, R.; Gürtürk, A.
Sustainability-related non-financial information is increasingly deemed value relevant. Against this background, two recent trends in non-financial reporting are frequently discussed: integrated reporting and assurance of sustainability information. Using an established framework of information
... processing seasons. However, very limited information is available in a consolidated location or format about... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Produce Processor Profiles of Fish Processing Plants in Alaska AGENCY: National...
Danz, Mary E.
The Level 4 Mission Sequence Test (MST) was studied to develop strategies and recommendations to facilitate information flow. Recommendations developed as a result of this study include revised format of the Test and Assembly Procedure (TAP) document and a conceptualized software based system to assist in the management of information flow during the MST.
Hayward, Tim; Broady, Judith E.
Presents research on the use of external information in the strategic management of retail banks in the United Kingdom. Explores the organizational role of the environmental analysis department, the character of business environment analysis, and the nature of information used in strategic management and its perceived importance. (Author/AEF)
Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit
Finnish social services include 21 service commissions of social welfare, including Adoption counselling, Income support, Child welfare, Services for immigrants, and Substance abuse care. This paper describes the method used for process modeling in the National Project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target-state processes from the perspective of a national electronic archive, increased interoperability between systems, and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used, and refined during the three years of process modeling in the national project.
Severino, S K; Bucci, W; Creelman, M L
Our scientific tools have rapidly advanced in recent decades. Urine tests and hormone assays allow us to know exactly where a woman is in her menstrual cycle and to document precisely her hormonal rhythms. Sleep-laboratory techniques allow us to know exactly when someone is dreaming so that we can obtain that communication that Freud prized so highly. Furthermore, we now have quantifiable means to measure accessibility to nonverbal mental representations, which derive from important advances in theory and method in cognitive psychology in the last several decades. None of the studies we surveyed combined these tools. The sleep-laboratory studies did not document menstrual-cycle phase with either temperature or hormone levels. Moreover, the relationship between their findings and daily functioning is still unclear. The psychoanalytic study by Benedek and Rubenstein carefully documented cycle phase, but statements about fantasy and conflict were large and sweeping and the focus was on drive-related rather than information processing effects. Careful work must be done by modern investigators before the field of medical psychoanalysis can address the basic questions of mind-body functioning that are at issue here. We have presented one approach to entering the communication network of mind-brain functioning, that is, the application of the dual-code model to dreams, in the context of the influence of hormones across the menstrual cycle. Although prior research has demonstrated cyclical fluctuations of psychodynamic themes in dream content (Baron, 1977; Benedek & Rubenstein, 1939a, b; Hertz & Jensen, 1975; Lewis & Burns, 1975; Swanson & Foulkes, 1968), the existence of a cyclical cognitive pattern as regulated by gonadal function has not previously been explored. While the findings are preliminary and limited, this is the first study to provide evidence that there are psycholinguistic styles characteristic of different phases in the menstrual cycle, and that this
Miller, George A.
Capacity limitations in absolute judgment tasks are discussed in relation to information theory. Information theory can provide a quantitative way of resolving questions about limitations on the amount of information we can receive and the process of recoding. (SLD)
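The information-theoretic bookkeeping behind such capacity estimates is straightforward. As a hedged illustration (the confusion matrix below is invented, not Miller's data), the information transmitted in an absolute-judgment task is the mutual information between stimulus and response:

```python
import math

# Invented stimulus-response confusion matrix (counts) for a 4-category
# absolute-judgment task; rows = stimuli presented, columns = responses.
counts = [
    [20,  5,  0,  0],
    [ 5, 15,  5,  0],
    [ 0,  5, 15,  5],
    [ 0,  0,  5, 20],
]

total = sum(sum(row) for row in counts)
p_xy = [[c / total for c in row] for row in counts]   # joint distribution
p_x = [sum(row) for row in p_xy]                      # stimulus marginal
p_y = [sum(col) for col in zip(*p_xy)]                # response marginal

# Transmitted information I(X;Y) in bits.
info = sum(
    p * math.log2(p / (p_x[i] * p_y[j]))
    for i, row in enumerate(p_xy)
    for j, p in enumerate(row)
    if p > 0
)
print(round(info, 3))  # less than log2(4) = 2 bits, because of confusions
```

A perfect judge would transmit log2(4) = 2 bits here; confusions between neighbouring categories pull the transmitted information below that ceiling, which is exactly how channel-capacity limits on absolute judgment are quantified.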
Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.
The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and…
Tute, Erik; Steiner, Jochen
The literature describes a large potential for reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support the management and maintenance of the processes extracting, transforming, and loading (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH, and secondary data users, by providing a modeling approach. Methods: an expert survey and a literature review to find requirements and existing modeling techniques; an ETL modeling technique was then developed by extending existing techniques, and evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, and six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated. Seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH, and secondary data users.
Possel, Patrick; Seemann, Simone; Ahrens, Stefanie; Hautzinger, Martin
In Dodge's model of "social information processing" depression is the result of a linear sequence of five stages of information processing ("Annu Rev Psychol" 44: 559-584, 1993). These stages follow a person's reaction to situational stimuli, such that each stage of information processing mediates the relationship between earlier and later stages.…
...-02] Announcing Approval of Federal Information Processing Standard (FIPS) Publication 180-4, Secure... approval of Federal Information Processing Standard (FIPS) Publication 180-4, Secure Hash Standard (SHS... Federal Information Processing Standard (FIPS) Publication 180-4, Secure Hash Standard (SHS). FIPS 180-4...
... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an automated application processing and information retrieval system (APIRS), or the system, means a system of...
... 10 Energy 4 2010-01-01 2010-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in OMB...
Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra
Although reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information to be processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect) and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, the findings suggest that the CTIP is an easy-to-administer and sensitive measure of information processing speed.
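Why a "non-biased measure of variability" matters for RT data can be shown in a few lines. The RT samples below are invented; the point is simply that a single attentional lapse inflates the standard deviation far more than a robust spread measure such as the median absolute deviation:

```python
import statistics

# Invented RT samples (ms) with one outlier, to illustrate why a robust
# spread measure matters when characterizing RT variability.
rts = [420, 435, 410, 445, 430, 415, 440, 900]  # 900 ms = lapse/outlier

sd = statistics.stdev(rts)  # heavily inflated by the single 900 ms trial

# Median absolute deviation (MAD): far less influenced by the outlier.
med = statistics.median(rts)
mad = statistics.median(abs(x - med) for x in rts)

print(round(sd, 1), mad)
```

With these numbers the standard deviation lands well above 100 ms while the MAD stays near the typical trial-to-trial spread, which is why robust estimators are often preferred when comparing RT variability across clinical groups.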
Barato, A C; Seifert, U
So far, feedback-driven systems have been discussed using (i) measurement and control, (ii) a tape interacting with a system, or (iii) an implicit Maxwell demon identified in steady-state transport. We derive the corresponding second laws from one master fluctuation theorem and discuss their relationship. In particular, we show that both the entropy production involving mutual information between system and controller and the one involving a Shannon entropy difference of an information reservoir such as a tape carry an extra term different from the usual current times affinity. We thus generalize stochastic thermodynamics to the presence of an information reservoir.
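For orientation, the two kinds of "second laws" referred to above are often written, in standard textbook form (the paper's exact expressions, including its extra current-times-affinity terms, may differ), as:

```latex
% Hedged sketch, standard forms only.
% Measurement and feedback: the total entropy change is bounded from
% below by minus the mutual information I acquired about the system:
\[
  \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{medium}} \;\ge\; -\,I .
\]
% Information reservoir (tape): the Shannon-entropy change \Delta H of
% the tape enters the balance directly:
\[
  \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{medium}} + \Delta H_{\mathrm{tape}} \;\ge\; 0 .
\]
```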
The aim of this paper is to point out which role the individual plays in the generation of information in social systems. First, it is argued that the individual is a social, self-conscious, creative, reflective, cultural, symbol- and language-using, active natural, producing, labouring, objective, corporeal, living, real, sensuous, visionary, imaginative, designing, co-operative being that makes its own history and can strive towards freedom and autonomy. Based on these assumptions the re-creation/self-organisation of social systems is described as a dialectic of actions and social structures and as a dialectic of individual information and social information. The individual enters economic, political and cultural relationships that result in the emergence and differentiation of social (i.e. economic, political and cultural) information which enables and constrains individual actions and thinking. Individuals as actors in social systems are indispensable for social self-organisation.
Hunn, Bruce P; Schweitzer, Kristin M; Cahir, John A; Finch, Mary M
... intelligence, geospatial analysis cell. The Improved Performance Research Integration Tool (IMPRINT) modeling program was used to understand this process and to assess crew workload during several test scenarios...
Design to Robotic Production (D2RP) establishes links between digital design and production in order to achieve informed materialization at an architectural scale. D2RP research is discussed under the computation, automation and materialization themes, with reference to customizable digital design means, robotic fabrication setups and informed materialization strategies implemented by the Robotic Building group at Hyperbody, TU Delft.
Oberhauser, Roy; Reichert, Manfred
This book provides a state-of-the-art perspective on intelligent process-aware information systems and presents chapters on specific facets and approaches applicable to such systems. Further, it highlights novel advances and developments in various aspects of intelligent process-aware information systems and business process management systems. Intelligence capabilities are increasingly being integrated into or created in many of today’s software products and services. Process-aware information systems provide critical computing infrastructure to support the various processes involved in the creation and delivery of business products and services. Yet the integration of intelligence capabilities into process-aware information systems is a non-trivial yet necessary evolution of these complex systems. The book’s individual chapters address adaptive process management, case management processes, autonomically-capable processes, process-oriented information logistics, process recommendations, reasoning over ...
Losee, Robert M
Information is an important concept that is studied extensively across a range of disciplines, from the physical sciences to genetics to psychology to epistemology. Information continues to increase in importance, and the present age has been referred to as the "Information Age." One may understand information in a variety of ways. For some, information is found in facts that were previously unknown. For others, a fact must have some economic value to be considered information. Other people emphasize the movement through a communication channel from one location to another when describing in
Samalikova, J.; Kusters, R.J.; Trienekens, J.J.M.; Weijters, A.J.M.M.; Siemons, P.
A critical problem in software development is the monitoring, control and improvement in the processes of software developers. Software processes are often not explicitly modeled, and manuals to support the development work contain abstract guidelines and procedures. Consequently, there are huge
Brandt, Charlotte J.
, and despite the apparent reason to come to terms with IOIS, the utilization rate is still low. Adoption of IOIS is an interesting process to study, because of the high complexity in successful adoption of IOIS created by the increased number of organizations involved in the adoption process, and because...
Nigam, Aastha; Dambanemuya, Henry K; Joshi, Madhav; Chawla, Nitesh V
Peace processes are complex, protracted, and contentious involving significant bargaining and compromising among various societal and political stakeholders. In civil war terminations, it is pertinent to measure the pulse of the nation to ensure that the peace process is responsive to citizens' concerns. Social media yields tremendous power as a tool for dialogue, debate, organization, and mobilization, thereby adding more complexity to the peace process. Using Colombia's final peace agreement and national referendum as a case study, we investigate the influence of two important indicators: intergroup polarization and public sentiment toward the peace process. We present a detailed linguistic analysis to detect intergroup polarization and a predictive model that leverages Tweet structure, content, and user-based features to predict public sentiment toward the Colombian peace process. We demonstrate that had proaccord stakeholders leveraged public opinion from social media, the outcome of the Colombian referendum could have been different.
Dragnev Y. V.
The role and value of an informational educational environment in the professional development of future physical education teachers is considered. It is shown that such an environment is characterized by the volume of educational services offered, its capacity, its intensity, and its set of conditions. Higher professional education requires improved use of information technologies and of the software and informational support of the educational process. Modern information technologies are a means of increasing the efficiency of management in all spheres of public activity. Forming an informational culture requires a personally oriented and differentiated approach to the choice of teaching programs. Directions for the use of information technologies in distance learning are identified. Ways of intensifying the educational process are recommended: increasing students' interest in the study of a particular discipline, increasing the volume of independent work, and increasing the density of educational material.
Welser, Jeffrey J.; Bourianoff, George I.; Zhirnov, Victor V.; Cavin, Ralph Keary
Fundamental physical considerations indicate that the scaling of devices that use electron charge as the information carrier will reach fundamental limits within the next one to two decades. The Nanoelectronics Research Initiative (NRI), a joint industry-government program, has been developed to fund university research seeking devices that utilize alternative physical information carriers or non-equilibrium switching mechanisms to continue the historical cost and performance trends of information technology. Three research centers have been established to pursue five research vectors that have been identified as critical to the effort to replace the electronic switch. A brief history and rationale for NRI is given and the projects currently underway are described in the context of the five research vectors.
Marsala, Christophe; Rifqi, Maria; Yager, Ronald R
Intelligent systems are necessary to handle modern computer-based technologies managing information and knowledge. This book discusses the theories required to help provide solutions to difficult problems in the construction of intelligent systems. Particular attention is paid to situations in which the available information and data may be imprecise, uncertain, incomplete or of a linguistic nature. The main aspects of clustering, classification, summarization, decision making and systems modeling are also addressed. Topics covered in the book include fundamental issues in uncertainty, the rap
Rasouli, M.; Eshuis, H.; Grefen, P.W.P.J.; Trienekens, J.J.M.; Kusters, R.J.
Competition in today’s globalized markets forces organizations to collaborate within dynamic business networks to provide mass-customized integrated solutions for customers. The collaboration within dynamic business networks necessitates forming dynamic networked business processes (DNBPs).
Mills, Stephen F.; Yeh, Raymond T.; Giroir, Brett P.; Tanik, Murat M.
Integration of networking and data management technologies such as PACS, RIS and HIS into a healthcare enterprise in a clinically acceptable manner is a difficult problem. Data within such a facility are generally managed via a combination of manual hardcopy systems and proprietary, special-purpose data processing systems. Process modeling techniques have been successfully applied to engineering and manufacturing enterprises, but have not generally been applied to service-based enterprises such as healthcare facilities. The use of process modeling techniques can provide guidance for the placement, configuration and usage of PACS and other informatics technologies within the healthcare enterprise, and thus improve the quality of healthcare. Initial process modeling activities conducted within the Pediatric ICU at Children's Medical Center in Dallas, Texas are described. The ongoing development of a full enterprise-level model for the Pediatric ICU is also described.
Boettcher, Kevin L; Levis, Alexander H
... and a response selection stage. The total rate of internal processing of each decisionmaker is constrained by bounded rationality, while the performance of the organization must satisfy specified goals...
Helder, J.C.; Schram, P.H.; Verwey, H.; Meijler, F.L.; Robles de Medina, E.O.
ECG handling in the University Hospital of Utrecht is carried out by a system consisting of acquisition and storage of ECG signals, computer analysis, data management, and storage of readings in a patient database. The last two modules are part of a Hospital Information System (HIS). The modular
In this project, titled "Generation of optical coherent state superpositions for quantum information processing", the goal has been to generate optical cat states. This is a quantum-mechanical superposition of two coherent states with large amplitude. Such a state is...
Gonzalez-Sanchez, A.; Abreu, R.; Gross, H.; Van Gemund, A.
When failures occur during software testing, automated software fault localization helps to diagnose their root causes and identify the defective components of a program to support debugging. Diagnosis is carried out by selecting test cases in such way that their pass or fail information will narrow
This book explains how can be created information extraction (IE) applications that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses. · Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...
Camila Leoni Nascimento
Social networks have changed consumption habits and the ways in which companies and consumers relate to each other, giving rise to a more demanding and better-informed consumer. This paper aims to assess social networks as a source of information for the purchase of goods or services. An exploratory study was carried out using the survey method, with personal interviews based on a questionnaire of closed-ended questions. The non-probabilistic sample comprised 200 individuals from a higher education institution in the São Paulo State hinterland. The survey data were analyzed descriptively. Overall, the results showed the use of social networks as a source of information search, the main motive being practicality. The results corroborate the studies of Kotler and Keller (2006), who state that the consumer seeks information on social networks to help him in the purchase, and of Edelman and Hirshberg (2006), regarding the user's confidence in his friends' opinions. For future work it is recommended to examine what strategies, and in what ways, companies could work in order to provide more detailed data to Internet users, aiming to support them in the decision
patient the false impression that anaesthesia and surgery are ... regard to the information divulged by the anaesthetist during ... attitudes of patients with regard to the existing method of obtaining ... any change in how patients view anaesthetists before and ... specific doctors and their specialities, and a belief that all doctors.
Rinehart, Joseph B; Lee, Tiffany C; Kaneshiro, Kayleigh; Tran, Minh-Ha; Sun, Coral; Kain, Zeev N
As part of an ongoing perioperative surgical home implementation process, we applied a previously published algorithm for creation of a maximum surgical blood order schedule (MSBOS) to our operating rooms. We hypothesized that using the MSBOS we could show a reduction in unnecessary preoperative blood testing and associated costs. Data regarding all surgical cases done at UC Irvine Health's operating rooms from January 1, 2011, to January 1, 2014 were extracted from the anesthesia information management systems (AIMS). After the data were organized into surgical specialties and operative sites, blood order recommendations were generated based on five specific case characteristics of the group. Next, we assessed current ordering practices in comparison to actual blood utilization to identify potential areas of wastage and performed a cost analysis comparing the annual hospital costs from preoperative blood orders if the blood order schedule were to be followed to historical practices. Of the 19,138 patients who were categorized by the MSBOS as needing no blood sample, 2694 (14.0%) had a type and screen (T/S) ordered and 1116 (5.8%) had a type and crossmatch ordered. Of the 6073 procedures where MSBOS recommended only a T/S, 2355 (38.8%) had blood crossmatched. The cost analysis demonstrated an annual reduction in actual hospital costs of $57,335 with the MSBOS compared to historical blood ordering practices. We showed that the algorithm for development of a multispecialty blood order schedule is transferable and yielded reductions in preoperative blood product screening at our institution. © 2016 AABB.
Zendel, Benjamin Rich; Lagrois, Marie-Élaine; Robitaille, Nicolas; Peretz, Isabelle
In normal listeners, the tonal rules of music guide musical expectancy. In a minority of individuals, known as amusics, the processing of tonality is disordered, which results in severe musical deficits. It has been shown that the tonal rules of music are neurally encoded, but not consciously available in amusics. Previous neurophysiological studies have not explicitly controlled the level of attention in tasks where participants ignored the tonal structure of the stimuli. Here, we test whether access to tonal knowledge can be demonstrated in congenital amusia when attention is controlled. Electric brain responses were recorded while asking participants to detect an individually adjusted near-threshold click in a melody. In half the melodies, a note was inserted that violated the tonal rules of music. In a second task, participants were presented with the same melodies but were required to detect the tonal deviation. Both tasks required sustained attention, thus conscious access to the rules of tonality was manipulated. In the click-detection task, the pitch deviants evoked an early right anterior negativity (ERAN) in both groups. In the pitch-detection task, the pitch deviants evoked an ERAN and P600 in controls but not in amusics. These results indicate that pitch regularities are represented in the cortex of amusics, but are not consciously available. Moreover, performing a pitch-judgment task eliminated the ERAN in amusics, suggesting that attending to pitch information interferes with perception of pitch. We propose that an impaired top-down frontotemporal projection is responsible for this disorder. Copyright © 2015 the authors.
Thammasitboon, Satid; Darby, John B; Hair, Amy B; Rose, Karen M; Ward, Mark A; Turner, Teri L; Balmer, Dorene F
The Accreditation Council for Graduate Medical Education requires residency programs to provide curricula for residents to engage in scholarly activities but does not specify particular guidelines for instruction. We propose a Resident Scholarship Program that is framed by the self-determination theory (SDT) and emphasize the process of scholarly activity versus a scholarly product. The authors report on their longitudinal Resident Scholarship Program, which aimed to support psychological needs central to SDT: autonomy, competence, and relatedness. By addressing those needs in program aims and program components, the program may foster residents' intrinsic motivation to learn and to engage in scholarly activity. To this end, residents' engagement in scholarly processes, and changes in perceived autonomy, competence, and relatedness were assessed. Residents engaged in a range of scholarly projects and expressed positive regard for the program. Compared to before residency, residents felt more confident in the process of scholarly activity, as determined by changes in increased perceived autonomy, competence, and relatedness. Scholarly products were accomplished in return for a focus on scholarly process. Based on our experience, and in line with the SDT, supporting residents' autonomy, competence, and relatedness through a process-oriented scholarship program may foster the curiosity, inquisitiveness, and internal motivation to learn that drives scholarly activity and ultimately the production of scholarly products.
Mathews, G.J.; Howard, W.M.; Takahashi, K.; Ward, R.A.
The astrophysical s-process is a sequence of neutron-capture and beta-decay reactions on a slow time scale compared to beta-decay lifetimes near the line of stability. This detailed sequence of neutron capture, continuum and bound-state beta decay, positron decay, and electron-capture reactions that comprise the s-process has been studied for a broad range of astrophysical environments. The results are then compared with the solar-system abundances of heavy elements to determine the range of physical conditions responsible for their nucleosynthesis. The nuclear data needs are extensive but have begun to be precise enough to allow for a consistent interpretation of the astrophysical site for the s-process.
Cash, Philip; Gonçalves, Milene
Core elements of design work include the development of problem/solution understanding, as well as information and knowledge sharing activities. However, their interrelationships have been little explored. As such, this work aims to take the first steps towards a more integrated evaluation... A protocol analysis is used to provide the basis for characterization of different types of coevolutionary transition event. A number of distinct event types are described, and significant differences in information use and team engagement are identified across transition events. Bringing these findings together, we propose a unitary model of the interaction between activity and understanding around co-evolutionary transition events. This has a number of implications for future theory building and testing in both design activity and wider design research.
Rothwell, Erin; Wong, Bob; Rose, Nancy C; Anderson, Rebecca; Fedor, Beth; Stark, Louisa A; Botkin, Jeffrey R
A pilot study assessed an electronic informed consent model within a randomized controlled trial (RCT). Participants who were recruited for the parent RCT project were randomly selected and randomized to either an electronic consent group (n = 32) or a simplified paper-based consent group (n = 30). Results from the electronic consent group reported significantly higher understanding of the purpose of the study, alternatives to participation, and who to contact if they had questions or concerns about the study. However, participants in the paper-based control group reported higher mean scores on some survey items. This research suggests that an electronic informed consent presentation may improve participant understanding for some aspects of a research study. © The Author(s) 2014.
Xin, Jie; Cui, Zhiming; Zhang, Shukui; He, Tianxu; Li, Chunhua; Huang, Haojing
Internet of Things (IoT) is regarded as a remarkable development of the modern information technology. There is abundant digital products data on the IoT, linking with multiple types of objects/entities. Those associated entities carry rich information and usually in the form of query records. Therefore, constructing high quality topic hierarchies that can capture the term distribution of each product record enables us to better understand users' search intent and benefits tasks such as taxonomy construction, recommendation systems, and other communications solutions for the future IoT. In this paper, we propose a novel record entity topic model (RETM) for IoT environment that is associated with a set of entities and records and a Gibbs sampling-based algorithm is proposed to learn the model. We conduct extensive experiments on real-world datasets and compare our approach with existing methods to demonstrate the advantage of our approach.
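The abstract does not specify the RETM model or its Gibbs sampler, so the following is only a rough illustration of the general technique it builds on: a minimal collapsed Gibbs sampler for a plain LDA-style topic model over tokenized records. All function names, hyperparameters, and the toy corpus are ours, not the authors'.

```python
import random

def gibbs_lda(docs, n_topics, vocab_size, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Collapsed Gibbs sampling for a plain LDA-style topic model.

    docs: list of records, each a list of integer word ids in [0, vocab_size).
    Returns the topic-word count matrix after sampling.
    """
    rng = random.Random(seed)
    # z[d][i] is the topic assignment of token i in record d
    z = [[rng.randrange(n_topics) for _ in doc] for doc in docs]
    ndk = [[0] * n_topics for _ in docs]               # record-topic counts
    nkw = [[0] * vocab_size for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics                                # tokens per topic
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # remove the token, then resample from the full conditional
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta)
                           / (nk[t] + vocab_size * beta) for t in range(n_topics)]
                r = rng.random() * sum(weights)
                for t, wt in enumerate(weights):
                    r -= wt
                    if r <= 0:
                        k = t
                        break
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return nkw

# Toy corpus: four tiny "records" over a 4-word vocabulary.
docs = [[0, 1, 0, 1, 0], [0, 0, 1, 1], [2, 3, 2, 3, 3], [3, 2, 2]]
topic_word = gibbs_lda(docs, n_topics=2, vocab_size=4)
```

The per-topic word counts in `topic_word` play the role of the learned term distributions; a real system would normalize them and organize topics into a hierarchy.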
Diane K. Yatchmenoff
The importance of trauma-informed care (TIC) is now recognized across most health and human service systems. Providers are calling for concrete examples of what TIC means in practice and how to create more trauma-informed organizations. However, much of the current understanding about implementation rests on principles and values rather than specific recommendations for action. This paper addresses this gap based on observations during the provision of technical assistance over the past decade in fields like mental health and addictions, juvenile justice, child welfare, healthcare, housing, and education. Focusing on the infrastructure for making change (the TIC workgroup), assessment and planning, and the early stages of implementation, the authors discuss barriers and challenges that are commonly encountered, strategies that have proven effective in addressing barriers, and specific action steps that can help sustain momentum for the longer term.
The olfactory system is an attractive model system due to the easy control of sensory input and the experimental accessibility in animal studies. The odorant signals are processed from receptor neurons to a neural network of mitral and granular cells, while various types of nonlinear behaviour can... and equation-free techniques allow for a better reproduction and understanding of recent experimental findings. Talks: Olfaction as a Model System for Sensory-Processing Neural Networks (Jens Midtgaard, University of Copenhagen, Denmark); Nonlinear Effects of Signal Transduction in Olfactory Sensory Neurons...
This paper discusses the extent to which algorithms developed for the processing of textual databases are also applicable to the processing of chemical structure databases, and vice versa. Applications discussed include: an algorithm for distribution sorting that has been applied to the design of screening systems for rapid chemical substructure searching; the use of measures of inter-molecular structural similarity for the analysis of hypertext graphs; a genetic algorithm, developed for calculating term weights in relevance feedback searching, for determining whether a molecule is likely to exhibit biological activity; and the use of data fusion to combine the results of different chemical similarity searches.
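The inter-molecular similarity measures mentioned above are not named in the abstract; a very common choice in chemical information retrieval (and a reasonable stand-in here) is the Tanimoto coefficient computed over fingerprint bits. The fingerprints and molecule names below are invented for illustration.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) coefficient between two fingerprints,
    each represented as the set of 'on' bit positions."""
    if not fp_a and not fp_b:
        return 1.0
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter)

# Hypothetical substructure-screen fingerprints.
mol_query = {1, 4, 7, 9}
database = {"mol_a": {1, 4, 7, 9, 12}, "mol_b": {2, 3, 7}, "mol_c": {1, 4, 9}}

# Rank database molecules by similarity to the query.
ranked = sorted(database, key=lambda m: tanimoto(mol_query, database[m]), reverse=True)
```

The same coefficient applies unchanged to binary term vectors, which is exactly the kind of text/chemistry transfer the paper examines.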
This paper presents the methodology and research results on identification of potential users of the ESABALT system, which is targeted towards improving the situational awareness in the Baltic Sea region. We describe the technique of analysing the stakeholders involved in maritime sector processes, especially in maritime transport processes, while also taking into account their different classification criteria. The resulting list of stakeholders is used to identify system users and their classification into user profiles groups. This study will form the basis for the identification of user requirements of the ESABALT system.
N. V. Katargin
It is possible to describe complex systems, both natural and social, as objects consisting of nonlinearly interdependent elements in a multi-dimensional (phase) space that contains both real and informational components. The motion of the system is determined by the natural growth of entropy and by its decrease through the use of external energy sources and other resources. The association of entropy with the value of objects is examined, as well as its relation to humanitarian concepts: God's Providence, morality, and happiness.
EC OBICI NICOLAE
Decision-making takes place at all levels of the organization, taking into account both the short-term outlook and the long-term perspective. Plans are implemented through decisions whose purpose is realized by formulating rational conclusions obtained as a result of financial and quantitative analysis. Thus, managerial accounting practice is deeply involved in decision making, a basic requirement being the existence of a solid managerial cost accounting information system able to provide fundamental data.
Belblidia, L.A.; Carlson, R.W.; Russell, J.L. Jr.
One of the 'lessons learned' from the Three Mile Island accident is the need for a validated source of plant-status information in the control room. The use of computer-generated graphics to display the readings of the major plant instrumentation makes it possible to validate signals before they are presented to the reactor operations staff. Current operating philosophies have the operator glance at the gauges and form an impression of the fraction of full scale as the basis for knowledge of current plant conditions. With a computer-based information-display system such as the Safety Parameter Display System (SPDS), operational decisions can instead be based on precise knowledge of the parameters that define the operation of the reactor and auxiliary systems. The principal impact of this system on the operator will be to remove the continuing concern about the validity of the instruments that provide the information governing the operator's decisions. (author)
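The validation step described above can be sketched as a simple redundancy and plausibility check. This is an illustrative assumption, not part of any actual SPDS implementation; the function name, limits, and data are invented.

```python
# Illustrative signal-validation sketch: a reading is accepted only if it
# lies within physical plausibility limits and agrees with redundant channels.
from statistics import median

def validate_channel(readings, low, high, tolerance):
    """Return (validated_value, is_valid) for a set of redundant readings.

    readings  : raw values from redundant sensors measuring one parameter
    low, high : physical plausibility limits for the parameter
    tolerance : maximum allowed deviation from the median of in-range readings
    """
    # Discard readings outside physical limits (e.g., a failed transmitter).
    in_range = [r for r in readings if low <= r <= high]
    if not in_range:
        return None, False
    m = median(in_range)
    # Keep only readings that agree with the consensus value.
    agreeing = [r for r in in_range if abs(r - m) <= tolerance]
    if not agreeing:
        return None, False
    # Report the average of the agreeing channels as the validated value.
    return sum(agreeing) / len(agreeing), True

# Example: three redundant pressure channels, one failed low.
value, ok = validate_channel([15.2, 15.4, 0.0], low=1.0, high=20.0, tolerance=0.5)
```

The operator would then see the single validated value rather than three raw gauges, with the failed channel silently excluded and flagged for maintenance.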
Mayer, Joerg; Schuster, Heinz Georg; Claussen, Jens Christian
The information transfer in the thalamus is blocked dynamically during sleep, in conjunction with the occurrence of spindle waves. In order to describe the dynamic mechanisms which control the sensory transfer of information, it is necessary to have a qualitative model for the response properties of thalamic neurons. As the theoretical understanding of the mechanism remains incomplete, we analyze two modeling approaches for a recent experiment by Le Masson et al. [Nature (London) 417, 854 (2002)] on the thalamocortical loop. We use a conductance based model in order to motivate an extension of the Hindmarsh-Rose model, which mimics experimental observations of Le Masson et al. Typically, thalamic neurons possess two different firing modes, depending on their membrane potential. At depolarized potentials, the cells fire in a single spike mode and relay synaptic inputs in a one-to-one manner to the cortex. If the cell gets hyperpolarized, T-type calcium currents generate burst-mode firing which leads to a decrease in the spike transfer. In thalamocortical circuits, the cell membrane gets hyperpolarized by recurrent inhibitory feedback loops. In the case of reciprocally coupled excitatory and inhibitory neurons, inhibitory feedback leads to metastable self-sustained oscillations, which mask the incoming input, and thereby reduce the information transfer significantly.
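For reference, the standard (unextended) Hindmarsh-Rose model that the abstract builds on can be sketched as follows; the parameter values are common textbook choices and the function name is invented here, not taken from the authors' extension.

```python
# Minimal forward-Euler integration of the standard Hindmarsh-Rose neuron
# model. With the external current I in the bursting regime, the membrane
# variable x alternates between quiescence and spike bursts, illustrating
# the two firing modes discussed in the abstract.

def hindmarsh_rose(I=3.0, dt=0.01, steps=20000,
                   a=1.0, b=3.0, c=1.0, d=5.0,
                   r=0.006, s=4.0, x_rest=-1.6):
    x, y, z = -1.0, 0.0, 0.0   # membrane potential, fast recovery, slow adaptation
    trace = []
    for _ in range(steps):
        dx = y - a * x**3 + b * x**2 - z + I   # fast voltage dynamics
        dy = c - d * x**2 - y                  # fast recovery current
        dz = r * (s * (x - x_rest) - z)        # slow adaptation (burst control)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        trace.append(x)
    return trace

trace = hindmarsh_rose()
```

The slow variable z plays a role loosely analogous to the T-type calcium current: it builds up during spiking and eventually terminates each burst.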
Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke
Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors (whether attributes are defined as "best" or "worst," choice task complexity, and attribute ordering) also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed. Copyright © 2017 John Wiley & Sons, Ltd.
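One way such an order bias could be quantified is sketched below; the data, function name, and scoring rule are hypothetical illustrations, not the paper's actual analysis.

```python
# Hypothetical sketch: given fixation sequences recorded as lists of
# attribute-row indices in viewing order, a top-to-bottom bias shows up as
# first fixations landing disproportionately on the top rows (low indices),
# i.e. a mean first-fixation row well below the table's middle row.

def mean_first_fixation_row(fixation_sequences):
    """Average row index of participants' first fixations."""
    firsts = [seq[0] for seq in fixation_sequences if seq]
    return sum(firsts) / len(firsts)

# Three participants viewing a 4-row attribute table (rows 0 = top .. 3 = bottom);
# an unbiased mean first fixation would be near (4 - 1) / 2 = 1.5.
sequences = [[0, 1, 2, 3], [0, 2, 1, 3], [1, 0, 3, 2]]
bias_score = mean_first_fixation_row(sequences)
```

A score far below the middle row is consistent with the top-to-bottom bias reported in the abstract; left-to-right bias could be scored analogously over column indices.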
Casado, Alba; Palma, Alfonso; Paolieri, Daniela
Three different tasks (word repetition, lexical decision, and gender decision) were designed to explore the impact of the sex clues (sex of the speaker, sex of the addressee) and the type of gender (semantic, arbitrary) on the processing of isolated Spanish gendered words. The findings showed that the grammatical gender feature was accessed when…
Bates, Timothy C.; Rock, Andrew
Raven's matrices and inspection time (IT) were recorded from 56 subjects under five arousal levels. Raven's and IT correlated strongly (r = -0.7) as predicted by processing-speed theories of "g." In line with Eysenck's [Eysenck, H. J. (1967). "The biological basis of personality". Springfield, IL: Thomas] arousal theory of extraversion, there was…
... the initial assessment rate being one cent per pound, which shall be paid to the National Processed... countries and their share of the imports are: Chile (78 percent) and Canada (17 percent). The 1996 Act... found that it is in the national public interest and vital to the welfare of the agricultural economy of...
Full Text Available Individuals who are unaware of a need for information, and/or who have not experienced the information retrieval process while meeting such a need, cannot be part of the information society. Only individuals who are aware that information is essential to problem solving and decision making, who are equipped with information retrieval and utilization skills, and who can integrate those skills into their daily lives can be part of an information society, perform properly in their societal roles, and thus ultimately shape their society. Within this context, this article defines the elements of the information retrieval process, starting with the concept of information, and studies the influence of the information retrieval process on problem solving and decision making.
De Dreu, Carsten K W; Beersma, Bianca; Stroebe, Katherine; Euwema, Martin C.
The authors tested a motivated information-processing model of negotiation: To reach high joint outcomes, negotiators need a deep understanding of the task, which requires them to exchange information and to process new information systematically. All this depends on social motivation and epistemic motivation.
...-01] Announcing DRAFT Revisions to Federal Information Processing Standard (FIPS) 186-3, Digital... Technology (NIST) requests comments on revisions to Federal Information Processing Standard (FIPS) 186-3... 25, 2012. ADDRESSES: Written comments may be sent to: Chief, Computer Security Division, Information...
...-03] NIST Federal Information Processing Standard (FIPS) 140-3 (Second Draft), Security Requirements....'' Authority: Federal Information Processing Standards (FIPS) are issued by the National Institute of Standards... Standards and Technology (NIST) seeks additional comments on specific sections of Federal Information...
... 49 Transportation 6 2010-10-01 2010-10-01 false Information filing; agency processing of filings... HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REPLACEABLE LIGHT SOURCE INFORMATION (Eff. until 12-01-12) § 564.5 Information filing; agency processing of filings. (a) Each manufacturer...
... Information Collection: Comment Request Delegated Processing for Certain 202 Supportive Housing for the.... This Notice also lists the following information: Title of Proposal: Delegated Processing for Certain... proposed information collection requirement described below will be submitted to the Office of Management...