WorldWideScience

Sample records for complex information processing

  1. Information processing in complex networks

    OpenAIRE

    Quax, R.

    2013-01-01

    First results of Rick Quax's research suggest that a combination of information theory, network theory, and statistical mechanics can lead to a promising theory for predicting the behavior of complex networks. Little theory exists so far about the behavior of dynamic units connected in a network, such as neurons in a brain network or genes in a gene regulation network. Quax combines information theory, network theory, and statistical studies and mec...

  2. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Оксана Николаевна Ромашкова

    2014-12-01

    This work concerns the information model of an educational complex comprising several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is also examined, and a matrix management structure is proposed. The basic management information processes of the educational complex are conceptualized.

  3. Quantum-information processing in disordered and complex quantum systems

    International Nuclear Information System (INIS)

    Sen, Aditi; Sen, Ujjwal; Ahufinger, Veronica; Briegel, Hans J.; Sanpera, Anna; Lewenstein, Maciej

    2006-01-01

    We study quantum information processing in complex disordered many-body systems that can be implemented by using lattices of ultracold atomic gases and trapped ions. We demonstrate, first in the short-range case, the generation of entanglement and the local realization of quantum gates in a disordered magnetic model describing a quantum spin glass. We show that in this case it is possible to achieve fidelities of quantum gates higher than in the classical case. Complex systems with long-range interactions, such as ion chains or dipolar atomic gases, can be used to model neural network Hamiltonians. For such systems, where both long-range interactions and disorder appear, it is possible to generate long-range bipartite entanglement. We provide an efficient analytical method to calculate the time evolution of a given initial state, which in turn allows us to calculate its quantum correlations.

  4. The visual illustration of complex process information during abnormal incidents

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Kautto, A.; Norros, L.; Ranta, J.

    1985-01-01

    One of the proposed solutions to the man-process interface problem in nuclear power plants is the integration of a system in the control room that can provide the operator with a display of a minimum set of critical plant parameters defining the safety status of the plant. Such a system was experimentally validated using the Loviisa training simulator during the fall of 1982. The project was a joint effort between Combustion Engineering Inc., the Halden Reactor Project, Imatran Voima Oy, and VTT. Alarm systems are used in nuclear power plants to tell the control room operators that an unexpected change in the plant operation state has occurred. One difficulty in using the alarms for checking the actions of the operator is that the conventional way of realizing alarm systems means that several alarms are active even during normal operation. The coding and representation of alarm information are discussed in the paper. An important trend in control room design is the move away from direct, concrete indication of process parameters towards more abstract/logical representation of information as a basis for plant supervision. Recent advances in computer graphics make it possible that, in the future, visual information will be used to make the essential dynamics of the process more intelligible. A set of criteria for the use of visual information will be necessary. The paper discusses practical aspects of realizing such criteria in the context of a nuclear power plant. The criteria for decomposing process information with respect to the sub-goals of safety and availability, as well as tentative results of the conceptualization of a PWR process, are also discussed.

  5. Exploiting global information in complex network repair processes

    Institute of Scientific and Technical Information of China (English)

    Tianyu WANG; Jun ZHANG; Sebastian WANDELT

    2017-01-01

    The robustness of complex networks has been studied for decades, with a particular focus on network attack. Research on network repair, on the other hand, has been conducted only very recently, given its even higher complexity and the absence of an effective evaluation metric. A recently proposed network repair strategy is self-healing, which aims to repair networks toward larger components at a low cost using only local information. In this paper, we discuss the effectiveness and efficiency of self-healing, which treats network repair as a multi-objective optimization problem and makes its optimality difficult to measure. This leads us to a new network repair evaluation metric. Since the time complexity of its computation is very high, we devise a greedy ranking strategy. Evaluations on both real-world and random networks show the effectiveness of our new metric and repair strategy. Our study contributes to optimal network repair algorithms and provides a gold standard for future studies on network repair.
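
    The greedy ranking idea can be sketched as follows (a hypothetical simplification, not the paper's actual metric or algorithm): repeatedly add the candidate link that most enlarges the largest connected component, within a fixed repair budget.

```python
def largest_component(nodes, edges):
    """Size of the largest connected component (iterative DFS)."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        stack, size = [start], 0
        while stack:
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            size += 1
            stack.extend(adj[n] - seen)
        best = max(best, size)
    return best

def greedy_repair(nodes, edges, candidates, budget):
    """Greedily add the candidate edge that most increases the
    largest component, up to a fixed repair budget."""
    edges, remaining = list(edges), list(candidates)
    for _ in range(budget):
        base = largest_component(nodes, edges)
        gains = [(largest_component(nodes, edges + [e]) - base, e)
                 for e in remaining]
        gain, best_edge = max(gains)
        if gain <= 0:          # no candidate improves connectivity
            break
        edges.append(best_edge)
        remaining.remove(best_edge)
    return edges

# Two fragments {1,2,3} and {4,5}; one repair reconnects them.
nodes = [1, 2, 3, 4, 5]
edges = [(1, 2), (2, 3), (4, 5)]
repaired = greedy_repair(nodes, edges, [(3, 4), (1, 5)], budget=1)
print(largest_component(nodes, repaired))  # -> 5
```

    This is the generic greedy pattern the paper's ranking strategy belongs to; the actual work ranks repairs under a richer cost/benefit metric.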

  6. MAIA - Method for Architecture of Information Applied: methodological construct of information processing in complex contexts

    Directory of Open Access Journals (Sweden)

    Ismael de Moura Costa

    2017-04-01

    Introduction: This paper presents the evolution of MAIA, the Method for Architecture of Information Applied, its structure, the results obtained, and three practical applications. Objective: To propose a methodological construct for the treatment of complex information, distinguishing information spaces and revealing the configurations inherent in those spaces. Methodology: The argument is developed from theoretical research of an analytical nature, using distinction as a way to express concepts. Phenomenology is adopted as the philosophical position, which considers the correlation between Subject↔Object. The research also considers the notion of interpretation as an integrating element for the definition of concepts. With these postulates, the steps to transform the information spaces are formulated. Results: The article shows how the method is structured to process information in its contexts, starting from a succession of evolutionary cycles, divided into moments, which in turn evolve into transformation acts. Conclusions: The article presents possible applications of MAIA not only as a scientific method, but also as a configuration tool for information spaces and as a generator of ontologies. Finally, it gives a brief summary of the analyses made by researchers who have already evaluated the method with respect to these three aspects.

  7. Phenylketonuria and Complex Spatial Visualization: An Analysis of Information Processing.

    Science.gov (United States)

    Brunner, Robert L.; And Others

    1987-01-01

    The study of the ability of 16 early treated phenylketonuric (PKU) patients (ages 6-23 years) to solve complex spatial problems suggested that choice of problem-solving strategy, attention span, and accuracy of mental representation may be affected in PKU patients, despite efforts to maintain well-controlled phenylalanine concentrations in the…

  8. Effects of emotional tone and visual complexity on processing health information in prescription drug advertising.

    Science.gov (United States)

    Norris, Rebecca L; Bailey, Rachel L; Bolls, Paul D; Wise, Kevin R

    2012-01-01

    This experiment explored how the emotional tone and visual complexity of direct-to-consumer (DTC) drug advertisements affect the encoding and storage of specific risk and benefit statements about each of the drugs in question. Results are interpreted under the limited capacity model of motivated mediated message processing framework. Findings suggest that DTC drug ads should be pleasantly toned and high in visual complexity in order to maximize encoding and storage of risk and benefit information.

  9. Further Understanding of Complex Information Processing in Verbal Adolescents and Adults with Autism Spectrum Disorders

    Science.gov (United States)

    Williams, Diane L.; Minshew, Nancy J.; Goldstein, Gerald

    2015-01-01

    More than 20 years ago, Minshew and colleagues proposed the Complex Information Processing model of autism in which the impairment is characterized as a generalized deficit involving multiple modalities and cognitive domains that depend on distributed cortical systems responsible for higher order abilities. Subsequent behavioral work revealed a…

  10. Communication complexity and information complexity

    Science.gov (United States)

    Pankratov, Denis

    Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection of information complexity to the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute the exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result not only strengthens the lower bound on the communication complexity of disjointness by making it more exact, but also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information

  11. Automated complex for information retrieval and processing in the gamma-resonance spectrometry

    International Nuclear Information System (INIS)

    Belogurov, V.N.; Bylinkin, V.A.

    1977-01-01

    A complex for information retrieval and processing in Moessbauer effect spectrometry is described. The complex consists of a set of 4 precision spectrometers and a program system for the computation of Moessbauer effect spectra. A high velocity accuracy of 0.004 mm/s over 6 months of operation is achieved by introducing an additional negative feedback, whose signal is obtained by comparing the time at which the electromagnetic vibrator driving rod passes the middle of the cycle to the half-period of the address register of the multichannel analyser. Information from the 4 spectrometers is analyzed by a single analyser via a commutation unit and an equalizer unit. Descriptions and schemes of the spectrometers, together with the procedure and scheme for calibrating and checking their operation, are given. Also described are the system connecting the spectrometers with the BESM-4 computer and the program complex, including programs for information input, checking, correction, and storage in the computer, calibration of the spectrometer velocity scale, and computation of gamma-resonance spectra. The operating principles of these programs and their block diagrams are given.

  12. Neuropsychological study of FASD in a sample of American Indian children: processing simple versus complex information.

    Science.gov (United States)

    Aragón, Alfredo S; Kalberg, Wendy O; Buckley, David; Barela-Scott, Lindsey M; Tabachnick, Barbara G; May, Philip A

    2008-12-01

    Although a large body of literature exists on cognitive functioning in alcohol-exposed children, it is unclear whether there is a signature neuropsychological profile in children with Fetal Alcohol Spectrum Disorders (FASD). This study assesses cognitive functioning in children with FASD from several American Indian reservations in the Northern Plains States, and it applies a hierarchical model of simple versus complex information processing to further examine cognitive function. We hypothesized that complex tests would discriminate between children with FASD and culturally similar controls, while children with FASD would perform similarly to controls on relatively simple tests. Our sample includes 32 control children and 24 children with a form of FASD [fetal alcohol syndrome (FAS) = 10, partial fetal alcohol syndrome (PFAS) = 14]. The test battery measures general cognitive ability, verbal fluency, executive functioning, memory, and fine-motor skills. Many of the neuropsychological tests produced results consistent with a hierarchical model of simple versus complex processing. The complexity of the tests was determined "a priori" based on the number of cognitive processes involved in them. Multidimensional scaling was used to statistically analyze the accuracy of classifying the neurocognitive tests into a simple versus complex dichotomy. Hierarchical logistic regression models were then used to define the contribution made by complex versus simple tests in predicting the significant differences between children with FASD and controls. Complex test items discriminated better than simple test items. The tests that conformed well to the model were Verbal Fluency, the Progressive Planning Test (PPT), the Lhermitte memory tasks, and the Grooved Pegboard Test (GPT). The FASD-group children, when compared with controls, demonstrated impaired performance on letter fluency, while their performance was similar on category fluency. On the more complex PPT trials (problems 5 to

  13. Efficient physical embedding of topologically complex information processing networks in brains and computer circuits.

    Directory of Open Access Journals (Sweden)

    Danielle S Bassett

    2010-04-01

    Nervous systems are information processing networks that evolved by natural selection, whereas very large scale integrated (VLSI) computer circuits have evolved by commercially driven technology development. Here we follow historic intuition that all physical information processing systems will share key organizational properties, such as modularity, that generally confer adaptivity of function. It has long been observed that modular VLSI circuits demonstrate an isometric scaling relationship between the number of processing elements and the number of connections, known as Rent's rule, which is related to the dimensionality of the circuit's interconnect topology and its logical capacity. We show that human brain structural networks, and the nervous system of the nematode C. elegans, also obey Rent's rule, and exhibit some degree of hierarchical modularity. We further show that the estimated Rent exponent of human brain networks, derived from MRI data, can explain the allometric scaling relations between gray and white matter volumes across a wide range of mammalian species, again suggesting that these principles of nervous system design are highly conserved. For each of these fractal modular networks, the dimensionality of the interconnect topology was greater than the 2 or 3 Euclidean dimensions of the space in which it was embedded. This relatively high complexity entailed extra cost in physical wiring: although all networks were economically or cost-efficiently wired they did not strictly minimize wiring costs. Artificial and biological information processing systems both may evolve to optimize a trade-off between physical cost and topological complexity, resulting in the emergence of homologous principles of economical, fractal and modular design across many different kinds of nervous and computational networks.
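
    Rent's rule relates a block's external connections (terminals) T to its number of processing elements G by a power law, T = t·G^p. The Rent exponent p can be estimated from partition data by a log-log least-squares fit, as in this sketch (the synthetic data and variable names are illustrative, not taken from the paper):

```python
import math

def rent_exponent(blocks):
    """Estimate the Rent exponent p from (gates, terminals) pairs
    by least-squares fit of log T = log t + p * log G."""
    xs = [math.log(g) for g, _ in blocks]
    ys = [math.log(t) for _, t in blocks]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope of the ordinary least-squares regression line.
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic partition data generated with T = 2 * G**0.75.
blocks = [(g, 2 * g ** 0.75) for g in (4, 16, 64, 256, 1024)]
print(round(rent_exponent(blocks), 3))  # -> 0.75
```

    In the paper, the analogous fit is applied to brain networks and VLSI netlists partitioned at multiple scales.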

  14. Automated information and control complex of hydro-gas endogenous mine processes

    Science.gov (United States)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    The automated information and control complex is considered; it is designed to prevent accidents related to the aerological situation in underground workings, to account for individual devices received and handed over, to transmit and display measurement data, and to form preemptive solutions. Examples of the automated workstation of an air-gas control operator using individual devices are given. Statistical characteristics of field data characterizing the aerological situation in the mine are obtained. The studies of these statistical characteristics confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas monitoring points. An adaptive (multivariant) algorithm for processing measurement information from continuous multidimensional quantities and influencing factors has been developed.

  15. Motor dysfunction of complex regional pain syndrome is related to impaired central processing of proprioceptive information.

    Science.gov (United States)

    Bank, Paulina J M; Peper, C Lieke E; Marinus, Johan; Beek, Peter J; van Hilten, Jacobus J

    2013-11-01

    Our understanding of proprioceptive deficits in complex regional pain syndrome (CRPS) and its potential contribution to impaired motor function is still limited. To gain more insight into these issues, we evaluated accuracy and precision of joint position sense over a range of flexion-extension angles of the wrist of the affected and unaffected sides in 25 chronic CRPS patients and in 50 healthy controls. The results revealed proprioceptive impairment at both the patients' affected and unaffected sides, characterized predominantly by overestimation of wrist extension angles. Precision of the position estimates was more prominently reduced at the affected side. Importantly, group differences in proprioceptive performance were observed not only for tests at identical percentages of each individual's range of wrist motion but also when controls were tested at wrist angles that corresponded to those of the patient's affected side. More severe motor impairment of the affected side was associated with poorer proprioceptive performance. Based on additional sensory tests, variations in proprioceptive performance over the range of wrist angles, and comparisons between active and passive displacements, the disturbances of proprioceptive performance most likely resulted from altered processing of afferent (and not efferent) information and its subsequent interpretation in the context of a distorted "body schema." The present results point at a significant role for impaired central processing of proprioceptive information in the motor dysfunction of CRPS and suggest that therapeutic strategies aimed at identification of proprioceptive impairments and their restoration may promote the recovery of motor function in CRPS patients. Copyright © 2013 American Pain Society. Published by Elsevier Inc. All rights reserved.

  16. System model the processing of heterogeneous sensory information in robotized complex

    Science.gov (United States)

    Nikolaev, V.; Titov, V.; Syryamkin, V.

    2018-05-01

    The scope and types of robotic systems consisting of subsystems of the form "heterogeneous sensor data processing subsystem" are analyzed. On the basis of queuing theory, a model is developed that takes into account the uneven intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is obtained to assess the relationship between subsystem performance and the unevenness of the flows. The obtained solution is studied over the range of parameter values of practical interest.
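
    The kind of performance/load relationship such a queuing model captures can be illustrated with the simplest textbook case, an M/M/1 queue (a hypothetical simplification: the paper's model additionally accounts for uneven flow intensity, which M/M/1 does not):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of an M/M/1 queue: utilization rho,
    mean number in system L, and mean time in system W."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate >= service rate")
    rho = arrival_rate / service_rate        # server utilization
    L = rho / (1 - rho)                      # mean jobs in system
    W = 1 / (service_rate - arrival_rate)    # mean sojourn time
    return rho, L, W

# Sensor messages arriving at 8/s into a processor handling 10/s:
rho, L, W = mm1_metrics(8.0, 10.0)
print(round(rho, 2), round(L, 2), round(W, 2))  # -> 0.8 4.0 0.5
```

    Note how L and W blow up as utilization approaches 1; this is the basic sensitivity to load that motivates modeling uneven sensor flows explicitly.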

  17. Unifying Complexity and Information

    Science.gov (United States)

    Ke, Da-Guan

    2013-04-01

    Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a role as fundamental as the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deeper causes of both the conflict and the adaptability remain unclear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of H for random processes, clearly redefining the boundary between different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics, and real-world living organisms.

  18. Clinical Information Systems as the Backbone of a Complex Information Logistics Process: Findings from the Clinical Information Systems Perspective for 2016.

    Science.gov (United States)

    Hackl, W O; Ganslandt, T

    2017-08-01

    Objective: To summarize recent research and to propose a selection of best papers published in 2016 in the field of Clinical Information Systems (CIS). Method: The query used to retrieve the articles for the CIS section of the 2016 edition of the IMIA Yearbook of Medical Informatics was reused. It again aimed at identifying relevant publications in the field of CIS from PubMed and Web of Science and comprised search terms from the Medical Subject Headings (MeSH) catalog as well as additional free text search terms. The retrieved articles were categorized in a multi-pass review carried out by the two section editors. The final selection of candidate papers was then peer-reviewed by Yearbook editors and external reviewers. Based on the review results, the best papers were then chosen at the selection meeting with the IMIA Yearbook editorial board. Text mining, term co-occurrence mapping, and topic modelling techniques were used to get an overview on the content of the retrieved articles. Results: The query was carried out in mid-January 2017, yielding a consolidated result set of 2,190 articles published in 921 different journals. Out of them, 14 papers were nominated as candidate best papers and three of them were finally selected as the best papers of the CIS field. The content analysis of the articles revealed the broad spectrum of topics covered by CIS research. Conclusions: The CIS field is multi-dimensional and complex. It is hard to draw a well-defined outline between CIS and other domains or other sections of the IMIA Yearbook. The trends observed in the previous years are progressing. Clinical information systems are more than just sociotechnical systems for data collection, processing, exchange, presentation, and archiving. They are the backbone of a complex, trans-institutional information logistics process. Georg Thieme Verlag KG Stuttgart.

  19. An information transfer based novel framework for fault root cause tracing of complex electromechanical systems in the processing industry

    Science.gov (United States)

    Wang, Rongxi; Gao, Xu; Gao, Jianmin; Gao, Zhiyong; Kang, Jiani

    2018-02-01

    As one of the most important approaches for analyzing the mechanism of fault pervasion, fault root cause tracing is a powerful and useful tool for detecting the fundamental causes of faults so as to prevent any further propagation and amplification. To address the problems arising from the lack of systematic and comprehensive integration, a novel information transfer-based, data-driven framework for fault root cause tracing of complex electromechanical systems in the processing industry was proposed, taking into consideration the experience and qualitative analysis embodied in conventional fault root cause tracing methods. Firstly, an improved symbolic transfer entropy method was presented to construct a directed-weighted information model of a specific complex electromechanical system based on the information flow. Secondly, considering the feedback mechanisms in complex electromechanical systems, a method for determining the threshold values of weights was developed to explore the disciplines of fault propagation. Lastly, an iterative method was introduced to identify the fault development process. The fault root cause was traced by analyzing the changes in information transfer between the nodes along the fault propagation pathway. An actual fault root cause tracing application on a complex electromechanical system is used to verify the effectiveness of the proposed framework. A unique fault root cause is obtained regardless of the choice of the initial variable. Thus, the proposed framework can be flexibly and effectively used in fault root cause tracing for complex electromechanical systems in the processing industry, and forms the foundation of system vulnerability analysis and condition prediction, as well as other engineering applications.
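
    The symbolic transfer entropy at the heart of such information-flow models can be sketched as a plug-in estimator over ordinal patterns (a minimal textbook version; the paper's improved variant, weighting, and thresholding are not reproduced, and the series below are illustrative):

```python
from collections import Counter
from math import log2

def symbolize(series, order=2):
    """Map each window of `order` samples to its ordinal pattern
    (rank permutation) - the symbolization step of symbolic TE."""
    return [tuple(sorted(range(order), key=lambda i: w[i]))
            for w in zip(*(series[i:] for i in range(order)))]

def transfer_entropy(source, target, order=2):
    """Plug-in estimate of symbolic transfer entropy source -> target:
    sum over (x', x, y) of p(x',x,y) * log2( p(x'|x,y) / p(x'|x) )."""
    x = symbolize(target, order)
    y = symbolize(source, order)
    n = min(len(x), len(y)) - 1
    triples = Counter((x[t + 1], x[t], y[t]) for t in range(n))
    pairs_xy = Counter((x[t], y[t]) for t in range(n))
    pairs_xx = Counter((x[t + 1], x[t]) for t in range(n))
    singles = Counter(x[t] for t in range(n))
    te = 0.0
    for (xn, xt, yt), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(xt, yt)]          # p(x'|x,y)
        p_cond_x = pairs_xx[(xn, xt)] / singles[xt]  # p(x'|x)
        te += p_joint * log2(p_cond_xy / p_cond_x)
    return te

# y drives x with one step of lag, so TE(y -> x) is expected
# to dominate TE(x -> y) on sufficiently long series.
y = [0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0]
x = [0] + y[:-1]  # x is a delayed copy of y
print(round(transfer_entropy(y, x), 3), round(transfer_entropy(x, y), 3))
```

    In the paper's framework, pairwise estimates like these become the edge weights of the directed-weighted information model over all monitored process variables.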

  20. A business process modeling experience in a complex information system re-engineering.

    Science.gov (United States)

    Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis

    2013-01-01

    This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.

  1. Information Interaction Criteria Among Students in Process of Task-Based Information Searching (Role of Objective Complexity and Type of Product)

    Directory of Open Access Journals (Sweden)

    Marziyeh Saeedizadeh

    2016-08-01

    Purpose: Human-information interactions must be considered in order to design interactive Information Retrieval Systems (IRS). In this regard, the study of users' interactions must be based on their socio-cultural context (specifically, work tasks). Accordingly, this paper aims to explore the information-interaction criteria used by students in the information searching process according to different kinds of work tasks. Methodology: This applied research uses a qualitative, exploratory method. The research population consisted of MSc students of Ferdowsi University of Mashhad enrolled in the 2012-13 academic year. In 3 stages of sampling (random stratified, quota, and voluntary sampling), 30 cases were selected. Each of these cases searched 6 different types of simulated work tasks. Interaction criteria were extracted through content analysis of think-aloud reports. The validity of the tools was verified by faculty members of KIS at Ferdowsi University of Mashhad. A Krippendorff's alpha of 0.78, based on inter-coder agreement, indicates the dependability of the content analysis. Findings: The findings show that in addition to the 'topic' criterion, other interaction criteria affect users' information interaction, such as 'search results ranking', 'domain knowledge of user', 'layout', and 'type of information resource'. Depending on the level of objective complexity and the product of work tasks, the information-interaction criteria change. Conclusion: Users pay attention to different information-interaction criteria in the information searching process, given the variety of work tasks (level of objective complexity and product). It is therefore necessary to pay attention to work task characteristics in order to design interactive and personalized IR systems.

  2. Use of Information Intelligent Components for the Analysis of Complex Processes of Marine Energy Systems

    Directory of Open Access Journals (Sweden)

    Chernyi Sergei

    2016-09-01

    Synchronous motors and their modifications (e.g., the AC converter-fed motor) make it possible to develop low-noise, reliable, and economically efficient electric drive systems. The construction of up-to-date systems based on synchronous machines is impossible without computing software incorporating mathematical and computational simulation. In turn, modelling of synchronous machines is, as a rule, based on the Park-Gorev equations, whose application requires the adoption of a series of assumptions. In a number of cases these assumptions do not permit obtaining adequate simulation results that coincide with the results of field experiments on the systems under review. Moreover, when applying the Park-Gorev equations to study systems involving the interaction of synchronous machines with semiconductor converters of electric energy, it is necessary to simulate the formation of their control signals in the frequency domain. If the states of the converter's switches are determined not only by the control pulses but also by the state of the currents of the synchronous machines flowing through them, such an approach is not reasonable.

  3. The value of mechanistic biophysical information for systems-level understanding of complex biological processes such as cytokinesis.

    Science.gov (United States)

    Pollard, Thomas D

    2014-12-02

    This review illustrates the value of quantitative information including concentrations, kinetic constants and equilibrium constants in modeling and simulating complex biological processes. Although much has been learned about some biological systems without these parameter values, they greatly strengthen mechanistic accounts of dynamical systems. The analysis of muscle contraction is a classic example of the value of combining an inventory of the molecules, atomic structures of the molecules, kinetic constants for the reactions, reconstitutions with purified proteins and theoretical modeling to account for the contraction of whole muscles. A similar strategy is now being used to understand the mechanism of cytokinesis using fission yeast as a favorable model system. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  4. The Process of Handling an Excess of Complex and Interdisciplinary Information in a Decision Support Research Situation

    Directory of Open Access Journals (Sweden)

    Fredrik Moltu Johnsen

    2017-06-01

    Researchers are sometimes expected to investigate a complex and interdisciplinary subject matter in order to provide scientific support for large-scale decisions. This may prove challenging: typically, a lack of cohesion between the pieces of information investigated in the starting phase may cause confusion. This article suggests one possible road out of this problem, which may lead to holistic understanding and then to the communication and implementation of this understanding. The process is presented as a diagram, and selected aspects of it are analysed. The process involves moving to a higher level of generalisation in order to gain a better overview and potentially invent new concepts, and then moving back to a more detailed level in order to communicate and implement these insights. Potential challenges and roadblocks are identified. The possible conflict between normal science and decision support is briefly investigated; it is pointed out that "post-normal science" may be a more appropriate description of such processes than simply "science".

  5. Processing of spatial and non-spatial information in rats with lesions of the medial and lateral entorhinal cortex: Environmental complexity matters.

    Science.gov (United States)

    Rodo, Christophe; Sargolini, Francesca; Save, Etienne

    2017-03-01

    The entorhinal-hippocampal circuitry has been suggested to play an important role in episodic memory, but the contribution of the entorhinal cortex remains elusive. Predominant theories propose that the medial entorhinal cortex (MEC) processes spatial information whereas the lateral entorhinal cortex (LEC) processes non-spatial information. A recent study using an object exploration task has suggested that the involvement of the MEC and LEC in spatial and non-spatial information processing could be modulated by the amount of information to be processed, i.e. environmental complexity. To address this hypothesis we used an object exploration task in which rats with excitotoxic lesions of the MEC and LEC had to detect spatial and non-spatial novelty among a set of objects, and we varied environmental complexity by decreasing the number of objects or the amount of object diversity. Reducing diversity restored the ability to process spatial and non-spatial information in the MEC and LEC groups, respectively. Reducing the number of objects restored the ability to process non-spatial information in the LEC group but not the ability to process spatial information in the MEC group. The findings indicate that the MEC and LEC are not strictly necessary for spatial and non-spatial processing but that their involvement depends on the complexity of the information to be processed. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Communication Analysis of Information Complexes.

    Science.gov (United States)

    Malik, M. F.

    Communication analysis is a tool for perceptual assessment of existing or projected information complexes, i.e., an established reality perceived by one or many humans. An information complex could be of a physical nature, such as a building, landscape, city street; or of a pure informational nature, such as a film, television program,…

  7. Complexity in Evolutionary Processes

    International Nuclear Information System (INIS)

    Schuster, P.

    2010-01-01

    Darwin's principle of evolution by natural selection is readily cast into a mathematical formalism. Molecular biology revealed the mechanism of mutation and provides the basis for a kinetic theory of evolution that models correct reproduction and mutation as parallel chemical reaction channels. A result of the kinetic theory is the existence of a phase transition in evolution occurring at a critical mutation rate, which represents a localization threshold for the population in sequence space. Occurrence and nature of such phase transitions depend critically on fitness landscapes. The fitness landscape, a mapping from sequence (genotype) space into phenotype space, is identified as the true source of complexity in evolution. Modeling evolution as a stochastic process is discussed, and neutrality with respect to selection is shown to provide a major challenge for understanding evolutionary processes (author)
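
The localization threshold described above can be illustrated with a minimal calculation. The sketch below is not from the paper: it uses the standard single-peak-landscape approximation of Eigen's quasispecies theory, neglecting back-mutation, to show the stationary frequency of the fittest "master" sequence collapsing to zero once the per-site copying accuracy drops below a critical value.

```python
def master_frequency(sigma, q, L):
    """Stationary frequency of the master sequence on a single-peak
    fitness landscape (relative master fitness sigma, per-site copying
    accuracy q, sequence length L); back-mutation is neglected."""
    Q = q ** L                       # probability of an error-free copy
    if sigma * Q <= 1.0:             # past the error threshold:
        return 0.0                   # the population delocalizes
    return (sigma * Q - 1.0) / (sigma - 1.0)

# Sweeping q reveals the phase-transition-like loss of localization:
profile = [(q, master_frequency(10.0, q, 50)) for q in (0.999, 0.99, 0.95, 0.9)]
```

For sigma = 10 and L = 50 the threshold sits near q ≈ 0.955: above it the master sequence dominates, below it the population spreads uniformly over sequence space, which is the phase transition the abstract refers to.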

  8. Information-processing genes

    International Nuclear Information System (INIS)

    Tahir Shah, K.

    1995-01-01

    There are an estimated 100,000 genes in the human genome, of which 97% is non-coding. Bacteria, on the other hand, have little or no non-coding DNA. The non-coding region includes introns, ALU sequences, satellite DNA, and other segments not expressed as proteins. Why does it exist? Why has nature kept non-coding DNA over the long evolutionary period if it has no role in the development of complex life forms? Is the complexity of a species somehow correlated with the existence of apparently useless sequences? What kind of capability is encoded within such nucleotide sequences that is a necessary, but not sufficient, condition for the evolution of complex life forms, keeping in mind the C-value paradox and the omnipresence of non-coding segments in higher eukaryotes and also in many archaea and prokaryotes? The physico-chemical description of biological processes is hardware oriented and does not highlight the algorithmic, or information-processing, aspect. However, an algorithm without its hardware implementation is as useless as hardware without the capability to run an algorithm. The nature and type of computation an information-processing hardware can perform depend only on its algorithm and the architecture that reflects the algorithm. Given that enormously difficult tasks such as high-fidelity replication, transcription, editing and regulation are all achieved within a long linear sequence, it is natural to think that some parts of a genome are involved in these tasks. If some complex algorithms are encoded within these parts, then it is natural to think that non-coding regions contain information-processing algorithms. A comparison between well-known automatic sequences and sequences constructed out of motifs found in all species proves the point: non-coding regions are a sort of ''hardwired'' program, i.e., they are linear representations of information-processing machines. 
    Thus in our model, a noncoding region, e.g., an intron, contains a program (or equivalently, it is
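
The "well-known automatic sequences" the abstract compares genomic regions to can be made concrete with the Thue-Morse sequence, the textbook example of a sequence generated by a finite automaton (computed below via the equivalent bit-counting definition). The resemblance to genomic motifs is, of course, the paper's hypothesis, not an established fact.

```python
def thue_morse(n):
    """n-th term of the Thue-Morse sequence: the parity of the number
    of 1-bits in the binary expansion of n (a 2-automatic sequence)."""
    return bin(n).count("1") % 2

# Despite its trivial generating rule, the sequence is aperiodic and
# cube-free: no block ever repeats three times in a row.
prefix = "".join(str(thue_morse(i)) for i in range(16))
```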

  9. Photonic Quantum Information Processing

    International Nuclear Information System (INIS)

    Walther, P.

    2012-01-01

    The advantage of the photon's mobility makes optical quantum systems ideally suited for delegated quantum computation. I will present results on the realization of a measurement-based quantum network in a client-server environment, where quantum information is securely communicated and computed. Related to measurement-based quantum computing, I will discuss a recent experiment showing that quantum discord can be used as a resource for remote state preparation, which might shed new light on the requirements for quantum-enhanced information processing. I will also briefly review recent photonic quantum simulation experiments on four spins with frustrated Heisenberg interactions and present an outlook on feasible simulation experiments with more complex interactions or random-walk structures. Finally, I will discuss the current status of new quantum technology for improving the scalability of photonic quantum systems by using superconducting single-photon detectors and tailored light-matter interactions. (author)

  10. Information geometric methods for complexity

    Science.gov (United States)

    Felice, Domenico; Cafaro, Carlo; Mancini, Stefano

    2018-03-01

    Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds, are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.
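
Two of the IG notions named above, the Kullback-Leibler divergence and the metric tensor, are linked by a standard identity: for nearby parameters the divergence equals one half the Fisher metric times the squared parameter displacement, to leading order. A minimal numerical check for the one-parameter Bernoulli family (my illustration, not the authors' code):

```python
import math

def kl_bernoulli(p, q):
    """Kullback-Leibler divergence D(Bern(p) || Bern(q))."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def fisher_bernoulli(p):
    """Fisher information (the 1-D metric tensor) of the Bernoulli family."""
    return 1.0 / (p * (1.0 - p))

p, dp = 0.3, 1e-3
exact = kl_bernoulli(p, p + dp)
quadratic = 0.5 * fisher_bernoulli(p) * dp ** 2   # D ≈ (1/2) g(p) dp²
```

The agreement of `exact` and `quadratic` for small `dp` is precisely why the scalar curvature and metric components of the parameter manifold carry information about statistical distinguishability, and hence about phase transitions.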

  11. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations using methods for processing symbolic information and information contained in training samples (ranking of objectives by promise, an algorithm for classifying tones and noise). The book will be of interest to specialists in the automated processing of textual information, programming, and pattern recognition.

  12. Mathematical Analysis of Evolution, Information, and Complexity

    CERN Document Server

    Arendt, Wolfgang

    2009-01-01

    Mathematical Analysis of Evolution, Information, and Complexity deals with the analysis of evolution, information and complexity. The time evolution of systems or processes is a central question in science; this text covers a broad range of problems including diffusion processes, neuronal networks, quantum theory and cosmology. Bringing together a wide collection of research in mathematics, information theory, physics and other scientific and technical areas, this new title offers elementary and thus easily accessible introductions to the various fields of research addressed in the book.

  13. The creation of the analytical information system to serve the process of complex decommissioning of nuclear submarines (NSM) and surface ships (SS) with nuclear power installations (NPI)

    International Nuclear Information System (INIS)

    Terentiev, V.G.; Yakovlev, N.E.; Tyurin, A.V.

    2002-01-01

    Management of the decommissioning of nuclear vessels includes information collection, accumulation, systematisation and analysis on the complex utilization of nuclear submarines and surface ships with nuclear power installations and on the treatment of spent nuclear fuel and radioactive wastes. The relevant data on radiation and ecology, science and technology, law and economy, administration and management should be properly processed. The general objective of the analytical information system (AIS) development described in the present paper is to improve the efficiency of nuclear submarine utilization management and decision making. The report considers information provision and functioning principles as well as software/hardware solutions associated with the AIS creation. (author)

  14. Natural Information Processing Systems

    OpenAIRE

    John Sweller; Susan Sweller

    2006-01-01

    Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...

  15. Quantum information processing

    National Research Council Canada - National Science Library

    Leuchs, Gerd; Beth, Thomas

    2003-01-01

    Contents include: 1.5 Simulation of Hamiltonians ...; References ...; 2 Quantum Information Processing and Error Correction with Jump Codes (G. Alber, M. Mussinger...)

  16. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets for both qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  17. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical viewpoints on the skill of scientific information processing and decomposes this skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build up a theoretical framework. Interviews and surveys of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  18. Quantum Information Processing

    CERN Document Server

    Leuchs, Gerd

    2005-01-01

    Quantum processing and communication is emerging as a challenging technique at the beginning of the new millennium. This is an up-to-date insight into the current research on quantum superposition, entanglement, and the quantum measurement process - the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions - written by leading experts - bring together the latest results in quantum information as well as addressing all the relevant questions.

  19. Working memory activation of neural networks in the elderly as a function of information processing phase and task complexity.

    Science.gov (United States)

    Charroud, Céline; Steffener, Jason; Le Bars, Emmanuelle; Deverdun, Jérémy; Bonafe, Alain; Abdennour, Meriem; Portet, Florence; Molino, François; Stern, Yaakov; Ritchie, Karen; Menjot de Champfleur, Nicolas; Akbaraly, Tasnime N

    2015-11-01

    Changes in working memory are sensitive indicators of both normal and pathological brain aging and associated disability. The present study aims to further understanding of working memory in normal aging using a large cohort of healthy elderly in order to examine three separate phases of information processing in relation to changes in task load activation. Using covariance analysis, increasing and decreasing neural activation was observed on fMRI in response to a delayed item recognition task in 337 cognitively healthy elderly persons as part of the CRESCENDO (Cognitive REServe and Clinical ENDOphenotypes) study. During three phases of the task (stimulation, retention, probe), increased activation was observed with increasing task load in bilateral regions of the prefrontal cortex, parietal lobule, cingulate gyrus, insula and in deep gray matter nuclei, suggesting an involvement of central executive and salience networks. Decreased activation associated with increasing task load was observed during the stimulation phase, in bilateral temporal cortex, parietal lobule, cingulate gyrus and prefrontal cortex. This spatial distribution of decreased activation is suggestive of the default mode network. These findings support the hypothesis of an increased activation in salience and central executive networks and a decreased activation in default mode network concomitant to increasing task load. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Complex sound processing during human REM sleep by recovering information from long-term memory as revealed by the mismatch negativity (MMN).

    Science.gov (United States)

    Atienza, M; Cantero, J L

    2001-05-18

    Perceptual learning is thought to be the result of neural changes that take place over a period of several hours or days, allowing information to be transferred to long-term memory. Evidence suggests that contents of long-term memory may improve attentive and pre-attentive sensory processing. Therefore, it is plausible to hypothesize that learning-induced neural changes that develop during wakefulness could improve automatic information processing during human REM sleep. The MMN, an objective measure of the automatic change detection in auditory cortex, was used to evaluate long-term learning effects on pre-attentive processing during wakefulness and REM sleep. When subjects learned to discriminate two complex auditory patterns in wakefulness, an increase in the MMN was obtained in both wake and REM states. The automatic detection of the infrequent complex auditory pattern may therefore be improved in both brain states by reactivating information from long-term memory. These findings suggest that long-term learning-related neural changes are accessible during REM sleep as well.

  1. Epidemic processes in complex networks

    OpenAIRE

    Pastor Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-01-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The t...

  2. PREFACE: Quantum information processing

    Science.gov (United States)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  3. Information communication on complex networks

    International Nuclear Information System (INIS)

    Igarashi, Akito; Kawamoto, Hiroki; Maruyama, Takahiro; Morioka, Atsushi; Naganuma, Yuki

    2013-01-01

    Since communication networks such as the Internet, which is regarded as a complex network, have recently grown to a huge scale and a lot of data pass through them, the improvement of packet routing strategies for transport is one of the most significant themes in the study of computer networks. It is especially important to find routing strategies which can bear as much traffic as possible without congestion in complex networks. First, using neural networks, we introduce a strategy for packet routing on complex networks, where path lengths and queue lengths in nodes are taken into account within a framework of statistical physics. Secondly, instead of using shortest paths, we propose efficient paths which avoid hubs, nodes with a great many degrees, on scale-free networks with a weight on each node. We improve the heuristic algorithm proposed by Danila et al., which optimizes routing properties step by step by using the information of betweenness, the probability of paths passing through a node among all optimal paths defined according to a rule, and thereby mitigates congestion. We confirm that the new heuristic algorithm balances traffic on networks, achieving minimization of the maximum betweenness in a much smaller number of iteration steps. Finally, we model virus spreading and data transfer on peer-to-peer (P2P) networks. Using a mean-field approximation, we obtain an analytical formulation, emulate virus spreading on the network and compare the results with those of simulation. Moreover, we investigate the mitigation of information traffic congestion in P2P networks.
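
As a rough illustration of the quantity being minimized, the sketch below computes unweighted node betweenness with Brandes' algorithm. The heuristic of Danila et al. then repeatedly raises the routing weight of the most-between node and recomputes weighted paths; that iterative step is omitted here, and the toy graph is my own, not from the paper.

```python
from collections import deque

def betweenness(adj):
    """Unweighted node betweenness via Brandes' algorithm.
    adj maps each node to a list of neighbours (undirected graph)."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        dist = {s: 0}
        sigma = {v: 0.0 for v in adj}; sigma[s] = 1.0
        preds = {v: [] for v in adj}
        order, queue = [], deque([s])
        while queue:                       # BFS counting shortest paths
            v = queue.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):          # back-propagate dependencies
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

# On a star graph every leaf-to-leaf shortest path crosses the hub, so
# the hub's betweenness dominates: exactly what hub-avoiding routing caps.
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
scores = betweenness(star)
```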

  4. Physics as Information Processing

    International Nuclear Information System (INIS)

    D'Ariano, Giacomo Mauro

    2011-01-01

    I review some recent advances in foundational research at the Pavia QUIT group. The general idea is that there is only Quantum Theory without quantization rules, and the whole of Physics - including space-time and relativity - is emergent from quantum-information processing. And since Quantum Theory itself is axiomatized solely on informational principles, the whole of Physics must be reformulated in information-theoretical terms: this is the "It from bit" of J. A. Wheeler. The review is divided into four parts: a) the informational axiomatization of Quantum Theory; b) how space-time and relativistic covariance emerge from quantum computation; c) the information-theoretical meaning of inertial mass and of ħ, and how the quantum field emerges; d) an observational consequence of the new quantum field theory: a mass-dependent refraction index of vacuum. I will conclude with the research lines that will follow in the immediate future.

  5. Information Processing of Trauma.

    Science.gov (United States)

    Hartman, Carol R.; Burgess, Ann W.

    1993-01-01

    This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

  6. Information services and information processing

    Science.gov (United States)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. The data given do not attempt to detail the scientific, technical, or economic bases for the needs expressed by the users.

  7. Introduction to information processing

    CERN Document Server

    Dietel, Harvey M

    2014-01-01

    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers for the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte

  8. Effective Complexity of Stationary Process Realizations

    Directory of Open Access Journals (Sweden)

    Arleta Szkoła

    2011-06-01

    Full Text Available The concept of the effective complexity of an object as the minimal description length of its regularities was initiated by Gell-Mann and Lloyd. The regularities are modeled by means of ensembles, which are probability distributions on finite binary strings. In our previous paper [1] we proposed a definition of effective complexity in precise terms of algorithmic information theory. Here we investigate the effective complexity of binary strings generated by stationary, in general not computable, processes. We show that under not too strong conditions long typical process realizations are effectively simple. Our results become most transparent in the context of coarse effective complexity, which is a modification of the original notion of effective complexity that needs fewer parameters in its definition. A similar modification of the related concept of sophistication has been suggested by Antunes and Fortnow.
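
Effective complexity itself is uncomputable, but the flavour of "typical realizations are effectively simple" can be conveyed with a compressor as a crude stand-in for description length: a string that is pure regularity compresses to almost nothing, while a typical realization of a fair-coin process leaves a large incompressible residue that counts as randomness, not regularity. This is my illustration only; zlib is a weak proxy for algorithmic information content.

```python
import random
import zlib

def compressed_len(s):
    """Length in bytes of the zlib-compressed string: a crude
    computable upper bound on its description length."""
    return len(zlib.compress(s.encode("ascii"), 9))

random.seed(0)
regular = "01" * 5000                                         # pure regularity
typical = "".join(random.choice("01") for _ in range(10000))  # fair-coin realization
# compressed_len(regular) is tiny, while compressed_len(typical) stays
# near the entropy bound of ~1250 bytes for 10000 fair coin flips:
# almost all of the typical string's description length is incompressible
# noise, i.e. it contributes nothing to effective complexity.
```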

  9. Advanced information processing system

    Science.gov (United States)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  10. Epidemic processes in complex networks

    Science.gov (United States)

    Pastor-Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-07-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The theoretical analysis of epidemic spreading in heterogeneous networks requires the development of novel analytical frameworks, and it has produced results of conceptual and practical relevance. A coherent and comprehensive review of the vast research activity concerning epidemic processes is presented, detailing the successful theoretical approaches as well as making their limits and assumptions clear. Physicists, mathematicians, epidemiologists, computer, and social scientists share a common interest in studying epidemic spreading and rely on similar models for the description of the diffusion of pathogens, knowledge, and innovation. For this reason, while focusing on the main results and the paradigmatic models in infectious disease modeling, the major results concerning generalized social contagion processes are also presented. Finally, the research activity at the forefront in the study of epidemic spreading in coevolving, coupled, and time-varying networks is reported.
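
The paradigmatic model in this literature is the SIS (susceptible-infected-susceptible) process. A minimal synchronous-update simulation on an arbitrary adjacency list might look as follows; the parameter values and the ring topology are illustrative choices of mine, not from the review.

```python
import random

def sis_step(adj, infected, beta, mu, rng):
    """One synchronous step of SIS dynamics on an adjacency list:
    each infected node recovers with probability mu; each susceptible
    node is infected with probability beta per infected neighbour."""
    nxt = set()
    for v in adj:
        if v in infected:
            if rng.random() >= mu:        # fails to recover this step
                nxt.add(v)
        else:
            for w in adj[v]:
                if w in infected and rng.random() < beta:
                    nxt.add(v)
                    break
    return nxt

# Toy run: a ring of 100 nodes seeded with a single infected node.
n = 100
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
rng = random.Random(1)
infected = {0}
for _ in range(200):
    infected = sis_step(ring, infected, beta=0.9, mu=0.05, rng=rng)
prevalence = len(infected) / n            # endemic fraction, if any
```

On heterogeneous (e.g. scale-free) adjacency lists the same loop reproduces the hallmark result the review discusses: hubs sustain the epidemic and drive the effective threshold toward zero.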

  11. THE ELABORATION OF THE OPTIMAL SYNTHESIS ALGORITHM FOR COMPLEX PROCESSING INFORMATION OF THE SPATIAL POSITION OF THE UPPER-AIR RADIOSONDE

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available The article considers the synthesis of an optimal algorithm for the complex processing of navigation signals of the satellite GLONASS/GPS systems relayed from on board an upper-air radiosonde, together with the output data of an upper-air radar, to determine the spatial coordinates of the radiosonde. Upper-air sounding is performed with the technical means of an atmospheric radio-sounding system, comprising the upper-air radiosonde, carried in free flight, and ground support equipment, which includes devices for processing the radiosonde signals and preparing the operational upper-air messages. A peculiarity of the domestic atmospheric radio-sounding system is that the radar measures the slant range to the radiosonde and the antenna viewing angles to determine its azimuth and elevation. The disadvantage of this radar method of radiosonde tracking is the relatively low accuracy of the coordinates obtained and the possible loss of automatic tracking in the angular coordinates. A satellite navigation system based on microwave sensors has clear advantages in terms of efficiency, size, mobility and use on mobile objects, but also significant drawbacks, associated primarily with the geometric factor and with propagation errors of the navigation signal. The article presents a mathematical model of the useful incoherent GLONASS/GPS signals relayed by the upper-air radiosonde and of the interference at the receiver input of the ground-based complex-processing point, as well as mathematical models of the output data of upper-air radars.
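
The radar measurements described, slant range plus two antenna angles, fix the radiosonde position directly. A minimal conversion to local Cartesian coordinates is sketched below under a flat-earth approximation with no refraction correction; the axis conventions (x east, y north, z up, azimuth from north) are my assumption, not the article's.

```python
import math

def radar_to_local(slant_range, azimuth_deg, elevation_deg):
    """Convert radar measurements (slant range, azimuth from north,
    elevation above the horizon) to local Cartesian coordinates
    (x east, y north, z up). Flat earth, no refraction correction."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    ground = slant_range * math.cos(el)   # projection onto the ground plane
    return (ground * math.sin(az),
            ground * math.cos(az),
            slant_range * math.sin(el))
```

Differencing successive fixes over time yields the wind vector, which is the operational point of upper-air sounding; fusing such radar fixes with the relayed GLONASS/GPS solution is what the article's complex-processing algorithm addresses.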

  12. Weather Information Processing

    Science.gov (United States)

    1991-01-01

    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  13. Visual Navigation of Complex Information Spaces

    Directory of Open Access Journals (Sweden)

    Sarah North

    1995-11-01

    Full Text Available The authors lay the foundation for the introduction of visual navigation aids to assist computer users in the direct manipulation of complex information spaces. By exploring present research on scientific data visualisation and making a case for improved information visualisation tools, they introduce the design of an improved information visualisation interface, called Visual-X, utilizing dynamic sliders and incorporating icons with bindable attributes (glyphs). Exploring the improvement that these data visualisations make to a computing environment, the authors conduct an experiment to compare the performance of subjects who use traditional interfaces with that of subjects who use Visual-X. The methodology is presented, and the conclusions reveal that Visual-X appears to be a promising approach to providing users with a navigation tool that does not overload their cognitive processes.

  14. Maximizing information exchange between complex networks

    International Nuclear Information System (INIS)

    West, Bruce J.; Geneston, Elvis L.; Grigolini, Paolo

    2008-01-01

    research overarching all of the traditional scientific disciplines. The transportation networks of planes, highways and railroads; the economic networks of global finance and stock markets; the social networks of terrorism, governments, businesses and churches; the physical networks of telephones, the Internet, earthquakes and global warming and the biological networks of gene regulation, the human body, clusters of neurons and food webs, share a number of apparently universal properties as the networks become increasingly complex. Ubiquitous aspects of such complex networks are the appearance of non-stationary and non-ergodic statistical processes and inverse power-law statistical distributions. Herein we review the traditional dynamical and phase-space methods for modeling such networks as their complexity increases and focus on the limitations of these procedures in explaining complex networks. Of course we will not be able to review the entire nascent field of network science, so we limit ourselves to a review of how certain complexity barriers have been surmounted using newly applied theoretical concepts such as aging, renewal, non-ergodic statistics and the fractional calculus. One emphasis of this review is information transport between complex networks, which requires a fundamental change in perception that we express as a transition from the familiar stochastic resonance to the new concept of complexity matching

  15. Maximizing information exchange between complex networks

    Science.gov (United States)

    West, Bruce J.; Geneston, Elvis L.; Grigolini, Paolo

    2008-10-01

    modern research overarching all of the traditional scientific disciplines. The transportation networks of planes, highways and railroads; the economic networks of global finance and stock markets; the social networks of terrorism, governments, businesses and churches; the physical networks of telephones, the Internet, earthquakes and global warming and the biological networks of gene regulation, the human body, clusters of neurons and food webs, share a number of apparently universal properties as the networks become increasingly complex. Ubiquitous aspects of such complex networks are the appearance of non-stationary and non-ergodic statistical processes and inverse power-law statistical distributions. Herein we review the traditional dynamical and phase-space methods for modeling such networks as their complexity increases and focus on the limitations of these procedures in explaining complex networks. Of course we will not be able to review the entire nascent field of network science, so we limit ourselves to a review of how certain complexity barriers have been surmounted using newly applied theoretical concepts such as aging, renewal, non-ergodic statistics and the fractional calculus. One emphasis of this review is information transport between complex networks, which requires a fundamental change in perception that we express as a transition from the familiar stochastic resonance to the new concept of complexity matching.

  16. Maximizing information exchange between complex networks

    Energy Technology Data Exchange (ETDEWEB)

    West, Bruce J. [Mathematical and Information Science, Army Research Office, Research Triangle Park, NC 27708 (United States); Physics Department, Duke University, Durham, NC 27709 (United States)], E-mail: bwest@nc.rr.com; Geneston, Elvis L. [Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, TX 76203-1427 (United States); Physics Department, La Sierra University, 4500 Riverwalk Parkway, Riverside, CA 92515 (United States); Grigolini, Paolo [Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, TX 76203-1427 (United States); Istituto di Processi Chimico Fisici del CNR, Area della Ricerca di Pisa, Via G. Moruzzi, 56124, Pisa (Italy); Dipartimento di Fisica ' E. Fermi' Universita' di Pisa, Largo Pontecorvo 3, 56127 Pisa (Italy)

    2008-10-15

    modern research overarching all of the traditional scientific disciplines. The transportation networks of planes, highways and railroads; the economic networks of global finance and stock markets; the social networks of terrorism, governments, businesses and churches; the physical networks of telephones, the Internet, earthquakes and global warming and the biological networks of gene regulation, the human body, clusters of neurons and food webs, share a number of apparently universal properties as the networks become increasingly complex. Ubiquitous aspects of such complex networks are the appearance of non-stationary and non-ergodic statistical processes and inverse power-law statistical distributions. Herein we review the traditional dynamical and phase-space methods for modeling such networks as their complexity increases and focus on the limitations of these procedures in explaining complex networks. Of course we will not be able to review the entire nascent field of network science, so we limit ourselves to a review of how certain complexity barriers have been surmounted using newly applied theoretical concepts such as aging, renewal, non-ergodic statistics and the fractional calculus. One emphasis of this review is information transport between complex networks, which requires a fundamental change in perception that we express as a transition from the familiar stochastic resonance to the new concept of complexity matching.

  17. Robust and parliamentary or informal and participative? The pitfalls of decision-making processes in complex procedures; Robust-parlamentarisch oder informell-partizipativ? Die Tuecken der Entscheidungsfindung in komplexen Verfahren

    Energy Technology Data Exchange (ETDEWEB)

    Hocke, Peter [Karlsruher Institut fuer Technologie (KIT), Karlsruhe (Germany). Inst. fuer Technikfolgenabschaetzung und Systemanalyse (ITAS); Smeddinck, Ulrich [Technische Univ. Braunschweig (Germany). Inst. fuer Rechtswissenschaften

    2017-09-01

    The authors discuss the question of whether the site selection decision for a final nuclear waste repository should be made through a parliamentary representative process or through an informal, pragmatic process based on public participation. Within the framework of the German site selection law, possibilities for innovative participation procedures were developed. The pitfalls of decision-making processes in complex procedures are discussed.

  18. Efficiency of cellular information processing

    International Nuclear Information System (INIS)

    Barato, Andre C; Hartich, David; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial. (paper)
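The central bound summarized above can be stated compactly. The notation below is illustrative rather than taken from the paper: X is the external process (ligand concentration), Y the internal process (receptor/network state), and sigma the thermodynamic entropy production rate.

```latex
% Learning rate: the rate at which the internal dynamics reduce the
% conditional Shannon entropy of the external signal given the internal state.
l \equiv -\left.\frac{\mathrm{d}}{\mathrm{d}t}\,H(X_t \mid Y_t)\right|_{Y\text{-dynamics}}
\;\le\; \sigma,
\qquad
\eta \equiv \frac{l}{\sigma} \;\le\; 1 .
```

The ratio eta is the informational efficiency referred to in the abstract: a bacterium in a very slowly changing environment has small eta, dissipating far more than it learns.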

  19. Informational analysis involving application of complex information system

    Science.gov (United States)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied in internal audit work that integrates the accounting field with the information systems field. Technological advancements can improve the work performed by internal audit. Thus we aim to find, in complex information systems, priorities for the internal audit work of a highly important private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative values through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to draw analyses concerning the points that must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we conclude that, starting from these information systems, audit can identify priorities for its work program. Together with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward the attainment of the objectives of the organization.
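The matrix-intersection step described above can be sketched with the standard min operator for fuzzy sets. The membership values, the two-expert framing, and the audit-area interpretation below are hypothetical illustrations, not the study's actual data:

```python
def fuzzy_intersection(a, b):
    """Element-wise minimum (the standard t-norm) of two fuzzy
    membership matrices of equal shape, given as nested lists."""
    return [[min(x, y) for x, y in zip(row_a, row_b)]
            for row_a, row_b in zip(a, b)]

# Hypothetical membership degrees of three audit areas (rows) against two
# strategic criteria (columns), elicited from two interviewees; the
# intersection keeps the degree to which BOTH agree an area is a priority.
expert_a = [[0.9, 0.4], [0.6, 0.8], [0.2, 0.5]]
expert_b = [[0.7, 0.6], [0.9, 0.3], [0.4, 0.4]]
priority = fuzzy_intersection(expert_a, expert_b)
```

Ranking areas by their resulting membership degrees then yields a candidate ordering for the audit work program.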

  20. Atmospheric processes over complex terrain

    Science.gov (United States)

    Banta, Robert M.; Berri, G.; Blumen, William; Carruthers, David J.; Dalu, G. A.; Durran, Dale R.; Egger, Joseph; Garratt, J. R.; Hanna, Steven R.; Hunt, J. C. R.

    1990-06-01

    A workshop on atmospheric processes over complex terrain, sponsored by the American Meteorological Society, was convened in Park City, Utah from 24 to 28 October 1988. The overall objective of the workshop was one of interaction and synthesis--interaction among atmospheric scientists carrying out research on a variety of orographic flow problems, and a synthesis of their results and points of view into an assessment of the current status of topical research problems. The final day of the workshop was devoted to an open discussion on the research directions that could be anticipated in the next decade because of new and planned instrumentation and observational networks, the recent emphasis on development of mesoscale numerical models, and continual theoretical investigations of thermally forced flows, orographic waves, and stratified turbulence. This monograph represents an outgrowth of the Park City Workshop. The authors have contributed chapters based on their lecture material. Workshop discussions indicated interest in both the remote sensing and predictability of orographic flows. These chapters were solicited following the workshop in order to provide a more balanced view of current progress and future directions in research on atmospheric processes over complex terrain.

  1. A new information dimension of complex networks

    International Nuclear Information System (INIS)

    Wei, Daijun; Wei, Bo; Hu, Yong; Zhang, Haixin; Deng, Yong

    2014-01-01

    Highlights: •The proposed measure is more practical than the classical information dimension. •The difference of information for each box in the box-covering algorithm is considered. •Results indicate the measure can capture the fractal property of complex networks. -- Abstract: Fractal and self-similarity properties are revealed in many complex networks. The classical information dimension is an important method for studying the fractal and self-similarity properties of planar networks; however, it is not practical for real complex networks. In this Letter, a new information dimension of complex networks is proposed. The number of nodes in each box is taken into account by using the box-covering algorithm of complex networks. The proposed method is applied to calculate the fractal dimensions of some real networks. Our results show that the proposed method is efficient when dealing with the fractal dimension problem of complex networks.
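To make the box-covering idea concrete, the sketch below covers a graph with boxes of bounded diameter and computes the box entropy I(l) = -sum p_i ln p_i, where p_i is the fraction of nodes in box i. The greedy covering heuristic and the toy path graph are illustrative assumptions, not the authors' exact algorithm:

```python
import math
from collections import deque

def bfs_distances(adj, src):
    """Hop distances from src in an unweighted graph (adjacency dict)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def greedy_box_cover(adj, box_size):
    """Cover all nodes with boxes of diameter < box_size (greedy heuristic)."""
    dist = {u: bfs_distances(adj, u) for u in adj}
    uncovered = set(adj)
    boxes = []
    while uncovered:
        seed = min(uncovered)  # deterministic seed choice
        box = {v for v in uncovered if dist[seed].get(v, math.inf) < box_size}
        boxes.append(box)
        uncovered -= box
    return boxes

def box_information(boxes):
    """Shannon entropy of the node distribution over boxes: -sum p_i ln p_i."""
    n = sum(len(b) for b in boxes)
    return -sum(len(b) / n * math.log(len(b) / n) for b in boxes)

# Toy example: an 8-node path graph.
path = {i: [j for j in (i - 1, i + 1) if 0 <= j < 8] for i in range(8)}
for l in (2, 3, 5):
    print(l, round(box_information(greedy_box_cover(path, l)), 3))
```

Fitting the slope of I(l) against ln(1/l) over a range of box sizes gives the information-dimension estimate; real networks need a better covering heuristic than this greedy sweep.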

  2. A new information dimension of complex networks

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Daijun [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); School of Science, Hubei University for Nationalities, Enshi 445000 (China); Wei, Bo [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Hu, Yong [Institute of Business Intelligence and Knowledge Discovery, Guangdong University of Foreign Studies, Guangzhou 510006 (China); Zhang, Haixin [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Deng, Yong, E-mail: ydeng@swu.edu.cn [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); School of Engineering, Vanderbilt University, TN 37235 (United States)

    2014-03-01

    Highlights: •The proposed measure is more practical than the classical information dimension. •The difference of information for each box in the box-covering algorithm is considered. •Results indicate the measure can capture the fractal property of complex networks. -- Abstract: Fractal and self-similarity properties are revealed in many complex networks. The classical information dimension is an important method for studying the fractal and self-similarity properties of planar networks; however, it is not practical for real complex networks. In this Letter, a new information dimension of complex networks is proposed. The number of nodes in each box is taken into account by using the box-covering algorithm of complex networks. The proposed method is applied to calculate the fractal dimensions of some real networks. Our results show that the proposed method is efficient when dealing with the fractal dimension problem of complex networks.

  3. Information Technology in Complex Health Services

    Science.gov (United States)

    Southon, Frank Charles Gray; Sauer, Chris; Dampney, Christopher Noel Grant (Kit)

    1997-01-01

    Abstract Objective: To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. Design: A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Measurements: Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Results: Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. Conclusion: The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case. PMID:9067877

  4. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  5. Towards an Information Theory of Complex Networks

    CERN Document Server

    Dehmer, Matthias; Mehler, Alexander

    2011-01-01

    For over a decade, complex networks have steadily grown as an important tool across a broad array of academic disciplines, with applications ranging from physics to social media. A tightly organized collection of carefully-selected papers on the subject, Towards an Information Theory of Complex Networks: Statistical Methods and Applications presents theoretical and practical results about information-theoretic and statistical models of complex networks in the natural sciences and humanities. The book's major goal is to advocate and promote a combination of graph-theoretic, information-theoretic

  6. Information Processing Research.

    Science.gov (United States)

    1986-09-01

    business form in which information is entered by filling in blanks, or circling alternatives. The fields of the form correspond to the various pieces...power. Parallelism, rather than raw speed of the computing elements, seems to be the way that the brain gets such jobs done...all intelligent systems. The purpose of this paper is to characterize the weak methods and to explain how and why they arise in

  7. Industrial Information Processing

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

    This paper demonstrates, how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and an automated solution to the “build-to-order...

  8. Integrated Optical Information Processing

    Science.gov (United States)

    1988-08-01

    applications in optical disk memory systems [9]. This device is constructed in a glass/SiO2/Si waveguide. The choice of a Si substrate allows for the...contact mask) were formed in the photoresist deposited on all of the samples, we covered the unwanted gratings on each sample with cover glass slides...processing, let us consider TeO2 (v = 620 m/s) as a potential substrate for applications requiring large time delays. This consideration is despite

  9. Teaching Information Systems Development via Process Variants

    Science.gov (United States)

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  10. Risk and information processing

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The reasons for the current widespread arguments between designers of advanced technological systems like, for instance, nuclear power plants and opponents from the general public concerning levels of acceptable risk may be found in incompatible definitions of risk, in differences in risk perception and criteria for acceptance, etc. Of importance may, however, also be the difficulties met in presenting the basis for risk analysis, such as the conceptual system models applied, in an explicit and credible form. Application of modern information technology for the design of control systems and human-machine interfaces together with the trends towards large centralised industrial installations have made it increasingly difficult to establish an acceptable model framework, in particular considering the role of human errors in major system failures and accidents. Different aspects of this problem are discussed in the paper, and areas are identified where research is needed in order to improve not only the safety of advanced systems, but also the basis for their acceptance by the general public. (author)

  11. Advanced monitoring with complex stream processing

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP) that analyses series of events to derive conclusions from them.
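The kind of time-windowed aggregation a CEP engine such as Esper performs can be illustrated with a minimal hand-rolled monitor. The class name, window semantics, and alert rule below are invented for illustration; they are not Esper's EPL API:

```python
from collections import deque

class SlidingWindowMonitor:
    """Toy CEP-style check: flag when the average event value
    over a trailing time window exceeds a threshold."""

    def __init__(self, window, threshold):
        self.window = window        # window length, in the same units as timestamps
        self.threshold = threshold  # alert when the windowed average exceeds this
        self.events = deque()       # (timestamp, value) pairs, oldest first

    def push(self, ts, value):
        """Ingest one event; return True if the windowed average alerts."""
        self.events.append((ts, value))
        # Evict events that have fallen out of the trailing window.
        while self.events and self.events[0][0] <= ts - self.window:
            self.events.popleft()
        avg = sum(v for _, v in self.events) / len(self.events)
        return avg > self.threshold

monitor = SlidingWindowMonitor(window=30, threshold=100)
alerts = [monitor.push(ts, v) for ts, v in [(0, 50), (10, 200), (45, 60)]]
```

In a real CEP engine the same logic would be declared as a query over an event stream (a time-window plus aggregation), with the engine handling eviction, correlation across streams, and alert delivery.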

  12. Choice Complexity, Benchmarks and Costly Information

    NARCIS (Netherlands)

    Harms, Job; Rosenkranz, S.; Sanders, M.W.J.L.

    In this study we investigate how two types of information interventions, providing a benchmark and providing costly information on option ranking, can improve decision-making in complex choices. In our experiment subjects made a series of incentivized choices between four hypothetical financial

  13. Quantum information processing in nanostructures

    International Nuclear Information System (INIS)

    Reina Estupinan, John-Henry

    2002-01-01

    Since information has been regarded as a physical entity, the field of quantum information theory has blossomed. This brings novel applications, such as quantum computation. The field has attracted the attention of numerous researchers with backgrounds ranging from computer science, mathematics and engineering to the physical sciences. Thus, we now have an interdisciplinary field in which great efforts are being made to build devices that should allow for the processing of information at a quantum level, and also to understand the complex structure of some physical processes at a more basic level. This thesis is devoted to the theoretical study of structures at the nanometre scale, 'nanostructures', through physical processes that mainly involve the solid state and quantum optics, in order to propose reliable schemes for the processing of quantum information. Initially, the main results of quantum information theory and quantum computation are briefly reviewed. Next, the state of the art of quantum dot technology is described. In so doing, the theoretical background and the practicalities required for this thesis are introduced. A discussion of the current quantum hardware used for quantum information processing is given, with particular emphasis on the solid-state proposals to date. A detailed prescription is given, using an optically driven coupled quantum dot system, for reliably preparing and manipulating exciton maximally entangled Bell and Greenberger-Horne-Zeilinger (GHZ) states. Manipulation of the strength and duration of the selective light pulses needed for producing these highly entangled states provides us with crucial elements for the processing of solid-state-based quantum information. The all-optical generation of states of the so-called Bell basis for a system of two quantum dots (QDs) is exploited to perform quantum teleportation of the excitonic state of a dot in an array of three coupled QDs. Theoretical predictions suggest

  14. BRICS and Quantum Information Processing

    DEFF Research Database (Denmark)

    Schmidt, Erik Meineche

    1998-01-01

    BRICS is a research centre and international PhD school in theoretical computer science, based at the University of Aarhus, Denmark. The centre has recently become engaged in quantum information processing in cooperation with the Department of Physics, also University of Aarhus. This extended abstract surveys activities at BRICS with special emphasis on the activities in quantum information processing.

  15. Information Processing and Limited Liability

    OpenAIRE

    Bartosz Mackowiak; Mirko Wiederholt

    2012-01-01

    Decision-makers often face limited liability and thus know that their loss will be bounded. We study how limited liability affects the behavior of an agent who chooses how much information to acquire and process in order to take a good decision. We find that an agent facing limited liability processes less information than an agent with unlimited liability. The informational gap between the two agents is larger in bad times than in good times and when information is more costly to process.

  16. Information accessibility and cryptic processes

    Energy Technology Data Exchange (ETDEWEB)

    Mahoney, John R; Ellison, Christopher J; Crutchfield, James P [Complexity Sciences Center and Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616 (United States)], E-mail: jrmahoney@ucdavis.edu, E-mail: cellison@cse.ucdavis.edu, E-mail: chaos@cse.ucdavis.edu

    2009-09-11

    We give a systematic expansion of the crypticity-a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite cryptic order-the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy-the mutual information between a process's infinite past and infinite future-that is finite and exact for finite-order cryptic processes. (fast track communication)
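The quantities in this abstract are standard in computational mechanics; the notation below follows that literature and is illustrative rather than copied from the paper:

```latex
% Excess entropy: mutual information between the infinite past and the
% infinite future of the observed process.
E = I\bigl[\overleftarrow{X} \,;\, \overrightarrow{X}\bigr]

% Crypticity: the part of the stored state information (the statistical
% complexity C_mu) that is inaccessible from observed sequences.
\chi \equiv C_\mu - E
```

A k-cryptic process is then one whose crypticity is already exact when conditioning on length-k observation blocks; the paper's expansion is finite and exact precisely for such finite-order cryptic processes.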

  17. Recording information on protein complexes in an information management system.

    Science.gov (United States)

    Savitsky, Marc; Diprose, Jonathan M; Morris, Chris; Griffiths, Susanne L; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S; Blake, Richard; Stuart, David I; Esnouf, Robert M

    2011-08-01

    The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein-protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. Information management in process planning

    NARCIS (Netherlands)

    Lutters, Diederick; Wijnker, T.C.; Kals, H.J.J.

    1999-01-01

    A recently proposed reference model indicates the use of structured information as the basis for the control of design and manufacturing processes. The model is used as a basis to describe the integration of design and process planning. A differentiation is made between macro- and micro process planning.

  19. Information processing, computation, and cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  20. Hydrometallurgical processes for mineral complexes

    International Nuclear Information System (INIS)

    Barskij, L.A.; Danil'chenko, L.M.

    1977-01-01

    Requirements for the technology of processing ores, including uranium ores, and the principal stages of working out technological schemes are described in brief. Reference data are given on commercial minerals and ores, including uranium-thorium ores, and their classification with due regard for the physical, chemical and surface properties which form the basis for ore-concentrating processes. Also presented are the classification of minerals, including uranium minerals, by their flotation ability; flotation regimes of minerals; structural-textural characteristics of ores; genetic types of ore formations and their concentrating ability; algorithmization of the a priori evaluation of concentration; and technological diagnostics of ore processing. A classification of ore-concentration techniques is suggested.

  1. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    Science.gov (United States)

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhancing curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes, in which CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex-formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU yielded a complex with the expected chemical and physical fingerprints of a CUR:CD inclusion complex, maintained the chemical integrity and stability of CUR, and provided the highest solubility of CUR in water. Physical and chemical characterization of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex to an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternative, efficient method for drug:CD complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  3. Proprioceptive information processing in schizophrenia

    DEFF Research Database (Denmark)

    Arnfred, Sidse M H

Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event related potentials. Event related EEG can be analyzed as conventional time … of the left somatosensory cortex, and it was suggested to be in accordance with two theories of schizophrenic information processing: the theory of deficiency of corollary discharge and the theory of weakening of the influence of past regularities. No gating deficiency was observed, and the imprecision … and amplitude attenuation was not a general phenomenon across the entire brain response. Summing up, in support of Rado's hypothesis, schizophrenia spectrum patients demonstrated abnormalities in proprioceptive information processing. Future work needs to extend the findings in larger un-medicated, non…

  4. Information, complexity and efficiency: The automobile model

    Energy Technology Data Exchange (ETDEWEB)

Allenby, B. [Lucent Technologies (United States); Lawrence Livermore National Lab., CA (United States)]

    1996-08-08

    The new, rapidly evolving field of industrial ecology - the objective, multidisciplinary study of industrial and economic systems and their linkages with fundamental natural systems - provides strong ground for believing that a more environmentally and economically efficient economy will be more information intensive and complex. Information and intellectual capital will be substituted for the more traditional inputs of materials and energy in producing a desirable, yet sustainable, quality of life. While at this point this remains a strong hypothesis, the evolution of the automobile industry can be used to illustrate how such substitution may, in fact, already be occurring in an environmentally and economically critical sector.

  5. Adaptive Beamforming Based on Complex Quaternion Processes

    Directory of Open Access Journals (Sweden)

    Jian-wu Tao

    2014-01-01

Full Text Available Motivated by the benefits of array signal processing in the quaternion domain, we investigate the problem of adaptive beamforming based on complex quaternion processes in this paper. First, a complex quaternion least-mean squares (CQLMS) algorithm is proposed and its performance is analyzed. The CQLMS algorithm is suitable for adaptive beamforming of a vector-sensor array. The weight vector update of the CQLMS algorithm is derived based on the complex gradient, leading to lower computational complexity. Because the complex quaternion can exhibit the orthogonal structure of an electromagnetic vector-sensor in a natural way, a complex quaternion model in the time domain is provided for a 3-component vector-sensor array, and the normalized adaptive beamformer using CQLMS is presented. Finally, simulation results are given to validate the performance of the proposed adaptive beamformer.
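The quaternion algebra aside, the weight update in CQLMS follows the pattern of the classical complex LMS recursion. A minimal complex-valued (non-quaternion) sketch of that recursion, using a hypothetical 3-element array with a noise-free broadside source; all names and parameter values are illustrative, not the published algorithm:

```python
import cmath

def complex_lms(snapshots, desired, mu=0.05):
    """Complex LMS: y = w^H x, e = d - y, w <- w + mu * x * conj(e).
    A simplified complex-valued analogue of the CQLMS update."""
    n = len(snapshots[0])
    w = [0j] * n
    for x, d in zip(snapshots, desired):
        y = sum(wi.conjugate() * xi for wi, xi in zip(w, x))  # beamformer output
        e = d - y                                             # estimation error
        w = [wi + mu * xi * e.conjugate() for wi, xi in zip(w, x)]
    return w

# hypothetical narrowband source at broadside of a 3-element array
steer = [1 + 0j, 1 + 0j, 1 + 0j]
snaps, des = [], []
for k in range(200):
    s = cmath.exp(1j * 0.3 * k)
    snaps.append([s * a for a in steer])
    des.append(s)

w = complex_lms(snaps, des)
y = sum(wi.conjugate() * xi for wi, xi in zip(w, snaps[-1]))
print(abs(des[-1] - y))  # residual error after adaptation
```

With step size mu = 0.05 and unit-power snapshots, the residual error decays geometrically, so the final weights reproduce the desired signal almost exactly.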

  6. Simulating the complex output of rainfall and hydrological processes using the information contained in large data sets: the Direct Sampling approach.

    Science.gov (United States)

    Oriani, Fabio

    2017-04-01

The unpredictable nature of rainfall makes its estimation as difficult as it is essential to hydrological applications. Stochastic simulation is often considered a convenient approach to assess the uncertainty of rainfall processes, but preserving their irregular behavior and variability at multiple scales is a challenge even for the most advanced techniques. In this presentation, an overview of the Direct Sampling technique [1] and its recent application to rainfall and hydrological data simulation [2, 3] is given. The algorithm, having its roots in multiple-point statistics, makes use of a training data set to simulate the outcome of a process without inferring any explicit probability measure: the data are simulated in time or space by sampling the training data set where a sufficiently similar group of neighbor data exists. This approach allows preserving complex statistical dependencies at different scales with a good approximation, while reducing the parameterization to the minimum. The strengths and weaknesses of the Direct Sampling approach are shown through a series of applications to rainfall and hydrological data: from time-series simulation to spatial rainfall fields conditioned by elevation or a climate scenario. In the era of vast databases, is this data-driven approach a valid alternative to parametric simulation techniques? [1] Mariethoz G., Renard P., and Straubhaar J. (2010), The Direct Sampling method to perform multiple-point geostatistical simulations, Water Resour. Res., 46(11), http://dx.doi.org/10.1029/2008WR007621 [2] Oriani F., Straubhaar J., Renard P., and Mariethoz G. (2014), Simulation of rainfall time series from different climatic regions using the direct sampling technique, Hydrol. Earth Syst. Sci., 18, 3015-3031, http://dx.doi.org/10.5194/hess-18-3015-2014 [3] Oriani F., Borghi A., Straubhaar J., Mariethoz G., Renard P. (2016), Missing data simulation inside flow rate time-series using multiple-point statistics, Environ.
Model
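In outline, Direct Sampling generates each new value by scanning the training data for a location whose preceding neighborhood is similar enough to the values just simulated, then copying the value that follows it. A toy sketch of that idea for a time series; the neighborhood size, distance measure, and acceptance threshold are illustrative assumptions, not the published parameterization:

```python
import random

def direct_sampling(training, n_sim, n_neigh=3, threshold=0.05, max_scan=1000, seed=0):
    """Simulate a time series by sampling a training series wherever a
    sufficiently similar neighborhood of preceding values is found."""
    rng = random.Random(seed)
    sim = list(training[:n_neigh])              # seed with the first values
    for _ in range(n_sim):
        pattern = sim[-n_neigh:]                # conditioning data event
        best_pos, best_dist = n_neigh, float("inf")
        for _ in range(max_scan):               # random scan of the training set
            i = rng.randrange(n_neigh, len(training) - 1)
            window = training[i - n_neigh:i]
            dist = max(abs(a - b) for a, b in zip(window, pattern))
            if dist < best_dist:
                best_pos, best_dist = i, dist
            if dist <= threshold:               # accept the first good-enough match
                break
        sim.append(training[best_pos])          # copy the value that follows it
    return sim

# toy periodic "rainfall" training series (illustrative values)
train = [0.1, 0.5, 0.2, 0.7, 0.1, 0.5, 0.2, 0.7, 0.1, 0.5, 0.2, 0.7]
out = direct_sampling(train, n_sim=5)
print(out)
```

Because values are copied rather than drawn from a fitted distribution, every simulated value occurs somewhere in the training data, which is how the method preserves complex empirical statistics without an explicit probability model.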

  7. Classicality of quantum information processing

    International Nuclear Information System (INIS)

    Poulin, David

    2002-01-01

    The ultimate goal of the classicality program is to quantify the amount of quantumness of certain processes. Here, classicality is studied for a restricted type of process: quantum information processing (QIP). Under special conditions, one can force some qubits of a quantum computer into a classical state without affecting the outcome of the computation. The minimal set of conditions is described and its structure is studied. Some implications of this formalism are the increase of noise robustness, a proof of the quantumness of mixed state quantum computing, and a step forward in understanding the very foundation of QIP

  8. Information processing for aerospace structural health monitoring

    Science.gov (United States)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle costs of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information which indicates both the diagnostics of the current structural integrity as well as the prognostics necessary for planning and managing the future health of the structure in a cost effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.

  9. Is it health information technology? : Task complexity and work substitution

    NARCIS (Netherlands)

    Medina Palomino, Hector; Rutkowski, Anne; Verhulst, Matthijs

    2015-01-01

    New technology is making it possible to replace professions built on complex knowledge, e.g. medicine. In our exploratory research we examined how Information Technologies might be replacing some of the tasks formerly processed by physician anesthesiologists (MDAs). Data (N=1178) were collected at a

  10. Minimized state complexity of quantum-encoded cryptic processes

    Science.gov (United States)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  11. Consciousness: a unique way of processing information.

    Science.gov (United States)

    Marchetti, Giorgio

    2018-02-08

    In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three main basic cognitive processes contribute to realize this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the "single voice" of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever is focused by attention appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experiences, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention.

  12. Aridity and decomposition processes in complex landscapes

    Science.gov (United States)

    Ossola, Alessandro; Nyman, Petter

    2015-04-01

Decomposition of organic matter is a key biogeochemical process contributing to nutrient cycles, carbon fluxes and soil development. The activity of decomposers depends on microclimate, with temperature and rainfall being major drivers. In complex terrain the fine-scale variation in microclimate (and hence water availability) as a result of slope orientation is caused by differences in incoming radiation and surface temperature. Aridity, measured as the long-term balance between net radiation and rainfall, is a metric that can be used to represent variations in water availability within the landscape. Since aridity metrics can be obtained at fine spatial scales, they could theoretically be used to investigate how decomposition processes vary across complex landscapes. In this study, four research sites were selected in tall open sclerophyll forest along an aridity gradient (Budyko dryness index ranging from 1.56-2.22) where microclimate, litter moisture and soil moisture were monitored continuously for one year. Litter bags were packed to estimate decomposition rates (k) using leaves of a tree species not present in the study area (Eucalyptus globulus) in order to avoid home-field advantage effects. Litter mass loss was measured to assess the activity of macro-decomposers (6 mm litter bag mesh size), meso-decomposers (1 mm mesh), microbes above-ground (0.2 mm mesh) and microbes below-ground (2 cm depth, 0.2 mm mesh). Four replicates for each set of bags were installed at each site and bags were collected at 1, 2, 4, 7 and 12 months after installation. We first tested whether differences in microclimate due to slope orientation have significant effects on decomposition processes. Then the dryness index was related to decomposition rates to evaluate whether small-scale variation in decomposition can be predicted using readily available information on rainfall and radiation. Decomposition rates (k), calculated by fitting single-pool negative exponential models, generally
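The single-pool negative exponential model used above, M(t)/M0 = exp(-k t), can be fitted by log-linear least squares with the slope forced through the origin. A short sketch with hypothetical litter-bag numbers (illustrative, not the study's measurements):

```python
import math

def fit_decomposition_rate(times, mass_fraction):
    """Fit k in the single-pool model M(t)/M0 = exp(-k t) by least squares
    on the log-transformed data, slope through the origin:
    log(M/M0) = -k t  =>  k = -sum(t * log m) / sum(t^2)."""
    num = sum(t * math.log(m) for t, m in zip(times, mass_fraction))
    den = sum(t * t for t in times)
    return -num / den

# hypothetical litter-bag data: months since installation, fraction of mass left
t = [1, 2, 4, 7, 12]
m = [0.92, 0.85, 0.72, 0.57, 0.38]
k = fit_decomposition_rate(t, m)
print(round(k, 3))  # decomposition rate k, per month
```

Forcing the fit through the origin encodes the constraint that all of the initial mass is present at t = 0; fitting an intercept as well would relax that assumption.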

  13. Dependability problems of complex information systems

    CERN Document Server

    Zamojski, Wojciech

    2014-01-01

This monograph presents original research results on selected problems of dependability in contemporary Complex Information Systems (CIS). The ten chapters are concentrated around the following three aspects: methods for modelling of the system and its components; tasks, or in a more generic and adequate interpretation, functionalities, accomplished by the system; and conditions for their correct realization in the dynamic operational environment. While the main focus is on theoretical advances and roadmaps for implementations of new technologies, a much-needed forum for sharing of the bes

  14. Information processing. [in human performance

    Science.gov (United States)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).

  15. Principles of neural information processing

    CERN Document Server

    Seelen, Werner v

    2016-01-01

In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of Neural Information Processing as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely, to organize behavior to ensure survival, as well as the understanding of the evolutionary genesis of the brain. Among the developed principles and strategies belong self-organization of neural systems, flexibility, the active interpretation of the world by means of construction and prediction as well as their embedding into the world, all of which form the framework of the presented description. Since, in brains, their partial self-organization, the lifelong adaptation and their use of various methods of processing incoming information are all interconnected, the authors have chosen not only neurobiology and evolution theory as a basis for the elaboration of such a framework, but also syst...

  16. Program-technical complex for collection, processing and archiving of the physical information about chain nuclear reaction based on VMEbus. I. Subsystem for energy supplying control

    International Nuclear Information System (INIS)

    Alpatov, S.V.; Golovanova, Eh.Z.; Gorskaya, E.A.; Dobryanskij, V.M.; Makan'kin, A.M.; Puzynin, V.I.; Samojlov, V.N.; Cheker, A.V.

    1996-01-01

The rationale for the choice of hardware and software integrated into the program-technical complex is given. The complex is intended for automation of physical experiments connected with chain nuclear reaction investigations. The subsystem for energy supplying control of the experiment is considered in detail. The subsystem is built on a client-server architecture and includes a workstation and VMEbus measuring modules on the network. Descriptions of the programs and result formats are given. 5 refs., 6 figs

  17. Information processing of earth resources data

    Science.gov (United States)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  18. Phonological Processes in Complex and Compound Words

    Directory of Open Access Journals (Sweden)

    Alieh Kord Zaferanlu Kambuziya

    2016-02-01

Full Text Available Abstract This research aims at comparing phonological processes in complex and compound Persian words. Data are gathered from a 40,000-word Persian dictionary, from which 4,034 complex words and 1,464 compound words are chosen. The data are counted using Excel software. Some results of the research are: 1- Insertion is the usual phonological process in complex words. More than half of the different insertions belong to the consonant /g/; /y/ and // are in the second and third order, and the consonant /v/ has the least percentage of all. The highest percentage of vowel insertion belongs to /e/; the vowels /a/ and /o/ are in the second and third order. Deletion in complex words can only be seen in the consonant /t/ and the vowel /e/. 2- The most frequent phonological process in compounds is consonant deletion, affecting seven different consonants: /t/, //, /m/, /r/, /ǰ/, /d/, and /c/. The only deleted vowel is /e/. In both groups of complex and compound words, /t/ deletion can be observed. A sequence of three consonants paves the way for the deletion of one of them; if one of the sequence is a sonorant such as /n/, deletion rarely happens. 3- In complex words, consonant deletion yields a lighter syllable weight, whereas vowel deletion yields a heavier syllable weight; both processes thus lead to bi-moraic weight. 4- The production of a bi-moraic syllable in Persian is preferable to the Syllable Contact Law, so specific rules take precedence over universals. 5- Vowel insertion can be seen in both complex and compound words. In complex words, /e/ insertion plays the most fundamental part; the vowels /a/ and /o/ are in the second and third place. Whenever there are two sequences of ultra-heavy syllables, vowel insertion breaks the first syllable into two light syllables. The compounds that are influenced by vowel insertion can be and are pronounced without any insertion

  19. Advanced Information Processing System (AIPS)

    Science.gov (United States)

    Pitts, Felix L.

    1993-01-01

    Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledgebase which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10E-9 at 10 hours. A background and description is given followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.

  20. The Process of Solving Complex Problems

    Science.gov (United States)

    Fischer, Andreas; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…

  1. Real time information management for improving productivity in metallurgical complexes

    International Nuclear Information System (INIS)

    Bascur, O.A.; Kennedy, J.P.

    1999-01-01

Applying the latest information technologies in industrial plants has become a serious challenge to management and technical teams. The availability of real time and historical operations information to identify the most critical parts of the processing system in terms of mechanical integrity is a must for global plant optimization. Expanded use of plant information on the desktop is a standard tool for revenue improvement, cost reduction, and adherence to production constraints. The industrial component desktop supports access to information for process troubleshooting, continuous improvement and innovation by plant and staff personnel. Collaboration between groups enables the implementation of an overall process effectiveness index based on losses due to equipment availability, production and product quality. The key design choice is to use the Internet-based technologies created by Microsoft for its marketplace: office automation and the Web. Time-derived variables are used for process analysis, troubleshooting and performance assessment. Connectivity between metallurgical complexes, research centers and their business systems has become a reality. Two case studies of large integrated mining/metallurgical complexes are highlighted. (author)

  2. Fitness, Extrinsic Complexity and Informing Science

    Directory of Open Access Journals (Sweden)

    Grandon Gill

    2017-03-01

We raise concerns about society’s continuing investment in academic research that discounts the extrinsic complexity of the domains under study. We highlight a need for future research to operationalize the concepts of fitness and complexity in practice.

  3. Information systems process and practice

    CERN Document Server

    Urquhart, Christine; Tbaishat, Dina; Yeoman, Alison

    2017-01-01

    This book adopts a holistic interpretation of information architecture, to offer a variety of methods, tools, and techniques that may be used when designing websites and information systems that support workflows and what people require when 'managing information'.

  4. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational Basis Splines (NURBS) at multiple levels of detail (Mixed and Reverse LoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  5. Separations in Communication Complexity Using Cheat Sheets and Information Complexity

    NARCIS (Netherlands)

    A. Anshu (Anurag); A. Belovs (Aleksandr); S. Ben-David (Shalev); M. Goos (Mika); R. Jain (Rahul); R. Kothari (Robin); T. J. Lee (Troy); M. Santha (Miklos)

    2016-01-01

    textabstractWhile exponential separations are known between quantum and randomized communication complexity for partial functions (Raz, STOC 1999), the best known separation between these measures for a total function is quadratic, witnessed by the disjointness function. We give the first

  6. Vulnerability Assessment Tools for Complex Information Networks

    National Research Council Canada - National Science Library

    Cassandras, Christos G; Gong, Weibo; Pepyne, David L; Lee, Wenke; Liu, Hong; Ho, Yu-Chi; Pfeffer, Avrom

    2006-01-01

    The specific aims of this research is to develop theories, methodologies, tools, and implementable solutions for modeling, analyzing, designing, and securing information networks against information-based attack...

  7. The architecture of the management system of complex steganographic information

    Science.gov (United States)

    Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.

    2017-01-01

The aim of the study is to create a wide area information system that allows one to control processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the algorithmic maintenance of the system, classic methods of steganography are used to embed information, while methods of mathematical statistics and computational intelligence are used to identify the embedded information. The main result of the paper is the development of the architecture of the management system of complex steganographic information. The suggested architecture utilizes cloud technology in order to provide service via a web-service over the Internet. It is meant to support the processing of multimedia data streams with many sources of different types. An information system built in accordance with the proposed architecture will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; and prevention of information leakage caused by insiders.
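As an illustration of the kind of classic embedding method such a system might build on, here is a minimal least-significant-bit (LSB) sketch; it is purely illustrative and not the actual algorithm of the described system:

```python
def embed_lsb(pixels, bits):
    """Replace the least significant bit of each pixel with a payload bit."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract_lsb(pixels, n_bits):
    """Read the payload back from the least significant bits."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [142, 17, 200, 33, 90, 64, 77, 180]   # illustrative 8-bit pixel values
payload = [1, 0, 1, 1, 0, 1, 0, 0]
stego = embed_lsb(cover, payload)
print(extract_lsb(stego, 8))  # → [1, 0, 1, 1, 0, 1, 0, 0]
```

Each pixel changes by at most one intensity level, which is why statistical rather than visual analysis is needed to detect such embedding.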

  8. Practicality of quantum information processing

    Science.gov (United States)

    Lau, Hoi-Kwan

    Quantum Information Processing (QIP) is expected to bring revolutionary enhancement to various technological areas. However, today's QIP applications are far from being practical. The problem involves both hardware issues, i.e., quantum devices are imperfect, and software issues, i.e., the functionality of some QIP applications is not fully understood. Aiming to improve the practicality of QIP, in my PhD research I have studied various topics in quantum cryptography and ion trap quantum computation. In quantum cryptography, I first studied the security of position-based quantum cryptography (PBQC). I discovered a wrong assumption in the previous literature that the cheaters are not allowed to share entangled resources. I proposed entanglement attacks that could cheat all known PBQC protocols. I also studied the practicality of continuous-variable (CV) quantum secret sharing (QSS). While the security of CV QSS was considered by the literature only in the limit of infinite squeezing, I found that finitely squeezed CV resources could also provide finite secret sharing rate. Our work relaxes the stringent resources requirement of implementing QSS. In ion trap quantum computation, I studied the phase error of quantum information induced by dc Stark effect during ion transportation. I found an optimized ion trajectory for which the phase error is the minimum. I also defined a threshold speed, above which ion transportation would induce significant error. In addition, I proposed a new application for ion trap systems as universal bosonic simulators (UBS). I introduced two architectures, and discussed their respective strength and weakness. I illustrated the implementations of bosonic state initialization, transformation, and measurement by applying radiation fields or by varying the trap potential. When comparing with conducting optical experiments, the ion trap UBS is advantageous in higher state initialization efficiency and higher measurement accuracy. Finally, I

  9. Capturing connectivity and causality in complex industrial processes

    CERN Document Server

    Yang, Fan; Shah, Sirish L; Chen, Tongwen

    2014-01-01

This brief reviews concepts of inter-relationship in modern industrial processes and in biological and social systems. Specifically, ideas of connectivity and causality within and between elements of a complex system are treated; these ideas are of great importance in analysing and influencing mechanisms, structural properties and their dynamic behaviour, especially for fault diagnosis and hazard analysis. Since fault detection and isolation for industrial processes is concerned with root causes and fault propagation, the brief shows that process connectivity and causality information can be captured in two ways: ·      from process knowledge: structural modeling based on first-principles structural models can be merged with adjacency/reachability matrices or topology models obtained from process flow-sheets described in standard formats; and ·      from process data: cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian ne...
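Of the data-driven options listed, lagged cross-correlation is the simplest. A toy sketch that recovers the propagation delay between two process variables; the signal values and function names are hypothetical:

```python
def cross_correlation_lag(x, y, max_lag):
    """Return the lag (in samples) at which y best correlates with x.
    A positive lag suggests x leads y -- a simple data-driven cue for
    the direction of fault propagation between two process variables."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
        da = sum((ai - ma) ** 2 for ai in a) ** 0.5
        db = sum((bi - mb) ** 2 for bi in b) ** 0.5
        return num / (da * db) if da and db else 0.0
    best_lag, best_r = 0, -2.0
    for lag in range(0, max_lag + 1):
        r = corr(x[:len(x) - lag], y[lag:])
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# hypothetical: a disturbance in x reaches y three samples later
x = [0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0, 0]
y = [0] * 3 + x[:-3]
lag, r = cross_correlation_lag(x, y, max_lag=5)
print(lag)  # → 3
```

Cross-correlation only captures linear, pairwise dependence; the Granger and information-theoretical methods named above address directionality and nonlinearity more rigorously.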

  10. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
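The square root of the Jensen-Shannon divergence used as a quantifier can be computed directly from two distributions. A minimal sketch using base-2 entropy; the MPR Statistical Complexity part of the methodology is omitted, and the example distributions are illustrative:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    return -sum(x * math.log(x, 2) for x in p if x > 0)

def jensen_shannon_distance(p, q):
    """Square root of the Jensen-Shannon divergence (base-2 logs),
    a metric of dissimilarity between two probability distributions."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    jsd = shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2
    return math.sqrt(max(jsd, 0.0))

# e.g. degree distributions of a network at two stages of its evolution
p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
print(round(jensen_shannon_distance(p, q), 4))
```

Unlike the Kullback-Leibler divergence, this quantity is symmetric and bounded (at most 1 with base-2 logs), which makes it convenient for comparing successive states of an evolving network.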

  11. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology for analyzing dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR statistical complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  12. Research on image complexity evaluation method based on color information

    Science.gov (United States)

    Wang, Hao; Duan, Jin; Han, Xue-hui; Xiao, Bo

    2017-11-01

    In order to evaluate the complexity of a color image more effectively and to find the connection between image complexity and image information, this paper presents a method to compute image complexity based on color information. The theoretical analysis first divides complexity into three subjective levels: low, medium and high. Image features are then extracted, and a function is established between the complexity value and the color-feature model. The experimental results show that this evaluation method can objectively reconstruct the complexity of an image from its features, and that the results agree well with human visual perception of complexity; color image complexity therefore has a certain reference value.
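The abstract does not give the authors' color-feature model; a common, minimal proxy in the same spirit scores complexity by the Shannon entropy of the quantized color histogram. A sketch under that assumption (all names and parameters are ours):

```python
import numpy as np

def color_complexity(image, bins=32):
    """Rough image-complexity proxy: Shannon entropy of the quantized joint
    RGB histogram, normalized to [0, 1]. Illustrative only; the paper's
    actual feature model is not specified in the abstract."""
    # image: H x W x 3 array of uint8 RGB values
    quantized = (image // (256 // bins)).reshape(-1, 3).astype(np.int64)
    # Hash each quantized colour to a single histogram bin index
    idx = quantized[:, 0] * bins * bins + quantized[:, 1] * bins + quantized[:, 2]
    counts = np.bincount(idx, minlength=bins ** 3).astype(float)
    p = counts / counts.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))
    return entropy / np.log2(bins ** 3)   # normalize by maximum entropy

# A flat grey image scores 0; random color noise scores close to 1
rng = np.random.default_rng(0)
flat = np.full((64, 64, 3), 128, dtype=np.uint8)
noise = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
print(color_complexity(flat), color_complexity(noise))
```

Such a single-number proxy maps naturally onto the paper's three subjective levels (low, medium, high) via thresholds on the normalized entropy.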

  13. Gathering Information from Transport Systems for Processing in Supply Chains

    Science.gov (United States)

    Kodym, Oldřich; Unucka, Jakub

    2016-12-01

    The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and the use of this information in the information systems of logistics firms to support selected processes at the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed, and compliance with existing standards is mentioned. Security of information over its full life cycle is an integral part of the presented system. The design of a fully equipped system based on synthesized functional nodes is presented.

  14. Handbook on neural information processing

    CERN Document Server

    Maggini, Marco; Jain, Lakhmi

    2013-01-01

    This handbook presents some of the most recent topics in neural information processing, covering both theoretical concepts and practical applications. The contributions include: deep architectures; recurrent, recursive, and graph neural networks; cellular neural networks; Bayesian networks; approximation capabilities of neural networks; semi-supervised learning; statistical relational learning; kernel methods for structured data; multiple classifier systems; self-organisation and modal learning; applications to ...

  15. Accuracy in Optical Information Processing

    Science.gov (United States)

    Timucin, Dogan Aslan

    Low computational accuracy is an important obstacle that blocks optical processors from becoming a practical reality and a serious challenger to classical computing paradigms. This research presents a comprehensive solution approach to the problem of accuracy enhancement in discrete analog optical information processing systems. Statistical analysis of a generic three-plane optical processor is carried out first, taking into account the effects of diffraction, interchannel crosstalk, and background radiation. Noise sources included in the analysis are photon, excitation, and emission fluctuations in the source array; transmission and polarization fluctuations in the modulator; and photoelectron, gain, dark, shot, and thermal noise in the detector array. Means, mutual coherence functions, and probability density functions are derived for both optical and electrical output signals. Next, statistical models for a number of popular optoelectronic devices are studied. Specific devices considered here are light-emitting and laser diode sources; an ideal noiseless modulator and a Gaussian random-amplitude-transmittance modulator; p-i-n and avalanche photodiode detectors followed by electronic postprocessing; and ideal free-space geometrical-optics propagation and single-lens imaging systems. Output signal statistics are determined for various interesting device combinations by inserting these models into the general formalism. Finally, based on these special-case output statistics, results on accuracy limitations and enhancement in optical processors are presented. Here, starting with the formulation of the accuracy enhancement problem as (1) an optimal detection problem and (2) a parameter estimation problem, the potential accuracy improvements achievable via the classical multiple-hypothesis-testing and maximum likelihood and Bayesian parameter estimation methods are demonstrated. Merits of using proper normalizing transforms which can potentially stabilize ...

  16. Complex plasmochemical processing of solid fuel

    Directory of Open Access Journals (Sweden)

    Vladimir Messerle

    2012-12-01

    Full Text Available The technology of complex plasma-chemical processing of solid fuel, demonstrated with Ekibastuz bituminous and Turgay brown coals, is presented. A thermodynamic and experimental study of the technology was carried out. This technology allows the production of synthesis gas from the organic mass of coal, as well as valuable components (technical silicon, ferrosilicon, aluminum and silicon carbide) and trace rare metals (uranium, molybdenum, vanadium, etc.) from the mineral mass of coal. The high-calorific synthesis gas produced can be used for methanol synthesis, as a high-grade reducing gas instead of coke, and as an energy gas in thermal power plants.

  17. Complex diffusion process for noise reduction

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Barari, A.

    2014-01-01

    The success of partial differential equations (PDEs) in image restoration and de-noising prompted many researchers to search for improvements to the technique. In this paper, a new method is presented for signal de-noising, based on PDEs and Schrödinger equations, named the complex diffusion process (CDP). This method assumes that variations ... for signal de-noising. To evaluate the performance of the proposed method, a number of experiments have been performed using sinusoid, multi-component and FM signals cluttered with noise. The results indicate that the proposed method outperforms the signal de-noising approaches known in the prior art.
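The CDP equations are not given in the truncated abstract; a minimal sketch of the general idea behind complex diffusion, evolving the signal under a complex diffusion coefficient c = exp(iθ) in the style of Gilboa, Sochen and Zeevi, is shown below for a 1-D signal. All parameter values are illustrative:

```python
import numpy as np

def complex_diffusion_denoise(signal, steps=200, dt=0.1, theta=np.pi / 30):
    """Minimal 1-D linear complex diffusion: evolve u_t = c * u_xx with a
    complex coefficient c = exp(i*theta). The real part is the smoothed
    signal; the imaginary part behaves like a smoothed edge detector."""
    c = np.exp(1j * theta)
    u = signal.astype(complex)
    for _ in range(steps):
        laplacian = np.roll(u, -1) - 2 * u + np.roll(u, 1)  # periodic BCs
        u = u + dt * c * laplacian                          # explicit Euler step
    return u.real, u.imag

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)
denoised, _ = complex_diffusion_denoise(noisy)
print(np.mean((noisy - clean) ** 2) > np.mean((denoised - clean) ** 2))  # True
```

With a small angle θ, the real part of the evolution acts like ordinary diffusion (noise suppression) while the imaginary part accumulates edge information; the time step here satisfies the explicit-scheme stability bound dt ≤ 0.5.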

  18. Continuous-variable quantum information processing

    DEFF Research Database (Denmark)

    Andersen, Ulrik Lund; Leuchs, G.; Silberhorn, C.

    2010-01-01

    By utilizing the continuous degree of freedom of a quantum system for encoding, processing or detecting information, one enters the field of continuous-variable (CV) quantum information processing. In this paper we review the basic principles of CV quantum information processing, with a main focus on recent developments in the field. We address the three main stages of a quantum information system: the preparation stage, where quantum information is encoded into CVs of coherent states and single-photon states; the processing stage, where CV information is manipulated to carry out a specified protocol; and a detection stage, where CV information is measured using homodyne detection or photon counting.

  19. Entropy type complexity of quantum processes

    International Nuclear Information System (INIS)

    Watanabe, Noboru

    2014-01-01

    Von Neumann entropy represents the amount of information in a quantum state, and it was extended by Ohya to general quantum systems [10]. Umegaki first defined the quantum relative entropy for σ-finite von Neumann algebras, and this was extended by Araki and Uhlmann to general von Neumann algebras and *-algebras, respectively. In 1983 Ohya introduced the quantum mutual entropy by using compound states; it describes the amount of information correctly transmitted through a quantum channel and was also extended by Ohya to general quantum systems. In this paper, we briefly explain Ohya's S-mixing entropy and the quantum mutual entropy for general quantum systems. By using structure equivalent classes, we introduce entropy-type functionals based on quantum information theory to improve the treatment of the Gaussian communication process. (paper)

  20. Infochemistry Information Processing at the Nanoscale

    CERN Document Server

    Szacilowski, Konrad

    2012-01-01

    Infochemistry: Information Processing at the Nanoscale defines a new field of science and describes the processes, systems and devices at the interface between chemistry and the information sciences. The book is devoted to the application of molecular species and nanostructures to advanced information processing. It includes the design and synthesis of suitable materials and nanostructures, their characterization, and finally the application of molecular species and nanostructures for information storage and processing purposes. Divided into twelve chapters, the first three chapters serve as an int...

  1. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution and to ensure that it behaves according to the user's prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, in which model component complexity is penalised in terms of deviation from simple base-model formulations. In the AR(1) case, we discuss two natural base-model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, in which the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  2. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-05-25

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution and to ensure that it behaves according to the user's prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, in which model component complexity is penalised in terms of deviation from simple base-model formulations. In the AR(1) case, we discuss two natural base-model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, in which the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  3. Social Information Processing in Deaf Adolescents

    Science.gov (United States)

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2016-01-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…

  4. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

    There is a recognized need for new approaches to understanding unit operations of pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced and applied to improve understanding of a coating process. Higher-order interactions do exist between process parameters, which complicates the interpretation. It was possible to model the response, that is, the amount of drug released, using both ANOVA and GEMANOVA. However, the ANOVA model was difficult to interpret, as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and producing easily understandable models of them. GEMANOVA modeling allowed a simple visualization of the entire experimental space; furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug release.

  5. Information processing by neuronal populations

    National Research Council Canada - National Science Library

    Hölscher, Christian; Munk, Matthias

    2009-01-01

    Contents include: "... simultaneously recorded spike trains" (p. 120; Mark Laubach, Nandakumar S. Narayanan, and Eyal Y. Kimchi) and Part III, "Neuronal population information coding and plasticity in specific brain areas" (p. 149) ...

  6. Can complexity science inform physician leadership development?

    Science.gov (United States)

    Grady, Colleen Marie

    2016-07-04

    Purpose The purpose of this paper is to describe research that examined physician leadership development using complexity science principles. Design/methodology/approach Intensive interviewing of 21 participants and document review provided data regarding physician leadership development in health-care organizations using five principles of complexity science (connectivity, interdependence, feedback, exploration-of-the-space-of-possibilities and co-evolution), which were grouped in three areas of inquiry (relationships between agents, patterns of behaviour and enabling functions). Findings Physician leaders are viewed as critical in the transformation of healthcare and in improving patient outcomes, and yet significant challenges exist that limit their development. Leadership in health care continues to be associated with traditional, linear models, which are incongruent with the behaviour of a complex system, such as health care. Physician leadership development remains a low priority for most health-care organizations, although physicians admit to being limited in their capacity to lead. This research was based on five principles of complexity science and used grounded theory methodology to understand how the behaviours of a complex system can provide data regarding leadership development for physicians. The study demonstrated that there is a strong association between physician leadership and patient outcomes and that organizations play a primary role in supporting the development of physician leaders. Findings indicate that a physician's relationship with their patient and their capacity for innovation can be extended as catalytic behaviours in a complex system. The findings also identified limiting factors that impact physicians who choose to lead, such as reimbursement models that do not place value on leadership and medical education that provides minimal opportunity for leadership skill development. Practical Implications This research provides practical

  7. Information Processing and Human Abilities

    Science.gov (United States)

    Kirby, John R.; Das, J. P.

    1978-01-01

    The simultaneous and successive processing model of cognitive abilities was compared to a traditional primary mental abilities model. Simultaneous processing was found to be primarily related to spatial ability and, to a lesser extent, to memory and inductive reasoning. Subjects were 104 fourth-grade urban males. (Author/GDC)

  8. On the Intensification of Information Protection Processes

    Directory of Open Access Journals (Sweden)

    A. A. Malyuk

    2011-03-01

    Full Text Available The features of solving the information protection task in its modern statement, as a complex problem encompassing all aspects of information technology development, are discussed. Such an interpretation inevitably increases the role of systemic problems whose solution relies on an advanced scientific and methodological basis, the so-called intensification of information protection processes.

  9. Information driven self-organization of complex robotic behaviors.

    Directory of Open Access Journals (Sweden)

    Georg Martius

    Full Text Available Information theory is a powerful tool for expressing principles to drive autonomous systems, because it is domain invariant and allows for an intuitive interpretation. This paper studies the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process as a driving force to generate behavior. We study nonlinear and nonstationary systems and introduce the time-local predictive information (TiPI), which allows us to derive exact results together with explicit update rules for the parameters of the controller in the dynamical-systems framework. In this way the information principle, formulated at the level of behavior, is translated to the dynamics of the synapses. We underpin our results with a number of case studies on high-dimensional robotic systems. We show spontaneous cooperativity in a complex physical system with decentralized control. Moreover, a jointly controlled humanoid robot develops a high behavioral variety depending on its physics and the environment it is dynamically embedded in. The behavior can be decomposed into a succession of low-dimensional modes that increasingly explore the behavior space. This is a promising way to avoid the curse of dimensionality, which hinders learning systems from scaling well.

  10. Understanding Interdependency Through Complex Information Sharing

    Directory of Open Access Journals (Sweden)

    Fernando Rosas

    2016-01-01

    Full Text Available The interactions between three or more random variables are often nontrivial, poorly understood and, yet, are paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Towards this end, and in contrast to most of the literature that focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole, but not between the parts. The key contribution of our approach is to focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles between its components. We show that our axioms determine unique formulas for all of the terms of the proposed decomposition for systems of three variables in several cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
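The synergistic interdependency described above is easiest to see in the XOR system, the textbook example of sharing that "exists in the whole, but not between the parts". A small self-contained sketch (the helper names are ours, not from the paper):

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a dict {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, keep):
    """Marginalize a joint dict over the outcome positions in `keep`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def mi(joint, a, b):
    """Mutual information I(A;B) between outcome-position groups a and b."""
    return (entropy(marginal(joint, a)) + entropy(marginal(joint, b))
            - entropy(marginal(joint, a + b)))

# XOR: X and Y are uniform bits, Z = X xor Y -- purely synergistic sharing
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

I_xz = mi(joint, [0], [2])
I_yz = mi(joint, [1], [2])
I_xy_z = mi(joint, [0, 1], [2])
print(I_xz, I_yz, I_xy_z)  # 0.0 0.0 1.0
```

Neither X nor Y alone carries any information about Z, yet jointly they determine it completely; this is exactly the kind of interdependency a joint-entropy decomposition must attribute to synergy rather than to redundancy.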

  11. Perceptual processing of a complex musical context

    DEFF Research Database (Denmark)

    Quiroga Martinez, David Ricardo; Hansen, Niels Christian; Højlund, Andreas

    Predictive processes play a fundamental role in music perception. The mismatch negativity (MMN) is a brain response that offers unique insight into these processes: it is elicited by deviants in a series of repetitive sounds and reflects the perception of change in physical and abstract sound regularities. It is therefore regarded as a prediction-error signal and a neural correlate of the updating of predictive perceptual models. In music, the MMN has been particularly valuable for the assessment of musical expectations, learning and expertise. However, the MMN paradigm has an important limitation: its ecological validity. To address this, we will develop a new paradigm using more real-sounding stimuli; our stimuli will be two-part music excerpts made by adding a melody to a previous design based on the Alberti bass (Vuust et al., 2011). Our second goal is to determine how the complexity of this context affects the predictive...

  12. Mapping stochastic processes onto complex networks

    International Nuclear Information System (INIS)

    Shirazi, A H; Reza Jafari, G; Davoudi, J; Peinke, J; Reza Rahimi Tabar, M; Sahimi, Muhammad

    2009-01-01

    We introduce a method by which stochastic processes are mapped onto complex networks. As examples, we construct the networks for such time series as those for free-jet and low-temperature helium turbulence, the German stock market index (the DAX), and white noise. The networks are further studied by contrasting their geometrical properties, such as the mean length, diameter, clustering, and average number of connections per node. By comparing the network properties of the original time series investigated with those for the shuffled and surrogate series, we are able to quantify the effect of the long-range correlations and the fatness of the probability distribution functions of the series on the networks constructed. Most importantly, we demonstrate that the time series can be reconstructed with high precision by means of a simple random walk on their corresponding networks
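The abstract does not spell out the mapping; one simple construction in this spirit treats amplitude bins as network nodes and consecutive visits as weighted directed edges, and a random walk on the resulting network regenerates a symbolic series, a rough analogue of the reconstruction the authors mention. All choices below (bin count, walk length) are illustrative and not the paper's exact method:

```python
import numpy as np

def series_to_network(series, n_bins=20):
    """Map a time series onto a weighted directed network: each amplitude
    bin is a node, and each pair of consecutive visits adds an edge."""
    bin_edges = np.linspace(series.min(), series.max(), n_bins + 1)
    states = np.clip(np.digitize(series, bin_edges) - 1, 0, n_bins - 1)
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(states[:-1], states[1:]):
        W[a, b] += 1
    return W

def random_walk(W, start, length, rng):
    """Regenerate a symbolic series by a random walk on the network."""
    path = [start]
    for _ in range(length - 1):
        row = W[path[-1]]
        if row.sum() == 0:            # dead end: bin seen only at series end
            break
        path.append(rng.choice(len(row), p=row / row.sum()))
    return np.array(path)

rng = np.random.default_rng(1)
series = np.cumsum(rng.standard_normal(5000))   # correlated random-walk series
W = series_to_network(series)
avg_out_degree = np.mean((W > 0).sum(axis=1))
walk = random_walk(W, int(W.sum(axis=1).argmax()), 1000, rng)
print(W.sum(), avg_out_degree, len(walk))
```

A strongly correlated series produces a sparse, chain-like transition network (few transitions per node), while shuffling the same series would spread the weight across many more edges, which is the kind of geometric contrast the paper exploits.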

  13. Effects of foveal information processing

    Science.gov (United States)

    Harris, R. L., Sr.

    1984-01-01

    The scanning behavior of pilots must be understood so that cockpit displays can be designed to provide the most information accurately and quickly to the pilot. The results of seven years of collecting and analyzing pilot scanning data are summarized. The data indicate that pilot scanning behavior (1) is subconscious, (2) is situation dependent, and (3) can be disrupted if pilots are forced to make conscious decisions. Testing techniques and scanning analysis techniques have been developed that are sensitive to pilot workload.

  14. Kinetics of the Dynamical Information Shannon Entropy for Complex Systems

    International Nuclear Information System (INIS)

    Yulmetyev, R.M.; Yulmetyeva, D.G.

    1999-01-01

    The kinetic behaviour of the dynamical information Shannon entropy is discussed for complex systems: physical systems with non-Markovian properties and memory in the correlation approximation, and biological and physiological systems with sequences of Markovian and non-Markovian random noises. For these stochastic processes, a description of the information entropy in terms of normalized time correlation functions is given. The influence and important role of two mutually dependent channels of entropy change, correlation (creation or generation of correlations) and anti-correlation (decay or annihilation of correlations), are discussed. The method developed here is also applied to the analysis of density fluctuations in liquid cesium obtained from slow neutron scattering data, the fractal kinetics of long-range fluctuations in short-time human memory, and the chaotic dynamics of R-R intervals of the human ECG. (author)

  15. CISAPS: Complex Informational Spectrum for the Analysis of Protein Sequences

    Directory of Open Access Journals (Sweden)

    Charalambos Chrysostomou

    2015-01-01

    Full Text Available A complex informational spectrum analysis for protein sequences (CISAPS) and its web-based server are developed and presented. As recent studies show, using only the absolute spectrum in the informational spectrum analysis of protein sequences is insufficient. CISAPS is therefore developed to provide results in three forms: the absolute, real, and imaginary spectra. Biologically related features, as illustrated here by a case study on influenza A subtypes, can also appear individually in either the real or the imaginary spectrum. As the results show, protein classes can exhibit similarities or differences according to the features extracted by the CISAPS web server; these associations are probably related to the protein property that the specific amino acid index represents. In addition, various technical issues that may affect the analysis, such as zero-padding and windowing, are also addressed. CISAPS uses an expanded list of 611 unique amino acid indices, each representing a different property, to perform the analysis. This web-based server enables researchers with little knowledge of signal processing methods to apply complex informational spectrum analysis in their work.
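The core of an informational spectrum computation is short: encode each residue with a numerical amino-acid index and take the DFT, keeping the absolute, real, and imaginary parts as CISAPS does. The sketch below uses the classic EIIP index as one example; the server's 611 indices and its zero-padding/windowing options are not reproduced, and the sample sequence is arbitrary:

```python
import numpy as np

# Electron-ion interaction potential (EIIP) values, a classic amino-acid
# index used in informational spectrum analysis (values as commonly published)
EIIP = {'A': 0.0373, 'R': 0.0959, 'N': 0.0036, 'D': 0.1263, 'C': 0.0829,
        'Q': 0.0761, 'E': 0.0058, 'G': 0.0050, 'H': 0.0242, 'I': 0.0000,
        'L': 0.0000, 'K': 0.0371, 'M': 0.0823, 'F': 0.0946, 'P': 0.0198,
        'S': 0.0829, 'T': 0.0941, 'W': 0.0548, 'Y': 0.0516, 'V': 0.0057}

def informational_spectrum(sequence, index=EIIP):
    """Complex informational spectrum of a protein sequence: encode each
    residue with a numerical index and take the real-input DFT. The mean
    is subtracted so the DC bin is ~0. Returns (absolute, real, imaginary)."""
    signal = np.array([index[aa] for aa in sequence.upper()])
    spectrum = np.fft.rfft(signal - signal.mean())
    return np.abs(spectrum), spectrum.real, spectrum.imag

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # arbitrary example sequence
absolute, real, imag = informational_spectrum(sequence)
peak = int(np.argmax(absolute))
print(peak)  # frequency bin with the strongest periodicity
```

Inspecting the real and imaginary parts separately, rather than only `absolute`, is precisely the extension the paper argues for.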

  16. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  17. Scaling the Information Processing Demands of Occupations

    Science.gov (United States)

    Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin

    2011-01-01

    The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…

  18. Mathematics of Information Processing and the Internet

    Science.gov (United States)

    Hart, Eric W.

    2010-01-01

    The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…

  19. Quantum information processing : science & technology.

    Energy Technology Data Exchange (ETDEWEB)

    Horton, Rebecca; Carroll, Malcolm S.; Tarman, Thomas David

    2010-09-01

    Qubits have been demonstrated using GaAs double quantum dots (DQDs), with the singlet and triplet stationary states as the qubit basis states. Long spin-decoherence times in silicon spur the translation of the GaAs qubit into silicon. In the near term the goals are: (1) develop surface-gate enhancement-mode double quantum dots (MOS and strained-Si/SiGe) to demonstrate few-electron operation and spin read-out, and examine impurity-doped quantum dots as an alternative architecture; (2) use mobility, C-V, ESR, quantum-dot performance and modeling as feedback to improve processing, including the development of atomic-precision fabrication at SNL; (3) examine integrated-electronics approaches to the RF-SET; (4) use combinations of numerical packages for multi-scale simulation of quantum-dot systems (NEMO3D, EMT, TCAD, SPICE); and (5) continue micro-architecture evaluation for different device and transport architectures.

  20. The Logic Process Formalism of the Informational Domain

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The performance of present-day information technologies rests on two main properties: the universality of the structures used and the flexibility of the final user interfaces. The first determines the potential coverage of the informational domain; the second determines the diversity and efficiency of the processing methods for the procedures being automated. These aspects are of great importance in agriculture and ecology, where the processes are complex and the volumes of information used are considerable. For example, meteorological processes are part of ecological ones, acting as the existential conditions of habitats, and are known to pose a complex forecasting problem that requires considerable computational resources to solve the appropriate equations. Likewise, agriculture, as a controlled activity under strong influence of natural conditions, has the same high requirements for diverse structures and flexible information processing.

  1. Process information systems in nuclear reprocessing

    International Nuclear Information System (INIS)

    Jaeschke, A.; Keller, H.; Orth, H.

    1987-01-01

    On the production management level, a process information system in a nuclear reprocessing plant (NRP) has to fulfill conventional operating functions as well as functions for nuclear material surveillance (safeguards). Given today's state of the art in on-line process control, advances in hardware and software technology make it possible to introduce more process-specific intelligence into process information systems. Using the example of an expert-system-aided laboratory management system as a component of an NRP process information system, the paper demonstrates that these technologies can already be applied. (DG)

  2. Relay-based information broadcast in complex networks

    Science.gov (United States)

    Fan, Zhongyan; Han, Zeyu; Tang, Wallace K. S.; Lin, Dong

    2018-04-01

    Information broadcast (IB) is a critical process in complex networks, usually accomplished by a flooding mechanism. Although flooding is simple and requires no prior topological information, it consumes a lot of transmission overhead. The other extreme is tree-based broadcast (TB), in which information is disseminated via a spanning tree. It achieves the minimal transmission overhead, but maintaining a spanning tree for every node is an obvious obstacle to implementation. Motivated by the success of scale-free network models for real-world networks, in this paper we investigate the issues in IB by considering an alternative solution in between these two extremes. A novel relay-based broadcast (RB) mechanism is proposed that employs a subset of nodes as relays. Information is first forwarded to one of these relays and then re-disseminated to the others through the spanning tree whose root is that relay. This mechanism provides a trade-off between flooding and TB. On one hand, it saves a lot of transmission overhead compared to flooding; on the other hand, it requires far less maintenance than TB, as only a few spanning trees are needed. Based on two major criteria, namely transmission overhead and convergence time, the effectiveness of RB is confirmed. The impacts of relay assignment and network structure on performance are also studied in this work.
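    The trade-off described in this abstract can be sketched numerically. In the minimal Python sketch below (the example graph and the use of a BFS spanning tree rooted at the relay are illustrative assumptions, not the authors' exact model), flooding costs one message per directed edge, while the relay mechanism costs the hops needed to reach the relay plus one message per spanning-tree edge:

    ```python
    from collections import deque

    def bfs_tree(adj, root):
        """Return parent pointers of a BFS spanning tree rooted at `root`."""
        parent = {root: None}
        queue = deque([root])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent:
                    parent[v] = u
                    queue.append(v)
        return parent

    def flooding_messages(adj):
        # Every node forwards the message once to each of its neighbours.
        return sum(len(nbrs) for nbrs in adj.values())

    def relay_messages(adj, source, relay):
        # Hops from the source up to the relay, then one message per tree edge.
        tree = bfs_tree(adj, relay)
        hops, node = 0, source
        while node != relay:
            node = tree[node]   # walk toward the relay (the tree root)
            hops += 1
        return hops + (len(adj) - 1)

    # A small example graph: a ring of 6 nodes plus one chord (0-3).
    adj = {0: [1, 5, 3], 1: [0, 2], 2: [1, 3],
           3: [2, 4, 0], 4: [3, 5], 5: [4, 0]}
    print(flooding_messages(adj))                  # 14 (twice the edge count)
    print(relay_messages(adj, source=2, relay=0))  # 7
    ```

    Even on this tiny graph the relay mechanism halves the message count; on large scale-free networks the gap between flooding and tree-based dissemination is what motivates the in-between RB design.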

  3. RNA assemblages orchestrate complex cellular processes

    DEFF Research Database (Denmark)

    Nielsen, Finn Cilius; Hansen, Heidi Theil; Christiansen, Jan

    2016-01-01

    Eukaryotic mRNAs are monocistronic, and therefore mechanisms exist that coordinate the synthesis of multiprotein complexes in order to obtain proper stoichiometry at the appropriate intracellular locations. RNA-binding proteins containing low-complexity sequences are prone to generate liquid drop...

  4. Animal models for information processing during sleep

    NARCIS (Netherlands)

    Coenen, A.M.L.; Drinkenburg, W.H.I.M.

    2002-01-01

    Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced compared to wakefulness. The process controlling this reduction is called 'sensory' gating, and evidence exists that the underlying neurophysiological processes take

  5. Career information processing strategies of secondary school ...

    African Journals Online (AJOL)

    This study examined the strategies commonly adopted by Osun state secondary school students in processing career information. It specifically examined the sources of career information available to the students, the uses to which the students put the information collected and how their career decision making skills can be ...

  6. A language for information commerce processes

    NARCIS (Netherlands)

    Aberer, Karl; Wombacher, Andreas

    Automatizing information commerce requires languages to represent the typical information commerce processes. Existing languages and standards cover either only very specific types of business models or are too general to capture in a concise way the specific properties of information commerce

  7. Information Geometric Complexity of a Trivariate Gaussian Statistical Model

    Directory of Open Access Journals (Sweden)

    Domenico Felice

    2014-05-01

    Full Text Available We evaluate the information geometric complexity of entropic motion on low-dimensional Gaussian statistical manifolds in order to quantify how difficult it is to make macroscopic predictions about systems in the presence of limited information. Specifically, we observe that the complexity of such entropic inferences not only depends on the amount of available pieces of information but also on the manner in which such pieces are correlated. Finally, we uncover that, for certain correlational structures, the impossibility of reaching the most favorable configuration from an entropic inference viewpoint seems to lead to an information geometric analog of the well-known frustration effect that occurs in statistical physics.

  8. Using life cycle information in process discovery

    NARCIS (Netherlands)

    Leemans, S.J.J.; Fahland, D.; Van Der Aalst, W.M.P.; Reichert, M.; Reijers, H.A.

    2016-01-01

    Understanding the performance of business processes is an important part of any business process intelligence project. From historical information recorded in event logs, performance can be measured and visualized on a discovered process model. Thereby the accuracy of the measured performance, e.g.,

  9. Unveiling the mystery of visual information processing in human brain.

    Science.gov (United States)

    Diamant, Emanuel

    2008-08-15

    It is generally accepted that human vision is an extremely powerful information processing system that facilitates our interaction with the surrounding world. However, despite extended and extensive research efforts, which encompass many exploration fields, the underlying fundamentals and operational principles of visual information processing in the human brain remain unknown. We are still unable to figure out where and how along the path from the eyes to the cortex the sensory input perceived by the retina is converted into a meaningful object representation, which can be consciously manipulated by the brain. Surveying the vast literature on the various aspects of brain information processing, I was surprised to learn that the respected scholarly discussion is totally indifferent to the basic keynote question: "What is information?" in general, or "What is visual information?" in particular. In the old days, it was assumed that any scientific research approach has first to define its basic departure points. Why this was overlooked in brain information processing research remains a conundrum. In this paper, I try to find a remedy for this bizarre situation. I propose an uncommon definition of "information", which can be derived from Kolmogorov's Complexity Theory and Chaitin's notion of Algorithmic Information. Embracing this new definition leads to an inevitable revision of traditional dogmas that shape the state of the art of brain information processing research. I hope this revision will better serve the challenging goal of modeling human visual information processing.
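    The notion of information invoked here, rooted in Kolmogorov complexity and Chaitin's algorithmic information, is uncomputable in general. A common practical proxy (an illustration, not something proposed in the paper itself) is compressed size, which gives a computable upper bound on a string's algorithmic information content:

    ```python
    import random
    import zlib

    def compressed_size(data: bytes) -> int:
        """Compressed length in bytes: a computable upper bound (up to an
        additive constant) on the algorithmic information in `data`."""
        return len(zlib.compress(data, 9))

    regular = b"ab" * 500   # generated by a very short "program", so low complexity
    random.seed(0)          # fixed seed keeps the example reproducible
    noisy = bytes(random.randrange(256) for _ in range(1000))  # no exploitable structure

    # The structured string compresses far better than the pseudo-random one,
    # mirroring the intuition that it carries less algorithmic information.
    print(compressed_size(regular), compressed_size(noisy))
    ```

    The choice of zlib and the two test strings are assumptions made for the sketch; any general-purpose compressor exhibits the same qualitative gap.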

  10. Towards the understanding of network information processing in biology

    Science.gov (United States)

    Singh, Vijay

    Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work of the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can instead be identified using coarse-graining approaches, which could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse-graining approaches identify features that are essential for certain processes performed by the underlying biological networks. We find that long-range connections in the brain allow for global-scale feature detection in a signal. They also suppress noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with different temporal scales. Our observations indicate that the rules of multivariate signal processing are quite different from those of traditional single-unit signal processing.

  11. Theory of Neural Information Processing Systems

    International Nuclear Information System (INIS)

    Galla, Tobias

    2006-01-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics, as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, yet the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  12. ALGORITHM OF CARDIO COMPLEX DETECTION AND SORTING FOR PROCESSING THE DATA OF CONTINUOUS CARDIO SIGNAL MONITORING.

    Science.gov (United States)

    Krasichkov, A S; Grigoriev, E B; Nifontov, E M; Shapovalov, V V

    The paper presents an algorithm of cardio complex classification as part of processing the data of continuous cardiac monitoring. R-wave detection concurrent with cardio complex sorting is discussed. The core of this approach is the use of prior information about cardio complex forms, segmental structure, and degree of kindness. Results of testing the sorting algorithm are provided.

  13. Occurrence reporting and processing of operations information

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-21

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  14. Occurrence reporting and processing of operations information

    International Nuclear Information System (INIS)

    1997-01-01

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations

  15. Certainty and Uncertainty in Quantum Information Processing

    OpenAIRE

    Rieffel, Eleanor G.

    2007-01-01

    This survey, aimed at information processing researchers, highlights intriguing but lesser known results, corrects misconceptions, and suggests research areas. Themes include: certainty in quantum algorithms; the "fewer worlds" theory of quantum mechanics; quantum learning; probability theory versus quantum mechanics.

  16. Developments in quantum information processing by nuclear ...

    Indian Academy of Sciences (India)

    qubits, the 2^n energy levels of the spin system can be treated as an n-qubit system. ... Quantum information processing; qubit; nuclear magnetic resonance quantum computing. ... The equilibrium spectrum has theoretical intensities in the ra-.
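    As this fragment notes, n coupled spin-1/2 nuclei span 2^n energy levels, so the spin system can be addressed as an n-qubit register. A minimal, purely illustrative sketch enumerating the computational basis states:

    ```python
    from itertools import product

    def basis_states(n: int) -> list[str]:
        """The 2**n computational basis labels of n spin-1/2 nuclei (qubits),
        e.g. '01' = first spin down, second spin up in the qubit convention."""
        return ["".join(bits) for bits in product("01", repeat=n)]

    print(basis_states(2))       # ['00', '01', '10', '11']
    print(len(basis_states(3)))  # 8 levels for a 3-spin system
    ```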

  17. How Students Learn: Information Processing, Intellectual Development and Confrontation

    Science.gov (United States)

    Entwistle, Noel

    1975-01-01

    A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning that are not covered by the model are discussed in relation to it: students' intellectual development and the effects of individual…

  18. Eye Movement Analysis of Information Processing under Different Testing Conditions.

    Science.gov (United States)

    Dillon, Ronna F.

    1985-01-01

    Undergraduates were given complex figural analogies items, and eye movements were observed under three types of feedback: (1) elaborate feedback; (2) subjects verbalized their thinking and application of rules; and (3) no feedback. Both feedback conditions enhanced the rule-governed information processing during inductive reasoning. (Author/GDC)

  19. Information in general medical practices: the information processing model.

    Science.gov (United States)

    Crowe, Sarah; Tully, Mary P; Cantrill, Judith A

    2010-04-01

    The need for effective communication and handling of secondary care information in general practices is paramount. This study aimed to explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of both clinical and administrative practice staff. It was a qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using N-Vivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.

  20. Processing reafferent and exafferent visual information for action and perception.

    Science.gov (United States)

    Reichenbach, Alexandra; Diedrichsen, Jörn

    2015-01-01

    A recent study suggests that reafferent hand-related visual information utilizes a privileged, attention-independent processing channel for motor control. This process was termed visuomotor binding to reflect its proposed function: linking visual reafferences to the corresponding motor control centers. Here, we ask whether the advantage of processing reafferent over exafferent visual information is a specific feature of the motor processing stream or whether the improved processing also benefits the perceptual processing stream. Human participants performed a bimanual reaching task in a cluttered visual display, and one of the visual hand cursors could be displaced laterally during the movement. We measured the rapid feedback responses of the motor system as well as matched perceptual judgments of which cursor was displaced. Perceptual judgments were either made by watching the visual scene without moving or made simultaneously to the reaching tasks, such that the perceptual processing stream could also profit from the specialized processing of reafferent information in the latter case. Our results demonstrate that perceptual judgments in the heavily cluttered visual environment were improved when performed based on reafferent information. Even in this case, however, the filtering capability of the perceptual processing stream suffered more from the increasing complexity of the visual scene than the motor processing stream. These findings suggest partly shared and partly segregated processing of reafferent information for vision for motor control versus vision for perception.

  1. A process framework for information security management

    Directory of Open Access Journals (Sweden)

    Knut Haufe

    2016-01-01

    Full Text Available Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach for establishing, implementing, operating, monitoring, reviewing, maintaining and improving an organization's information security. Key elements of the operation of an ISMS are ISMS processes. However, and in spite of its importance, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analyses of information security investments, regarding both single measures protecting information and ISMS processes, are not the focus of current research, which is mostly concerned with economics. This article aims to fill this research gap by proposing such an ISMS process framework as its main contribution, based on a set of agreed-upon ISMS processes in existing standards such as the ISO 27000 series, COBIT and ITIL. Within the framework, the identified processes are described and their interactions and interfaces are specified. The framework helps to focus on the operation of the ISMS instead of on measures and controls. By this, as a main finding, the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS are strengthened.

  2. Information processing among high-performance managers

    Directory of Open Access Journals (Sweden)

    S.C. Garcia-Santos

    2010-01-01

    Full Text Available The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework considers three models: the Theory of Managerial Roles of Henry Mintzberg, the Theory of Information Processing, and the Process Model of Response to the Rorschach by John Exner. The participants were evaluated by the Rorschach method. The results show that these managers are able to collect data, evaluate them and establish rankings properly. At the same time, they are capable of being objective and accurate in problem assessment. This information processing style permits an interpretation of the surrounding world on the basis of a very personal and characteristic processing manner, or cognitive style.

  3. Antecedents and Consequences of Consumer's Response to Health Information Complexity

    DEFF Research Database (Denmark)

    Hansen, Torben; Uth Thomsen, Thyra; Beckmann, Suzanne C.

    2013-01-01

    This study develops and empirically tests a model for understanding food consumers' health information seeking behaviour. Data were collected from 504 food consumers using a nationally representative consumer panel. The obtained Lisrel results suggest that consumers' product-specific health information seeking is positively affected by general food involvement and by usability of product-specific health information. Moreover, product-specific health information seeking and product-specific health information complexity are both positively related to post-purchase health-related dissonance. This link between information complexity and post-purchase dissonance has implications for marketers of food products, since our results suggest that consumers might avoid purchasing the same food item again if post-purchase dissonance is experienced.

  4. Information Center Complex publications and presentations, 1971-1980

    International Nuclear Information System (INIS)

    Gill, A.B.; Hawthorne, S.W.

    1981-08-01

    This indexed bibliography lists publications and presentations of the Information Center Complex, Information Division, Oak Ridge National Laboratory, from 1971 through 1980. The 659 entries cover such topics as toxicology, air and water pollution, management and transportation of hazardous wastes, energy resources and conservation, and information science. Publications range in length from 1 page to 3502 pages and include topical reports, books, journal articles, fact sheets, and newsletters. Author, title, and group indexes are provided. Annual updates are planned

  5. Information Center Complex publications and presentations, 1971-1980

    Energy Technology Data Exchange (ETDEWEB)

    Gill, A.B.; Hawthorne, S.W.

    1981-08-01

    This indexed bibliography lists publications and presentations of the Information Center Complex, Information Division, Oak Ridge National Laboratory, from 1971 through 1980. The 659 entries cover such topics as toxicology, air and water pollution, management and transportation of hazardous wastes, energy resources and conservation, and information science. Publications range in length from 1 page to 3502 pages and include topical reports, books, journal articles, fact sheets, and newsletters. Author, title, and group indexes are provided. Annual updates are planned.

  6. Information technology, knowledge processes, and innovation success

    NARCIS (Netherlands)

    Song, X.M.; Zang, F.; Bij, van der J.D.; Weggeman, M.C.D.P.

    2001-01-01

    Despite the obvious linkage between information technologies (IT) and knowledge processes and the apparent strategic importance of both, little research has been done to explicitly examine how, if at all, IT and knowledge processes affect firm outcomes. The purpose of this study is to bridge this

  7. ENERGETIC CHARGE OF AN INFORMATION PROCESS

    Directory of Open Access Journals (Sweden)

    Popova T.M.

    2009-12-01

    Full Text Available The main laws of technical thermodynamics are universal and can be applied to processes other than thermodynamic ones. The article presents the results of a comparison of the peculiarities of irreversible informational and thermodynamic processes, and a new term, “infopy”, is introduced. A more precise definition of “infopy” as an energetic charge is given in the article.

  8. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    Full Text Available The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not attempt to simplify it at all. This article presents the author's own methodology for managing complexity, which can complement any other methodology and helps limit the growth of complexity. It introduces a definition and metric of complexity: the sum of the entities of the individual UML models of the given system, selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology for managing information system complexity and to verify it in practice on a real-life SAP implementation project.
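    The metric described in this abstract, complexity as the total entity count across a system's UML models, can be expressed directly. In this sketch the model names and entity counts are hypothetical placeholders, not figures from the article:

    ```python
    # Hypothetical inventory of UML models for one system; each value is the
    # number of entities (classes, actors, actions, ...) in that model.
    uml_models = {
        "class diagram": 42,      # classes, interfaces, associations
        "use-case diagram": 17,   # actors and use cases
        "activity diagram": 23,   # actions, decisions, flows
    }

    def system_complexity(models: dict[str, int]) -> int:
        """Complexity in the article's sense: the sum of entities across the
        UML models selected to describe the system's content dimensions."""
        return sum(models.values())

    print(system_complexity(uml_models))  # 82
    ```

    Tracking this single number across project phases is what allows the methodology to flag, and then limit, growth in system complexity.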

  9. Integrating complex business processes for knowledge-driven clinical decision support systems.

    Science.gov (United States)

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  10. Quantum information processing with atoms and photons

    International Nuclear Information System (INIS)

    Monroe, C.

    2003-01-01

    Quantum information processors exploit the quantum features of superposition and entanglement for applications not possible in classical devices, offering the potential for significant improvements in the communication and processing of information. Experimental realization of large-scale quantum information processors remains a long-term vision, as the required nearly pure quantum behaviour is observed only in exotic hardware such as individual laser-cooled atoms and isolated photons. But recent theoretical and experimental advances suggest that cold atoms and individual photons may lead the way towards bigger and better quantum information processors, effectively building mesoscopic versions of Schroedinger's cat from the bottom up. (author)

  11. Information interfaces for process plant diagnosis

    International Nuclear Information System (INIS)

    Lind, M.

    1984-02-01

    The paper describes a systematic approach to the design of information interfaces for operator support in diagnosing complex system faults. The need to interpret primary measured plant variables within the framework of different system representations, organized into an abstraction hierarchy, is identified from an analysis of the problem of diagnosing complex systems. A formalized approach to the modelling of production systems, called Multilevel Flow Modelling, is described. An MFM model specifies plant control requirements and the associated need for plant information, and provides a consistent context for the interpretation of real-time plant signals in the diagnosis of malfunctions. The use of MFM models as a basis for functional design of the plant instrumentation system is outlined, and the use of Knowledge-Based (Expert) Systems for the design of man-machine interfaces is mentioned. Such systems would allow active user participation in diagnosis and thus provide the basis for cooperative problem solving. 14 refs. (author)

  12. Information structure and reference tracking in complex sentences

    CERN Document Server

    Gijn, Rik van; Matic, Dejan

    2014-01-01

    This paper discusses argument marking and reference tracking in Mekens complex clauses and their correlation to information structure. The distribution of pronominal arguments in Mekens simple clauses follows an absolutive pattern with main verbs. Complex clauses maintain the morphological absolutive argument marking, but show a nominative pattern with respect to argument reference tracking, since transitive and intransitive subjects function as syntactic pivots. The language extends the use of argument-marking verb morphology to control the reference of discourse participants across clauses.

  13. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    This book treats the mathematics of many important areas in digital information processing, covering in a unified presentation five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, and Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  14. PHYSICAL RESOURCES OF INFORMATION PROCESSES AND TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Mikhail O. Kolbanev

    2014-11-01

    Full Text Available Subject of study. The paper describes basic information technologies for automating the information processes of data storage, distribution and processing in terms of the required physical resources. It is shown that studying these processes only through the traditional objectives of modern computer science, such as the ability to transfer knowledge, degree of automation, information security, coding, reliability, and others, is not enough. The reasons are, on the one hand, the increase in the volume and intensity of information exchange in the subject areas of human activity and, on the other hand, the approach to the efficiency limit of information systems based on semiconductor technologies. The creation of technologies that not only support information interaction but also consume a rational amount of physical resources has become an actual problem of modern engineering development. Thus, the basic information technologies for storage, distribution and processing of information that support interaction between people are the object of study, and the physical temporal, spatial and energy resources required to implement these technologies are the subject of study. Approaches. An attempt is made to enlarge the possibilities of the traditional cybernetics methodology, which replaces consideration of the material component of information with a search over the states of information objects, by explicitly taking into account the amount of physical resources required for changes in the states of information media. Purpose of study. The paper develops a common approach to the comparison and subsequent selection of basic information technologies for storage, distribution and processing of data, taking into account not only the requirements for the quality of information exchange in a particular subject area and the degree of technology application, but also the amounts of physical resources consumed. Main findings. 
Classification of resources

  15. Academic writing development: a complex, dynamic process

    NARCIS (Netherlands)

    Penris, Wouter; Verspoor, Marjolijn; Pfenniger, Simone; Navracsics, Judit

    2017-01-01

    Traditionally we look at learning outcomes by examining single outcomes. A new and future direction is to look at the actual process of development. Imagine an advanced, 17-year-old student of English (L2) who has just finished secondary school in the Netherlands and wants to become an English

  16. APACS: Monitoring and diagnosis of complex processes

    International Nuclear Information System (INIS)

    Kramer, B.M.; Mylopoulos, J.; Cheng Wang

    1994-01-01

    This paper describes APACS - a new framework for a system that detects, predicts and identifies faults in industrial processes. The APACS frameworks provides a structure in which a heterogeneous set of programs can share a common view of the problem and a common model of the domain. (author). 17 refs, 2 figs

  17. Process Information System - Nuclear Power Plant Krsko

    International Nuclear Information System (INIS)

    Mandic, D.; Barbic, B.; Linke, B.; Colak, I.

    1998-01-01

    The original NEK design used several Process Computer Systems (PCS) for both process control and process supervision. The PCS were built by different manufacturers around different hardware and software platforms. Operational experience and new regulatory requirements imposed new technical and functional requirements on the PCS. Requirements such as: - acquisition of new signals from the technological processes and environment - implementation of new application programs - significant improvement of the MMI (Man-Machine Interface) - process data transfer to locations other than the Main Control Room (MCR) - process data archiving and the capability to retrieve the same data for future analysis could not be implemented within the old systems. In order to satisfy the new requirements, NEK decided to build a new Process Information System (PIS). During the design and construction of PIS Project Phase I, in addition to the main foreign contractor, there was significant participation of local architect-engineering and construction companies. This paper presents the experience of NEK and its local partners. (author)

  18. Informational and Causal Architecture of Discrete-Time Renewal Processes

    Directory of Open Access Journals (Sweden)

    Sarah E. Marzen

    2015-07-01

    Full Text Available Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states, calculate the historical memory capacity required to store those states (statistical complexity, delineate what information is predictable (excess entropy, and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use the resulting formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model, but with an infinite-state ϵ-machine presentation. All in all, the results lay the groundwork for analyzing more complex processes with infinite statistical complexity and infinite excess entropy.
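The quantities named in this abstract can be made concrete for an interevent distribution of finite support. The sketch below is an illustration of the general definitions, not the paper's own code, and the example distribution is invented: counts since the last event serve as causal states, statistical complexity is the entropy of their stationary distribution, and the entropy rate is the interevent entropy divided by the mean interevent time.

```python
import math

def renewal_stats(pmf):
    """pmf maps interevent time n >= 1 to probability; finite support assumed."""
    N = max(pmf)
    # Survival function w(n) = P(T > n); mean interevent time mu = sum_n w(n).
    w = [sum(p for t, p in pmf.items() if t > n) for n in range(N)]
    mu = sum(w)  # equals E[T] for integer-valued T >= 1
    # Stationary distribution over causal states (counts since last event).
    pi = [wn / mu for wn in w]
    c_mu = -sum(p * math.log2(p) for p in pi if p > 0)   # statistical complexity
    h_T = -sum(p * math.log2(p) for p in pmf.values() if p > 0)
    h_mu = h_T / mu                                      # entropy rate, bits/step
    return mu, c_mu, h_mu

mu, c_mu, h_mu = renewal_stats({1: 0.5, 2: 0.3, 3: 0.2})
```

For this toy distribution the mean interevent time is 1.7 steps and the statistical complexity is about 1.33 bits, stored over three causal states.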

  19. The minimal work cost of information processing

    Science.gov (United States)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
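The bound quoted above, minimal work equal to the entropy of the discarded information conditional on the output (times kT ln 2), can be checked on a toy logical process. The sketch below is an illustrative calculation, not the paper's derivation; the process (erasing two uniform input bits down to their AND) and the temperature of 300 K are assumptions.

```python
import math
from collections import defaultdict

k_B = 1.380649e-23  # Boltzmann constant, J/K

def conditional_entropy(joint):
    """H(X|Y) in bits for a joint distribution given as {(x, y): p}."""
    p_y = defaultdict(float)
    for (x, y), p in joint.items():
        p_y[y] += p
    return -sum(p * math.log2(p / p_y[y]) for (x, y), p in joint.items() if p > 0)

# Logical process: two uniform input bits -> output = AND of the bits.
joint = {((a, b), a & b): 0.25 for a in (0, 1) for b in (0, 1)}
h_discarded = conditional_entropy(joint)       # bits of discarded information
w_min = k_B * 300 * math.log(2) * h_discarded  # Landauer-type bound, joules
```

Here the output Y=0 leaves three equally likely inputs unresolved, so H(X|Y) = 0.75·log2(3) ≈ 1.19 bits, and the minimal work is correspondingly larger than for erasing a single bit.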

  20. [Emerging infectious diseases: complex, unpredictable processes].

    Science.gov (United States)

    Guégan, Jean-François

    2016-01-01

    Drawing on a double approach, first empirical, then theoretical and comparative, and illustrated by the example of Buruli ulcer and its mycobacterial agent Mycobacterium ulcerans, on which I focused my research over the last ten years by studying the determinants and factors of emerging infectious and parasitic diseases, this paper presents the complexity of the events that explain disease emergence. The cascade of events occurring at various spatiotemporal scales and levels of biological organization that leads to the numerous observed emergences now requires better accounting for the interactions between host(s), pathogen(s) and the environment, including the behavior of both individuals and populations. Many research studies on emerging infectious diseases describe microbial hazard rather than infectious disease risk, the latter resulting from the confrontation between an association of threatening phenomena, or hazards, and a susceptible population. Beyond this, the theme of emerging infectious diseases and its links with global environmental and societal changes leads us to reconsider some well-established knowledge in infectiology and parasitology. © Société de Biologie, 2017.

  1. A framework for information warehouse development processes

    OpenAIRE

    Holten, Roland

    1999-01-01

    Since the terms Data Warehouse and On-Line Analytical Processing were proposed by Inmon and by Codd, Codd and Salley, respectively, the traditional idea of creating information systems in support of management's decisions became interesting again in theory and practice. Today information warehousing is a strategic market for any database systems vendor. Nevertheless, the theoretical discussions of this topic go back to the early years of the 20th century as far as management science and accounting the...

  2. Informational Entropy and Bridge Scour Estimation under Complex Hydraulic Scenarios

    Science.gov (United States)

    Pizarro, Alonso; Link, Oscar; Fiorentino, Mauro; Samela, Caterina; Manfreda, Salvatore

    2017-04-01

    Bridges are important for society because they allow social, cultural and economic connectivity. Flood events can compromise the safety of bridge piers, up to complete collapse. The bridge scour phenomenon has been described by empirical formulae deduced from hydraulic laboratory experiments. The range of applicability of such models is restricted by the specific hydraulic conditions or flume geometry used for their derivation (e.g., water depth, mean flow velocity, pier diameter and sediment properties). We seek to identify a general formulation able to capture the main dynamics of the process, in order to cover a wide range of hydraulic and geometric configurations and to extend our analysis to different contexts. Therefore, exploiting the Principle of Maximum Entropy (POME) and applying it to the recently proposed dimensionless effective flow work, W*, we derived a simple model characterized by only one parameter. The proposed Bridge Scour Entropic (BRISENT) model shows good performance under complex hydraulic conditions as well as under steady-state flow, capturing the evolution of scour in several hydraulic configurations despite containing only one parameter. Furthermore, results show that the model parameter is controlled by the geometric configuration of the experiment, which offers a possible strategy for a priori model parameter calibration. The BRISENT model is thus a good candidate for estimating time-dependent scour depth under complex hydraulic scenarios, and the authors are keen to apply this idea to describing scour behavior during a real flood event. Keywords: Informational entropy, Sediment transport, Bridge pier scour, Effective flow work.

  3. Vulnerability of complex networks under intentional attack with incomplete information

    International Nuclear Information System (INIS)

    Wu, J; Deng, H Z; Tan, Y J; Zhu, D Z

    2007-01-01

    We study the vulnerability of complex networks under intentional attack with incomplete information, which means that one can only preferentially attack the most important nodes among a local region of a network. The known random failure and the intentional attack are two extreme cases of our study. Using the generating function method, we derive the exact value of the critical removal fraction f c of nodes for the disintegration of networks and the size of the giant component. To validate our model and method, we perform simulations of intentional attack with incomplete information in scale-free networks. We show that the attack information has an important effect on the vulnerability of scale-free networks. We also demonstrate that hiding a fraction of the nodes information is a cost-efficient strategy for enhancing the robustness of complex networks
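The effect described, attack effectiveness depending on how much degree information the attacker sees, can be reproduced numerically. The sketch below is a toy simulation in the spirit of the paper's setup, not its generating-function calculation; the network size, sample fractions and seeds are arbitrary. It grows a preferential-attachment network, repeatedly removes the highest-degree node among a known sample of survivors, and compares the remaining giant component under full versus almost no information.

```python
import random
from collections import deque

def preferential_attachment(n, m, seed=1):
    """Grow a scale-free-ish graph: each new node links to m existing nodes."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    repeated = list(range(m))  # nodes repeated proportionally to degree
    for v in range(m, n):
        targets = {rng.choice(repeated) for _ in range(m)}
        for t in targets:
            adj[v].add(t)
            adj[t].add(v)
            repeated += [t, v]
    return adj

def giant_component(adj, removed):
    """Size of the largest connected component after removing `removed`."""
    seen, best = set(removed), 0
    for s in adj:
        if s in seen:
            continue
        queue, size = deque([s]), 0
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

def informed_attack(adj, n_remove, known_fraction, seed=2):
    """Remove the highest-degree node among a random sample of survivors."""
    rng = random.Random(seed)
    removed = set()
    for _ in range(n_remove):
        alive = [v for v in adj if v not in removed]
        sample = rng.sample(alive, max(1, int(known_fraction * len(alive))))
        deg = lambda v: sum(1 for w in adj[v] if w not in removed)
        removed.add(max(sample, key=deg))
    return removed

net = preferential_attachment(300, 2)
full = giant_component(net, informed_attack(net, 60, 1.0))    # complete information
blind = giant_component(net, informed_attack(net, 60, 0.01))  # near-random failure
```

With complete degree information the attack dismantles the hubs and leaves a much smaller giant component than the nearly blind attack, which is the qualitative effect the paper quantifies analytically.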

  4. An analytical approach to customer requirement information processing

    Science.gov (United States)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  5. EFFECTIVE COMPLEX PROCESSING OF RAW TOMATOES

    Directory of Open Access Journals (Sweden)

    AIDA M. GADZHIEVA

    2018-03-01

    Full Text Available Tomatoes grown in the central and southern parts of the country, which contain 5 - 6 % of solids, including 0.13 % of pectin, 0.86 % of fat, 0.5 % of organic acids, 0.5 % minerals, etc. are used as research material. These tomatoes, grown in the mountains, on soils with high salinity, contain high amounts of valuable components and have long term preservation. For the extraction of valuable components from dried tomato pomace, the CO2 extraction method is applied. The technological and environmental feasibility of graded tomato drying in the atmosphere of an inert gas and in a solar drier is evaluated; the scheme of dried tomatoes production is improved; a system for tomato pomace drying is developed; a scheme of tomato powder production from pulp, skin and seeds is developed. The combined method of tomato pomace drying involves the simultaneous use of electromagnetic field of low and ultra-high frequency and blowing hot nitrogen on the product surface. Conducting the drying process in the atmosphere of nitrogen intensifies the process of removing moisture from tomatoes. The expediency of using tomato powder as an enriching additive is proved. Based on the study of the chemical composition of the tomato powder made from the Dagestan varieties, and on the organoleptic evaluation and physicochemical analysis of finished products, we prove the best degree of recoverability of tomato powder in the production of reconstituted juice and tomato beverages.

  6. COMPLEX PROCESSING TECHNOLOGY OF TOMATO RAW MATERIALS

    Directory of Open Access Journals (Sweden)

    A. M. Gadzhieva

    2015-01-01

    Full Text Available Tomatoes grown in the central and southern parts of the country, which contain 5-6 % of solids, including 0.13 % of pectin, 0.86 % of fat, 0.5 % of organic acids, 0.5 % of minerals, etc., were used as the subject of research. These tomatoes, grown in the mountains on soils with high salinity, contain high amounts of valuable components and have long-term preservation. For the extraction of valuable components from dried tomato pomace, the CO2 extraction method was applied. The technological and environmental feasibility of staged tomato drying in an inert-gas atmosphere in a solar dry kiln was evaluated; the production scheme of dried tomatoes is improved; a system for tomato pomace drying is developed; a production scheme of powders from the pulp, skin and seeds of tomatoes is developed. The combined method of tomato pomace drying involves the simultaneous use of an electromagnetic field of low and ultra-high frequency and blowing hot nitrogen over the product surface. Conducting the drying process in an inert nitrogen atmosphere intensified the process of moisture removal from tomatoes. The expediency of using tomato powder as an enriching additive was proved. Based on the study of the chemical composition of the tomato powder made from Dagestan varieties of tomatoes, and on the organoleptic evaluation and physico-chemical studies of finished products, we have proved the best degree of recoverability of tomato powder during the production of reconstituted juice and tomato beverages.

  7. Markovian Processes for Quantitative Information Leakage

    DEFF Research Database (Denmark)

    Biondi, Fabrizio

    Quantification of information leakage is a successful approach for evaluating the security of a system. It models the system to be analyzed as a channel with the secret as the input and an output observable by the attacker, and applies information theory to quantify the amount of information transmitted through such a channel, thus effectively quantifying how many bits of the secret can be inferred by the attacker by analyzing the system's output. Channels are usually encoded as matrices of conditional probabilities, known as channel matrices. Such matrices grow exponentially... We show how to model deterministic and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some...
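The channel-matrix view of leakage described in this abstract is easy to state concretely. The sketch below is a generic illustration, not the QUAIL tool's implementation; the password-check scenario is an assumed example. A system is encoded as conditional probabilities P(output | secret), and leakage is the mutual information between a uniform secret and the observable output.

```python
import math
from collections import defaultdict

def leakage_bits(prior, channel):
    """Mutual information I(S;O) for prior {s: p} and channel {s: {o: p}}."""
    p_o = defaultdict(float)
    for s, ps in prior.items():
        for o, po_s in channel[s].items():
            p_o[o] += ps * po_s
    mi = 0.0
    for s, ps in prior.items():
        for o, po_s in channel[s].items():
            if ps > 0 and po_s > 0:
                mi += ps * po_s * math.log2(po_s / p_o[o])
    return mi

# Toy system: secret uniform over 4 values; the attacker observes only
# whether the secret equals one fixed guess (a 1-bit deterministic channel).
secrets = range(4)
prior = {s: 0.25 for s in secrets}
channel = {s: {1 if s == 0 else 0: 1.0} for s in secrets}
bits = leakage_bits(prior, channel)   # binary entropy of 1/4, about 0.811 bits
```

So a single equality check leaks about 0.81 of the 2 secret bits, which is the kind of quantity the channel-matrix formalism computes systematically.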

  8. Optimal Information Processing in Biochemical Networks

    Science.gov (United States)

    Wiggins, Chris

    2012-02-01

    A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrating cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.

  9. Scalable Networked Information Processing Environment (SNIPE)

    Energy Technology Data Exchange (ETDEWEB)

    Fagg, G.E.; Moore, K. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]|[Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Geist, A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  10. Compliance with Environmental Regulations through Complex Geo-Event Processing

    Directory of Open Access Journals (Sweden)

    Federico Herrera

    2017-11-01

    Full Text Available In a context of e-government, there are usually regulatory compliance requirements that support systems must monitor, control and enforce. These requirements may come from environmental laws and regulations that aim to protect the natural environment and mitigate the effects of pollution on human health and ecosystems. Monitoring compliance with these requirements involves processing a large volume of data from different sources, which is a major challenge. This volume is also increased with data coming from autonomous sensors (e.g. reporting carbon emission in protected areas and from citizens providing information (e.g. illegal dumping in a voluntary way. Complex Event Processing (CEP technologies allow processing large amount of event data and detecting patterns from them. However, they do not provide native support for the geographic dimension of events which is essential for monitoring requirements which apply to specific geographic areas. This paper proposes a geospatial extension for CEP that allows monitoring environmental requirements considering the geographic location of the processed data. We extend an existing platform-independent, model-driven approach for CEP adding the geographic location to events and specifying patterns using geographic operators. The use and technical feasibility of the proposal is shown through the development of a case study and the implementation of a prototype.
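The core idea, event-pattern matching that also tests the geographic location of each event, can be sketched in a few lines. The example below is hypothetical: the event fields, the protected-area polygon and the emission threshold are all invented for illustration, and real CEP engines evaluate such patterns incrementally over streams rather than on a list. It keeps only carbon-emission events that exceed a limit and fall inside a polygon, using the standard ray-casting point-in-polygon test.

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test; polygon is a list of (lon, lat) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge crosses the horizontal ray
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical protected area and event stream.
protected_area = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
events = [
    {"type": "co2", "value": 450.0, "lon": 5.0, "lat": 5.0},   # inside, over limit
    {"type": "co2", "value": 450.0, "lon": 20.0, "lat": 5.0},  # outside the area
    {"type": "co2", "value": 100.0, "lon": 5.0, "lat": 6.0},   # inside, under limit
]
alerts = [e for e in events
          if e["type"] == "co2" and e["value"] > 400.0
          and point_in_polygon(e["lon"], e["lat"], protected_area)]
```

Only the first event matches the combined attribute-and-location pattern; a geospatial CEP extension contributes exactly this kind of geographic operator to the pattern language.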

  11. SUPPLEMENTARY INFORMATION: A novel zinc(II) complex ...

    Indian Academy of Sciences (India)

    Yilmaz, Hakan; Andac, Omer

    SUPPLEMENTARY INFORMATION. A novel zinc(II) complex containing square pyramidal, octahedral and tetrahedral geometries on the same polymeric chain constructed from pyrazine-2,3-dicarboxylic acid and 1-vinylimidazole. HAKAN YILMAZ* and OMER ANDAC. Department of Chemistry, Ondokuz Mayis University, ...

  12. Information Processing in Auto-regulated Systems

    Directory of Open Access Journals (Sweden)

    Karl Javorszky

    2003-06-01

    Full Text Available Abstract: We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection and once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out what is the case from among the alternatives which could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. This new approach allows a mathematical treatment of some poignant biomathematical problems. The concepts presented in this treatise may also have relevance and applications within the fields of information processing and the theory of language.

  13. Springfield Processing Plant (SPP) Facility Information

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice; Torres, Teresa M.

    2012-10-01

    The Springfield Processing Plant is a hypothetical facility. It has been constructed for use in training workshops. Information is provided about the facility and its surroundings, particularly security-related aspects such as target identification, threat data, entry control, and response force data.

  14. Motivated information processing and group decision refusal

    NARCIS (Netherlands)

    Nijstad, Bernard A.; Oltmanns, Jan

    Group decision making has attracted much scientific interest, but few studies have investigated group decisions that do not get made. Based on the Motivated Information Processing in Groups model, this study analysed the effect of epistemic motivation (low vs. high) and social motivation (proself

  15. Multidimensional biochemical information processing of dynamical patterns.

    Science.gov (United States)

    Hasegawa, Yoshihiko

    2018-02-01

    Cells receive signaling molecules by receptors and relay information via sensory networks so that they can respond properly depending on the type of signal. Recent studies have shown that cells can extract multidimensional information from dynamical concentration patterns of signaling molecules. We herein study how biochemical systems can process multidimensional information embedded in dynamical patterns. We model the decoding networks by linear response functions, and optimize the functions with the calculus of variations to maximize the mutual information between patterns and output. We find that, when the noise intensity is lower, decoders with different linear response functions, i.e., distinct decoders, can extract much information. However, when the noise intensity is higher, distinct decoders do not provide the maximum amount of information. This indicates that, when transmitting information by dynamical patterns, embedding information in multiple patterns is not optimal when the noise intensity is very large. Furthermore, we explore the biochemical implementations of these decoders using control theory and demonstrate that these decoders can be implemented biochemically through the modification of cascade-type networks, which are prevalent in actual signaling pathways.

  16. Information processing in the vertebrate habenula.

    Science.gov (United States)

    Fore, Stephanie; Palumbo, Fabrizio; Pelgrims, Robbrecht; Yaksi, Emre

    2018-06-01

    The habenula is a brain region that has gained increasing popularity over the recent years due to its role in processing value-related and experience-dependent information with a strong link to depression, addiction, sleep and social interactions. This small diencephalic nucleus is proposed to act as a multimodal hub or a switchboard, where inputs from different brain regions converge. These diverse inputs to the habenula carry information about the sensory world and the animal's internal state, such as reward expectation or mood. However, it is not clear how these diverse habenular inputs interact with each other and how such interactions contribute to the function of habenular circuits in regulating behavioral responses in various tasks and contexts. In this review, we aim to discuss how information processing in habenular circuits, can contribute to specific behavioral programs that are attributed to the habenula. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Testing an alternate informed consent process.

    Science.gov (United States)

    Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra

    2009-01-01

    One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.

  18. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-07-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
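The abstract's central claim, that total capacity is bounded by (and, with fading memory, equals) the number of linearly independent state variables, can be checked numerically on the smallest possible system. The sketch below is an illustration under assumed parameters, not the paper's code: a single leaky-integrator state driven by white noise has exactly one state variable, and the summed squared correlations with delayed inputs add up to about 1.

```python
import random

def memory_capacity(a=0.6, steps=20000, max_lag=30, seed=7):
    """Sum of squared correlations between state x_t and past inputs s_{t-k}."""
    rng = random.Random(seed)
    s = [rng.gauss(0.0, 1.0) for _ in range(steps)]
    x, xs = 0.0, []
    for u in s:
        x = a * x + u          # one-variable fading-memory system
        xs.append(x)

    def corr2(k):
        # Squared Pearson correlation between x_t and s_{t-k}.
        xv, sv = xs[k:], s[:steps - k] if k else s
        n = len(xv)
        mx, ms = sum(xv) / n, sum(sv) / n
        cov = sum((xi - mx) * (si - ms) for xi, si in zip(xv, sv))
        vx = sum((xi - mx) ** 2 for xi in xv)
        vs = sum((si - ms) ** 2 for si in sv)
        return cov * cov / (vx * vs)

    return sum(corr2(k) for k in range(max_lag + 1))

capacity = memory_capacity()   # theory: sums to 1, the number of state variables
```

Analytically, corr²(k) = a^{2k}(1 − a²) here, so the lag sum telescopes to exactly 1; the simulation recovers this within sampling error, matching the stated bound.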

  19. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory. PMID:22816038

  20. Simple, complex and hyper-complex understanding - enhanced sensitivity in observation of information

    DEFF Research Database (Denmark)

    Bering Keiding, Tina

    for construction and analysis of empirical information. A quick overview of empirical research drawing on Luhmann reveals a diverse complex of analytical strategies and empirical methods. Despite differences between strategies and methods, they have in common that understanding of uttered information is crucial in their production of empirically founded knowledge. However, research generally seems to pay more attention to the production of uttered information than to the selection of understanding. The aim of this contribution is to sketch out a suggestion for how selection of understanding can be systematized in order to produce enhanced transparency in selection of understanding as well as enhanced sensitivity and definition in depth. The contribution suggests that we distinguish between three types of understanding: simple, complex and hyper-complex understanding. Simple understanding is the simultaneous selection of understanding...

  1. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  2. Quantifying Complexity in Quantum Phase Transitions via Mutual Information Complex Networks.

    Science.gov (United States)

    Valdez, Marc Andrew; Jaschke, Daniel; Vargas, David L; Carr, Lincoln D

    2017-12-01

    We quantify the emergent complexity of quantum states near quantum critical points on regular 1D lattices, via complex network measures based on quantum mutual information as the adjacency matrix, in direct analogy to quantifying the complexity of electroencephalogram or functional magnetic resonance imaging measurements of the brain. Using matrix product state methods, we show that network density, clustering, disparity, and Pearson's correlation obtain the critical point for both quantum Ising and Bose-Hubbard models to a high degree of accuracy in finite-size scaling for three classes of quantum phase transitions, Z_{2}, mean field superfluid to Mott insulator, and a Berzinskii-Kosterlitz-Thouless crossover.
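The construction used here, pairwise mutual information as a weighted adjacency matrix with network measures computed on top, does not require a quantum simulation to demonstrate. The sketch below is a hedged classical stand-in: a binary Markov chain replaces the matrix-product-state data, and the flip probability and sample count are arbitrary. It estimates pairwise mutual informations from samples, computes a simple network density, and shows that nearest neighbours carry more mutual information than distant ones.

```python
import math, random
from collections import Counter

def sample_chain(n_sites, flip_p, n_samples, seed=3):
    """Binary Markov chain: each site copies its neighbour, flipping w.p. flip_p."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        bits = [rng.randint(0, 1)]
        for _ in range(n_sites - 1):
            bits.append(bits[-1] ^ (rng.random() < flip_p))
        samples.append(tuple(bits))
    return samples

def pairwise_mi(samples, i, j):
    """Plug-in estimate of I(X_i; X_j) in bits."""
    n = len(samples)
    pij = Counter((s[i], s[j]) for s in samples)
    pi = Counter(s[i] for s in samples)
    pj = Counter(s[j] for s in samples)
    return sum((c / n) * math.log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
               for (a, b), c in pij.items())

samples = sample_chain(4, 0.1, 20000)
mi = {(i, j): pairwise_mi(samples, i, j) for i in range(4) for j in range(i + 1, 4)}
density = sum(mi.values()) / len(mi)   # mean edge weight of the MI network
```

Correlations decay with distance along the chain, so the MI "network" is densest between neighbouring sites; near a quantum critical point the analogous weighted network changes structure, which is what the paper's measures detect.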

  4. Vision and visual information processing in cubozoans

    DEFF Research Database (Denmark)

    Bielecki, Jan

    relationship between acuity and light sensitivity. Animals have evolved a wide variety of solutions to this problem, such as folded membranes, to provide larger receptive surfaces, and lenses, to focus light onto the receptive membranes. On the neural capacity side, complex eyes demand huge processing networks...... animals in a wide range of behaviours. It is intuitive that a complex eye is energetically very costly, not only in components but also in neural involvement. The increasing behavioural demand added pressure on design specifications, and eye evolution is considered an optimization of the inverse...... fit their need. Visual neuroethology integrates optics, sensory equipment, neural networks and motor output to explain how animals can perform behaviour in response to a specific visual stimulus. In this doctoral thesis, I will elucidate the individual steps in a visual neuroethological pathway...

  5. Enabling Controlling Complex Networks with Local Topological Information.

    Science.gov (United States)

    Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene

    2018-03-15

    Complex networks characterize the nature of internal/external interactions in real-world systems, including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time with a suitable choice of inputs; and optimal control, which is a typical control approach to minimize the cost of driving the network to a predefined state with a given number of control inputs. For large complex networks without global information about network topology, both problems remain essentially open. Here we combine graph theory and control theory to tackle the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, in which every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
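
    The matching view of structural controllability can be illustrated with a small centralized sketch: in the maximum-matching formulation (Liu et al. style), nodes left unmatched by a maximum matching over the directed edges must receive direct control inputs. This toy uses Kuhn's augmenting-path algorithm and is not the paper's distributed local-game method.

```python
def max_matching(out_edges, nodes):
    """Maximum matching of the bipartite (out-copy, in-copy) graph of a digraph."""
    match = {}                            # matched right node -> its left node
    def augment(u, seen):
        for v in out_edges.get(u, ()):    # edge u -> v in the digraph
            if v in seen:
                continue
            seen.add(v)
            if v not in match or augment(match[v], seen):
                match[v] = u
                return True
        return False
    for u in nodes:
        augment(u, set())
    return match

nodes = [1, 2, 3, 4]
edges = {1: [2, 3], 2: [4], 3: [4]}       # a small directed network
match = max_matching(edges, nodes)
drivers = max(len(nodes) - len(match), 1) # unmatched nodes need direct inputs
print(drivers)  # 2
```

    Here nodes 2 and 4 can be matched, so two of the four nodes remain unmatched and must be driven directly.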

  6. STAR-GENERIS - a software package for information processing

    International Nuclear Information System (INIS)

    Felkel, L.

    1985-01-01

    Man-machine-communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants are, e.g., alarm reduction, disturbance analysis and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering efforts. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates specification and implementation of the engineering know-how necessary for sophisticated operator aids. (orig./HP) [de

  7. Expectation, information processing, and subjective duration.

    Science.gov (United States)

    Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth

    2018-01-01

    In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing, that is, whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.

  8. Selected Topics on Managing Complexity and Information Systems Engineering: Editorial Introduction to Issue 8 of CSIMQ

    Directory of Open Access Journals (Sweden)

    Peter Forbrig

    2016-10-01

    Full Text Available Business process models greatly contribute to analyzing and understanding the activities of enterprises. However, it is still a challenge to cope with the complexity of systems specifications and their requirements. This issue of the journal Complex Systems Informatics and Modeling Quarterly (CSIMQ) presents papers that discuss topics on managing complexity and information systems engineering. The papers are extended versions of selected papers from the workshop on Continuous Requirements Engineering held at the requirements engineering conference REFSQ 2016 in Gothenburg, the workshop on Managed Complexity held at the business informatics conference BIR 2016 in Prague, and the CAiSE 2016 Forum held in Ljubljana.

  9. Disjunctive Information Flow for Communicating Processes

    DEFF Research Database (Denmark)

    Li, Ximeng; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    The security validation of practical computer systems calls for the ability to specify and verify information flow policies that are dependent on data content. Such policies play an important role in concurrent, communicating systems: consider a scenario where messages are sent to different...... processes according to their tagging. We devise a security type system that enforces content-dependent information flow policies in the presence of communication and concurrency. The type system soundly guarantees a compositional noninterference property. All theoretical results have been formally proved...

  10. Influence Processes for Information Technology Acceptance

    DEFF Research Database (Denmark)

    Bhattacherjee, Anol; Sanford, Clive Carlton

    2006-01-01

    This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects are persistent over time. Drawing on the elaboration-likelihood model (ELM), we compared two...... alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were respectively operationalized using the argument quality and source credibility constructs, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We...... further examined how these influence processes were moderated by users' IT expertise and perceived job relevance and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern...

  11. Inclusive Education as Complex Process and Challenge for School System

    Directory of Open Access Journals (Sweden)

    Al-Khamisy Danuta

    2015-08-01

    Full Text Available Education may be considered as a number of processes, actions and effects affecting the human being, as the state or level of the results of these processes, or as the modification of the functions, institutions and social practice roles which, as a result of inclusion, become a new, integrated system. Thus it is a very complex process. Nowadays complexity appears to be one of the most significant terms both in science and in philosophy. It appears that, despite the search for simple rules, strategies and solutions, everything is still more complex. The environment is complex, as is the organism living in it and exploring it, and the exploration itself is a complex phenomenon, much more so than might initially seem.

  12. Visual Information Processing for Television and Telerobotics

    Science.gov (United States)

    Huck, Friedrich O. (Editor); Park, Stephen K. (Editor)

    1989-01-01

    This publication is a compilation of the papers presented at the NASA conference on Visual Information Processing for Television and Telerobotics. The conference was held at the Williamsburg Hilton, Williamsburg, Virginia on May 10 to 12, 1989. The conference was sponsored jointly by NASA Offices of Aeronautics and Space Technology (OAST) and Space Science and Applications (OSSA) and the NASA Langley Research Center. The presentations were grouped into three sessions: Image Gathering, Coding, and Advanced Concepts; Systems; and Technologies. The program was organized to provide a forum in which researchers from industry, universities, and government could be brought together to discuss the state of knowledge in image gathering, coding, and processing methods.

  13. Quantum information processing with trapped ions

    International Nuclear Information System (INIS)

    Haeffner, H.; Haensel, W.; Rapol, U.; Koerber, T.; Benhelm, J.; Riebe, M.; Chek-al-Kar, D.; Schmidt-Kaler, F.; Becher, C.; Roos, C.; Blatt, R.

    2005-01-01

    Single Ca+ ions and crystals of Ca+ ions are confined in a linear Paul trap and are investigated for quantum information processing. Here we report on recent experimental advancements towards a quantum computer with such a system. Laser-cooled trapped ions are ideally suited systems for the investigation and implementation of quantum information processing as one can gain almost complete control over their internal and external degrees of freedom. The combination of a Paul type ion trap with laser cooling leads to unique properties of trapped cold ions, such as control of the motional state down to the zero-point of the trapping potential, a high degree of isolation from the environment and thus a very long time available for manipulations and interactions at the quantum level. The very same properties make single trapped atoms and ions well suited for storing quantum information in long lived internal states, e.g. by encoding a quantum bit (qubit) of information within the coherent superposition of the S1/2 ground state and the metastable D5/2 excited state of Ca+. Recently we have achieved the implementation of simple algorithms with up to 3 qubits on an ion-trap quantum computer. We will report on methods to implement single qubit rotations, the realization of a two-qubit universal quantum gate (Cirac-Zoller CNOT-gate), the deterministic generation of multi-particle entangled states (GHZ- and W-states), their full tomographic reconstruction, the realization of deterministic quantum teleportation, its quantum process tomography and the encoding of quantum information in decoherence-free subspaces with coherence times exceeding 20 seconds. (author)
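
    The multi-particle entangled states mentioned above can be illustrated in plain circuit algebra. This sketch builds the three-qubit GHZ state (|000> + |111>)/sqrt(2) from a Hadamard and two CNOTs; the ion-trap experiments realize equivalent operations with laser pulses via the Cirac-Zoller scheme, so the matrix construction here is purely illustrative.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)

def cnot(control, target, n):
    # n-qubit CNOT as a sum of two projector terms on the control qubit
    P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    def chain(ops):
        m = np.array([[1.0]])
        for op in ops:
            m = np.kron(m, op)
        return m
    term0 = chain([P0 if q == control else I2 for q in range(n)])
    term1 = chain([P1 if q == control else (X if q == target else I2)
                   for q in range(n)])
    return term0 + term1

psi = np.zeros(8)
psi[0] = 1.0                                   # start in |000>
psi = cnot(1, 2, 3) @ cnot(0, 1, 3) @ np.kron(H, np.eye(4)) @ psi
print(np.round(psi, 3))                        # amplitude ~0.707 on |000> and |111>
```

    Measuring any one qubit of this state collapses the other two, which is the resource exploited in the teleportation and dense-coding experiments cited in these records.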

  14. Complexity, Methodology and Method: Crafting a Critical Process of Research

    Science.gov (United States)

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  15. Quantum Information Processing with Trapped Ions

    International Nuclear Information System (INIS)

    Barrett, M.D.; Schaetz, T.; Chiaverini, J.; Leibfried, D.; Britton, J.; Itano, W.M.; Jost, J.D.; Langer, C.; Ozeri, R.; Wineland, D.J.; Knill, E.

    2005-01-01

    We summarize two experiments on the creation and manipulation of multi-particle entangled states of trapped atomic ions - quantum dense coding and quantum teleportation. The techniques used in these experiments constitute an important step toward performing large-scale quantum information processing. The techniques also have application in other areas of physics, providing improvement in quantum-limited measurement and fundamental tests of quantum mechanical principles, for example

  16. Human Information Processing and Supervisory Control.

    Science.gov (United States)

    1980-05-01

    errors (that is, of the output of the human operator). There is growing evidence (Senders, personal communication; Norman, personal communication...relates to the relative tendency to depend on sensory information or to be more analytic and independent. Norman (personal communication) has referred...decision process model. Ergonomics, 12, 543-557. Senders, J., Elkind, J., Grignetti, M., & Smallwood, R. 1966. An investigation of the visual sampling

  17. Aiming for knowledge information processing systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuchi, K

    1982-01-01

    The Fifth Generation Computer Project in Japan intends to develop a new generation of computers through extensive research in many areas. This paper discusses many research topics which the Japanese are hoping will lead to a radically new knowledge information processing system. Topics discussed include new computer architectures, programming styles, semantics of programming languages, relational databases, linguistics theory, artificial intelligence, functional images and inference systems.

  18. Processing Information in Quantum Decision Theory

    OpenAIRE

    Yukalov, V. I.; Sornette, D.

    2008-01-01

    A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention int...

  19. Manipulating cold atoms for quantum information processing

    International Nuclear Information System (INIS)

    Knight, P.

    2005-01-01

    Full text: I will describe how cold atoms can be manipulated to realize arrays of addressable qubits as prototype quantum registers, focussing on how atom chips can be used in combination with cavity QED techniques to form such an array. I will discuss how the array can be generated and steered using optical lattices and the Mott transition, and describe the sources of noise and how these place limits on the use of such chips in quantum information processing. (author)

  20. Processing of complex auditory patterns in musicians and nonmusicians.

    Science.gov (United States)

    Boh, Bastiaan; Herholz, Sibylle C; Lappe, Claudia; Pantev, Christo

    2011-01-01

    In the present study we investigated the capacity of the memory store underlying the mismatch negativity (MMN) response in musicians and nonmusicians for complex tone patterns. While previous studies have focused either on the kind of information that can be encoded or on the decay of the memory trace over time, we studied capacity in terms of the length of tone sequences, i.e., the number of individual tones that can be fully encoded and maintained. By means of magnetoencephalography (MEG) we recorded MMN responses to deviant tones that could occur at any position of standard tone patterns composed of four, six or eight tones during passive, distracted listening. Whereas there was a reliable MMN response to deviant tones in the four-tone pattern in both musicians and nonmusicians, only some individuals showed MMN responses to the longer patterns. This finding of a reliable capacity of the short-term auditory store underlying the MMN response is in line with estimates of a three to five item capacity of the short-term memory trace from behavioural studies, although pitch and contour complexity covaried with sequence length, which might have led to an understatement of the reported capacity. Whereas there was a tendency for an enhancement of the pattern MMN in musicians compared to nonmusicians, a strong advantage for musicians could be shown in an accompanying behavioural task of detecting the deviants while attending to the stimuli for all pattern lengths, indicating that long-term musical training differentially affects the memory capacity of auditory short-term memory for complex tone patterns with and without attention. Also, a left-hemispheric lateralization of MMN responses in the six-tone pattern suggests that additional networks that help structuring the patterns in the temporal domain might be recruited for demanding auditory processing in the pitch domain.

  2. Quantum information processing and nuclear magnetic resonance

    International Nuclear Information System (INIS)

    Cummins, H.K.

    2001-01-01

    Quantum computers are information processing devices which operate by and exploit the laws of quantum mechanics, potentially allowing them to solve problems which are intractable using classical computers. This dissertation considers the practical issues involved in one of the more successful implementations to date, nuclear magnetic resonance (NMR). Techniques for dealing with systematic errors are presented, and a quantum protocol is implemented. Chapter 1 is a brief introduction to quantum computation. The physical basis of its efficiency and issues involved in its implementation are discussed. NMR quantum information processing is reviewed in more detail in Chapter 2. Chapter 3 considers some of the errors that may be introduced in the process of implementing an algorithm, and high-level ways of reducing the impact of these errors by using composite rotations. Novel general expressions for stabilising composite rotations are presented in Chapter 4 and a new class of composite rotations, tailored composite rotations, presented in Chapter 5. Chapter 6 describes some of the advantages and pitfalls of combining composite rotations. Experimental evaluations of the composite rotations are given in each case. An actual implementation of a quantum information protocol, approximate quantum cloning, is presented in Chapter 7. The dissertation ends with appendices which contain expansions of some equations and detailed calculations of certain composite rotation results, as well as spectrometer pulse sequence programs. (author)

  3. A tool for filtering information in complex systems

    OpenAIRE

    Tumminello, M.; Aste, T.; Di Matteo, T.; Mantegna, R. N.

    2005-01-01

    We introduce a technique to filter out complex data sets by extracting a subgraph of representative links. Such a filtering can be tuned up to any desired level by controlling the genus of the resulting graph. We show that this technique is especially suitable for correlation-based graphs, giving filtered graphs that preserve the hierarchical organization of the minimum spanning tree but containing a larger amount of information in their internal structure. In particular in the case of planar...
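
    The filtering step described above starts from a metric built on the correlations. A common choice, consistent with the minimum-spanning-tree end of this family of filters, is d_ij = sqrt(2(1 - rho_ij)); the sketch below applies Prim's algorithm to toy data (the correlation matrix is invented for illustration; the paper's planar filtered graphs retain more edges than the MST shown here).

```python
import numpy as np

# Toy correlation matrix for three assets/variables (illustrative only).
rho = np.array([[1.0, 0.9, 0.2],
                [0.9, 1.0, 0.3],
                [0.2, 0.3, 1.0]])
d = np.sqrt(2 * (1 - rho))        # correlation -> distance metric

# Prim's algorithm: grow the minimum spanning tree from node 0.
n = d.shape[0]
in_tree = {0}
edges = []
while len(in_tree) < n:
    best = min(((d[i, j], i, j) for i in in_tree for j in range(n)
                if j not in in_tree), key=lambda t: t[0])
    edges.append((best[1], best[2]))
    in_tree.add(best[2])
print(sorted(edges))              # [(0, 1), (1, 2)]
```

    The strongly correlated pair (0, 1) is linked directly, while the weaker node 2 attaches through its best neighbour, preserving the hierarchical organization the abstract refers to.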

  4. Procurement of complex performance in public infrastructure: a process perspective

    OpenAIRE

    Hartmann, Andreas; Roehrich, Jens; Davies, Andrew; Frederiksen, Lars; Davies, J.; Harrington, T.; Kirkwood, D.; Holweg, M.

    2011-01-01

    The paper analyzes the process of transitioning from procuring single products and services to procuring complex performance in public infrastructure. The aim is to examine the change in the interactions between buyer and supplier, the emergence of value co-creation and the capability development during the transition process. Based on a multiple, longitudinal case study the paper proposes three generic transition stages towards increased performance and infrastructural complexity. These stag...

  5. Processing multilevel secure test and evaluation information

    Science.gov (United States)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members with various expertise and diverse backgrounds to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  6. Predicting protein complexes using a supervised learning method combined with local structural information.

    Science.gov (United States)

    Dong, Yadong; Sun, Yongqi; Qin, Chao

    2018-01-01

    The existing protein complex detection methods can be broadly divided into two categories: unsupervised and supervised learning methods. Most of the unsupervised learning methods assume that protein complexes are in dense regions of protein-protein interaction (PPI) networks even though many true complexes are not dense subgraphs. Supervised learning methods utilize the informative properties of known complexes; they often extract features from existing complexes and then use the features to train a classification model. The trained model is used to guide the search process for new complexes. However, insufficient extracted features, noise in the PPI data and the incompleteness of complex data make the classification model imprecise. Consequently, the classification model is not sufficient for guiding the detection of complexes. Therefore, we propose a new robust score function that combines the classification model with local structural information. Based on the score function, we provide a search method that works both forwards and backwards. The results from experiments on six benchmark PPI datasets and three protein complex datasets show that our approach can achieve better performance compared with the state-of-the-art supervised, semi-supervised and unsupervised methods for protein complex detection, occasionally significantly outperforming such methods.
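
    The combined score can be sketched as a weighted blend of a classifier's output and a local structural term. Everything below is a hypothetical illustration in the spirit of the abstract: the function names, the weight alpha, and the fixed classifier probability are invented, and subgraph edge density stands in for "local structural information".

```python
def edge_density(nodes, edges):
    # fraction of possible edges present inside the candidate subgraph
    n = len(nodes)
    if n < 2:
        return 0.0
    inside = sum(1 for u, v in edges if u in nodes and v in nodes)
    return 2 * inside / (n * (n - 1))

def combined_score(nodes, edges, clf_prob, alpha=0.5):
    # alpha balances the learned model against local structure
    return alpha * clf_prob + (1 - alpha) * edge_density(nodes, edges)

edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
print(combined_score({"a", "b", "c"}, edges, clf_prob=0.8))  # 0.9
```

    A candidate that the noisy classifier undervalues can still score highly if it is densely connected, which is exactly the robustness the score function is meant to provide.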

  7. Process-aware information systems : design, enactment and analysis

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Wah, B.W.

    2009-01-01

    Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process

  8. Identification of Functional Information Subgraphs in Complex Networks

    International Nuclear Information System (INIS)

    Bettencourt, Luis M. A.; Gintautas, Vadas; Ham, Michael I.

    2008-01-01

    We present a general information theoretic approach for identifying functional subgraphs in complex networks. We show that the uncertainty in a variable can be written as a sum of information quantities, where each term is generated by successively conditioning mutual informations on new measured variables in a way analogous to a discrete differential calculus. The analogy to a Taylor series suggests efficient optimization algorithms for determining the state of a target variable in terms of functional groups of other nodes. We apply this methodology to electrophysiological recordings of cortical neuronal networks grown in vitro. Each cell's firing is generally explained by the activity of a few neurons. We identify these neuronal subgraphs in terms of their redundant or synergetic character and reconstruct neuronal circuits that account for the state of target cells
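
    The successive-conditioning idea can be made concrete with plug-in entropy estimates from a joint sample. The toy data below (z = x XOR y, invented for illustration) shows the synergetic case the abstract mentions: pairwise mutual information is zero, but conditioning on a third variable reveals one full bit.

```python
from collections import Counter
from math import log2

def entropy(samples):
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

def mutual_info(xs, ys):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def cond_mutual_info(xs, ys, zs):
    # one conditioning step of the chain: I(X;Y|Z) = I(X;(Y,Z)) - I(X;Z)
    return mutual_info(xs, list(zip(ys, zs))) - mutual_info(xs, zs)

x = [0, 0, 1, 1]
y = [0, 1, 0, 1]
z = [0, 1, 1, 0]                         # z = x XOR y
print(mutual_info(x, y))                 # 0.0 (pairwise independent)
print(cond_mutual_info(x, y, z))         # 1.0 (synergy revealed by conditioning)
```

    Summing such terms over successively conditioned variables is the discrete-calculus decomposition of uncertainty the approach is built on.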

  9. Conditioning from an information processing perspective.

    Science.gov (United States)

    Gallistel, C R.

    2003-04-28

    The framework provided by Claude Shannon's [Bell Syst. Technol. J. 27 (1948) 623] theory of information leads to a quantitatively oriented reconceptualization of the processes that mediate conditioning. The focus shifts from processes set in motion by individual events to processes sensitive to the information carried by the flow of events. The conception of what properties of the conditioned and unconditioned stimuli are important shifts from the tangible properties to the intangible properties of number, duration, frequency and contingency. In this view, a stimulus becomes a CS if its onset substantially reduces the subject's uncertainty about the time of occurrence of the next US. One way to represent the subject's knowledge of that time of occurrence is by the cumulative probability function, which has two limiting forms: (1) The state of maximal uncertainty (minimal knowledge) is represented by the inverse exponential function for the random rate condition, in which the US is equally likely at any moment. (2) The limit to the subject's attainable certainty is represented by the cumulative normal function, whose momentary expectation is the CS-US latency minus the time elapsed since CS onset. Its standard deviation is the Weber fraction times the CS-US latency.
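
    The two limiting forms of the subject's cumulative probability for the next US can be written out directly; the parameter values below are illustrative, not from the article.

```python
from math import exp, erf, sqrt

def random_rate_cdf(t, lam):
    # maximal uncertainty: US equally likely at any moment (exponential CDF)
    return 1 - exp(-lam * t)

def normal_cdf(t, mu, sigma):
    # attainable-certainty limit: cumulative normal around the CS-US latency mu,
    # with sigma a Weber fraction times that latency
    return 0.5 * (1 + erf((t - mu) / (sigma * sqrt(2))))

print(round(random_rate_cdf(10.0, 0.1), 3))   # 0.632 at t = 1/lambda
print(round(normal_cdf(10.0, 10.0, 2.0), 2))  # 0.5 at the expected latency
```

    A CS is informative to the extent that its onset moves the subject from the flat exponential form toward the sharply rising normal form.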

  10. Information processing in decision-making systems.

    Science.gov (United States)

    van der Meer, Matthijs; Kurth-Nelson, Zeb; Redish, A David

    2012-08-01

    Decisions result from an interaction between multiple functional systems acting in parallel to process information in very different ways, each with strengths and weaknesses. In this review, the authors address three action-selection components of decision-making: The Pavlovian system releases an action from a limited repertoire of potential actions, such as approaching learned stimuli. Like the Pavlovian system, the habit system is computationally fast, but unlike the Pavlovian system it permits arbitrary stimulus-action pairings. These associations are a "forward" mechanism; when a situation is recognized, the action is released. In contrast, the deliberative system is flexible but takes time to process. The deliberative system uses knowledge of the causal structure of the world to search into the future, planning actions to maximize expected rewards. Deliberation depends on the ability to imagine future possibilities, including novel situations, and it allows decisions to be taken without having previously experienced the options. Various anatomical structures have been identified that carry out the information processing of each of these systems: the hippocampus constitutes a map of the world that can be used for searching/imagining the future; dorsal striatal neurons represent situation-action associations; and the ventral striatum maintains value representations for all three systems. Each system presents vulnerabilities to pathologies that can manifest as psychiatric disorders. Understanding these systems and their relation to neuroanatomy opens up a deeper way to treat the structural problems underlying various disorders.

  11. Physiological arousal in processing recognition information

    Directory of Open Access Journals (Sweden)

    Guy Hochman

    2010-07-01

Full Text Available The recognition heuristic (RH; Goldstein and Gigerenzer, 2002) suggests that, when applicable, probabilistic inferences are based on a noncompensatory examination of whether an object is recognized or not. The overall findings on the processes that underlie this fast and frugal heuristic are somewhat mixed, and many studies have expressed the need for considering a more compensatory integration of recognition information. Regardless of the mechanism involved, it is clear that recognition has a strong influence on choices, and this finding might be explained by the fact that recognition cues arouse affect and thus receive more attention than cognitive cues. To test this assumption, we investigated whether recognition results in a direct affective signal by measuring physiological arousal (i.e., peripheral arterial tone) in the established city-size task. We found that recognition of cities does not directly result in increased physiological arousal. Moreover, the results show that physiological arousal increased with increasing inconsistency between recognition information and additional cue information. These findings support predictions derived by a compensatory Parallel Constraint Satisfaction model rather than predictions of noncompensatory models. Additional results concerning confidence ratings, response times, and choice proportions further demonstrated that recognition information and other cognitive cues are integrated in a compensatory manner.

  12. Real-time monitoring of clinical processes using complex event processing and transition systems.

    Science.gov (United States)

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.

  13. PREFACE: Complex Networks: from Biology to Information Technology

    Science.gov (United States)

    Barrat, A.; Boccaletti, S.; Caldarelli, G.; Chessa, A.; Latora, V.; Motter, A. E.

    2008-06-01

    for counting large directed loops. This work proposes a belief-propagation algorithm for counting long loops in directed networks, which is then applied to networks of different sizes and loop structure. In The anatomy of a large query graph, Baeza-Yates and Tiberi show that scale invariance is present also in the structure of a graph derived from query logs. This graph is determined not only by the queries but also by the subsequent actions of the users. The graph analysed in this study is generated by more than twenty million queries and is less sparse than suggested by previous studies. A different class of networks is considered by Travençolo and da F Costa in Hierarchical spatial organisation of geographical networks. This work proposes a hierarchical extension of the polygonality index as a means to characterise geographical planar networks and, in particular, to obtain more complete information about the spatial order of the network at progressive spatial scales. The paper Border trees of complex networks by Villas Boas et al focuses instead on the statistical properties of the boundary of graphs, constituted by the vertices of degree one (the leaves of border trees). The authors study the local properties, the depth, and the number of leaves of these border trees, finding that in some real networks more than half of the nodes belong to the border trees. The last contribution to the first section is The generation of random directed networks with prescribed 1-node and 2-node degree correlations by Zamora-López et al. This study deals with the generation of random directed networks and shows that often a large number of links cannot be 'randomised' without altering the degree correlations. This permits fast generation of ensembles of maximally random networks. 
In the section Methods: The Dynamics, significant attention is given to the study of synchronisation processes on networks: Díaz-Guilera's contribution Dynamics towards synchronisation in hierarchical

  14. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    Science.gov (United States)

    Long, Nicholas James

This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and defined into its essential components. First, three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed with which to separately evaluate each dimension. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety, originating from cybernetic theory, is suggested to interpret complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity using the Simple Multi-Attribute Rating Technique (SMART). Each method of static complexity analysis and the aggregation technique is demonstrated using notional data for four lunar oxygen production processes.
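The SMART-style aggregation step can be illustrated with a small sketch. The dimension scores and weights below are hypothetical, chosen only to show the mechanics, since the thesis works with notional data that are not reproduced here:

```python
def smart_aggregate(scores, weights):
    """SMART-style weighted sum: aggregate per-dimension static-complexity
    scores (each assumed pre-normalised to [0, 1]) into a single value."""
    total = sum(weights[k] for k in scores)
    return sum(weights[k] * scores[k] for k in scores) / total

# Hypothetical scores for two candidate oxygen-production processes on the
# three dimensions named in the abstract; the weights are illustrative.
processes = {
    "process_A": {"interconnective": 0.7, "strength": 0.5, "variety": 0.3},
    "process_B": {"interconnective": 0.4, "strength": 0.6, "variety": 0.8},
}
weights = {"interconnective": 0.5, "strength": 0.3, "variety": 0.2}

# rank candidates from least to most statically complex
ranking = sorted(processes, key=lambda p: smart_aggregate(processes[p], weights))
```

Lower aggregate scores indicate lower static complexity, and hence (all else equal) lower risk for the mission.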

  15. The Complexity of Developmental Predictions from Dual Process Models

    Science.gov (United States)

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  16. The Effects of Syntactic Complexity on Processing Sentences in Noise

    Science.gov (United States)

    Carroll, Rebecca; Ruigendijk, Esther

    2013-01-01

    This paper discusses the influence of stationary (non-fluctuating) noise on processing and understanding of sentences, which vary in their syntactic complexity (with the factors canonicity, embedding, ambiguity). It presents data from two RT-studies with 44 participants testing processing of German sentences in silence and in noise. Results show a…

  17. Bridging domains : a comparison between information processing in Archaea and Eukarya

    NARCIS (Netherlands)

    Koning, de B.

    2015-01-01

    Bridging Domains

    A Comparison between Information Processing in Archaea and Eukarya

    Studying Information Processing

    Living cells evolved complex systems to handle the flow of information both

  18. Quantum Information Processing using Nonlinear Optical Effects

    DEFF Research Database (Denmark)

    Andersen, Lasse Mejling

    This PhD thesis treats applications of nonlinear optical effects for quantum information processing. The two main applications are four-wave mixing in the form of Bragg scattering (BS) for quantum-state-preserving frequency conversion, and sum-frequency generation (SFG) in second-order nonlinear......-chirping the pumps. In the high-conversion regime without the effects of NPM, exact Green functions for BS are derived. In this limit, separability is possible for conversion efficiencies up to 60 %. However, the system still allows for selective frequency conversion as well as re-shaping of the output. One way...

  19. Quantum wells for optical information processing

    International Nuclear Information System (INIS)

    Miller, D.A.B.

    1989-01-01

    Quantum wells, alternate thin layers of two different semiconductor materials, show an exceptional electric field dependence of the optical absorption, called the quantum-confined Stark effect (QCSE), for electric fields perpendicular to the layers. This enables electrically controlled optical modulators and optically controlled self-electro-optic-effect devices that can operate at high speed and low energy density. Recent developments in these QCSE devices are summarized, including new device materials and novel device structures. The variety of sophisticated devices now demonstrated is promising for applications to information processing

  20. Information processing of sexual abuse in elders.

    Science.gov (United States)

    Burgess, Ann W; Clements, Paul T

    2006-01-01

    Sexual abuse is considered to be a pandemic contemporary public health issue, with significant physical and psychosocial consequences for its victims. However, the incidence of elder sexual assault is difficult to estimate with any degree of confidence. A convenience sample of 284 case records were reviewed for Post-Traumatic Stress Disorder (PTSD) symptoms. The purpose of this paper is to present the limited data noted on record review on four PTSD symptoms of startle, physiological upset, anger, and numbness. A treatment model for information processing of intrapsychic trauma is presented to describe domain disruption within a nursing diagnosis of rape trauma syndrome and provide guidance for sensitive assessment and intervention.

  1. Gradation of complexity and predictability of hydrological processes

    Science.gov (United States)

    Sang, Yan-Fang; Singh, Vijay P.; Wen, Jun; Liu, Changming

    2015-06-01

Quantification of the complexity and predictability of hydrological systems is important for evaluating the impact of climate change on hydrological processes and for guiding water management activities. In the literature, the focus seems to have been on describing the complexity of the spatiotemporal distribution of hydrological variables, but little attention has been paid to the gradation of complexity, because the absolute complexity of hydrological systems cannot be objectively evaluated. Here we show that the complexity and predictability of hydrological processes can be graded into three ranks (low, middle, and high). The gradation is based on the difference between the energy distribution of a hydrological series and that of white noise across multiple temporal scales. It reflects the different energy concentration levels and contents of deterministic components of the hydrological series in the three ranks. A higher energy concentration level reflects lower complexity and higher predictability, whereas a scattered energy distribution similar to that of white noise indicates the highest complexity and near-unpredictability. We conclude that the three ranks (low, middle, and high) approximately correspond to deterministic, stochastic, and random hydrological systems, respectively. The resulting complexity gradation can guide hydrological observation and modeling, and the identification of similarity patterns among different hydrological systems.
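The core idea, grading complexity by how concentrated a series' energy distribution is relative to white noise, can be sketched with a single-scale spectral statistic. Both the concentration measure and the rank cut-offs below are illustrative assumptions, not the paper's actual multitemporal-scale method:

```python
import numpy as np

def spectral_energy_concentration(x):
    """Share of total spectral energy held by the strongest frequency bin.
    A deterministic series concentrates energy (high share); white noise
    spreads it nearly uniformly (share near 1 / number_of_bins)."""
    power = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    return float(power.max() / power.sum())

def grade(concentration):
    """Illustrative three-rank gradation; the cut-offs are assumptions."""
    if concentration > 0.5:
        return "low complexity / high predictability"
    if concentration > 0.1:
        return "middle"
    return "high complexity / low predictability"

rng = np.random.default_rng(0)
t = np.arange(512)
periodic = np.sin(2 * np.pi * t / 32)   # strongly deterministic series
noise = rng.standard_normal(512)        # white-noise benchmark
```

A periodic series packs nearly all its energy into one bin and grades as low complexity, while the white-noise benchmark has a scattered spectrum and a far lower concentration.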

  2. Information and Self-Organization A Macroscopic Approach to Complex Systems

    CERN Document Server

    Haken, Hermann

    2006-01-01

    This book presents the concepts needed to deal with self-organizing complex systems from a unifying point of view that uses macroscopic data. The various meanings of the concept "information" are discussed and a general formulation of the maximum information (entropy) principle is used. With the aid of results from synergetics, adequate objective constraints for a large class of self-organizing systems are formulated and examples are given from physics, life and computer science. The relationship to chaos theory is examined and it is further shown that, based on possibly scarce and noisy data, unbiased guesses about processes of complex systems can be made and the underlying deterministic and random forces determined. This allows for probabilistic predictions of processes, with applications to numerous fields in science, technology, medicine and economics. The extensions of the third edition are essentially devoted to an introduction to the meaning of information in the quantum context. Indeed, quantum inform...

  3. Reinforcing Visual Grouping Cues to Communicate Complex Informational Structure.

    Science.gov (United States)

    Bae, Juhee; Watson, Benjamin

    2014-12-01

In his book Multimedia Learning [7], Richard Mayer asserts that viewers learn best from imagery that provides them with cues to help them organize new information into the correct knowledge structures. Designers have long been exploiting the Gestalt laws of visual grouping to deliver those cues to viewers using visual hierarchy, often communicating structures much more complex than the simple organizations studied in psychological research. Unfortunately, designers are largely practical in their work, and have not paused to build a comprehensive theory of structural communication. If we are to build a tool to help novices create effective and well structured visuals, we need a better understanding of how to create them. Our work takes a first step toward addressing this lack, studying how five of the many grouping cues (proximity, color similarity, common region, connectivity, and alignment) can be effectively combined to communicate structured text and imagery from real world examples. To measure the effectiveness of this structural communication, we applied a digital version of card sorting, a method widely used in anthropology and cognitive science to extract cognitive structures. We then used tree edit distance to measure the difference between perceived and communicated structures. Our most significant findings are: 1) with careful design, complex structure can be communicated clearly; 2) communicating complex structure is best done with multiple reinforcing grouping cues; 3) common region (use of containers such as boxes) is particularly effective at communicating structure; and 4) alignment is a weak structural communicator.
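The comparison between perceived and communicated structures can be approximated cheaply. The sketch below uses string edit distance over preorder traversals as a stand-in for a full tree edit distance algorithm such as Zhang-Shasha; it is a rough proxy, not the metric the study necessarily used:

```python
def preorder(tree):
    """Flatten a nested-tuple tree (label, child, child, ...) to a preorder
    list of labels."""
    label, *children = tree
    out = [label]
    for c in children:
        out.extend(preorder(c))
    return out

def edit_distance(a, b):
    """Classic Levenshtein distance between two label sequences (one-row DP)."""
    dp = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, y in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (x != y))
    return dp[-1]

def structure_distance(t1, t2):
    """Cheap proxy for tree edit distance: edit distance of the two trees'
    preorder traversals (not the exact Zhang-Shasha tree edit distance)."""
    return edit_distance(preorder(t1), preorder(t2))

# hypothetical card-sort result vs. the structure the designer intended
perceived = ("root", ("header",), ("body", ("item1",), ("item2",)))
intended = ("root", ("header",), ("body", ("item1",), ("item3",)))
```

A distance of zero means the viewer's card-sorted hierarchy matched the communicated one exactly; larger values count label and position mismatches.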

  4. Evolutionary relevance facilitates visual information processing.

    Science.gov (United States)

    Jackson, Russell E; Calvillo, Dusti P

    2013-11-03

    Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component to evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  5. Evolutionary Relevance Facilitates Visual Information Processing

    Directory of Open Access Journals (Sweden)

    Russell E. Jackson

    2013-07-01

    Full Text Available Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component to evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  6. THEORETICAL FRAMEWORK FOR INFORMATION AND EDUCATIONAL COMPLEX DEVELOPMENT OF AN ACADEMIC DISCIPLINE AT A HIGHER INSTITUTION

    Directory of Open Access Journals (Sweden)

    Evgeniia Nikolaevna Kikot

    2015-05-01

Full Text Available Organizing the contemporary educational process has become increasingly important under conditions of widespread ICT (information and communication technologies) and e-learning. This defines one of the most important methodological and research directions at a university: the creation of an informational-educational course unit complex as the foundation of the e-University resource. The creation of such a complex rests on the concepts of openness, accessibility, clarity and personalisation, which allow a system of requirements for the complex and its substantive content to be built. The main functions of the informational-educational complex are identified: informational, educational, controlling and communicative. It is argued that the scientific justification of new structural elements of the informational-educational course unit complex should include the creation of e-workbooks and e-workshops in order to organize theoretical and practical e-conferences. The development of ICT in education that enables e-learning presupposes the establishment of distance learning technologies for the implementation of educational programmes.

  7. Information Propagation in Complex Networks : Structures and Dynamics

    NARCIS (Netherlands)

    Märtens, M.

    2018-01-01

    This thesis is a contribution to a deeper understanding of how information propagates and what this process entails. At its very core is the concept of the network: a collection of nodes and links, which describes the structure of the systems under investigation. The network is a mathematical model

  8. Structure, context, complexity, organization: physical aspects of information and value

    National Research Council Canada - National Science Library

    Eriksson, Karl-Erik; Lindgren, Kristian; Månsson, Bengt Å

    1987-01-01

    ... and of information theory are general enough to play such a role. The authors have been involved in studies of the handling of natural resources in human societies. There we met problems and ideas which led us to the theme of this book: a perspective and a set of concepts, useful for describing and understanding processes in which structure emerges. T...

  9. Design and analysis of information model hotel complex

    Directory of Open Access Journals (Sweden)

    Garyaev Nikolai

    2016-01-01

Full Text Available The article analyzes innovations in 3D modeling and the development of process design approaches based on visualization of information technology and computer-aided design systems. It also examines the problems arising in modern design and an approach to addressing them.

  10. Motivated information processing and group decision-making : Effects of process accountability on information processing and decision quality

    NARCIS (Netherlands)

    Scholten, Lotte; van Knippenberg, Daan; Nijstad, Bernard A.; De Dreu, Carsten K. W.

    Integrating dual-process models [Chaiken, S., & Trope, Y. (Eds.). (1999). Dual-process theories in social psychology. NewYork: Guilford Press] with work on information sharing and group decision-making [Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: biased

  11. Habitat Complexity in Aquatic Microcosms Affects Processes Driven by Detritivores.

    Directory of Open Access Journals (Sweden)

    Lorea Flores

Full Text Available Habitat complexity can influence predation rates (e.g. by providing refuge) but other ecosystem processes and species interactions might also be modulated by the properties of habitat structure. Here, we focussed on how complexity of artificial habitat (plastic plants), in microcosms, influenced short-term processes driven by three aquatic detritivores. The effects of habitat complexity on leaf decomposition, production of fine organic matter and pH levels were explored by measuring complexity in three ways: 1. as the presence vs. absence of habitat structure; 2. as the amount of structure (3 or 4.5 g of plastic plants); and 3. as the spatial configuration of structures (measured as fractal dimension). The experiment also addressed potential interactions among the consumers by running all possible species combinations. In the experimental microcosms, habitat complexity influenced how species performed, especially when comparing structure present vs. structure absent. Treatments with structure showed higher fine particulate matter production and lower pH compared to treatments without structures and this was probably due to higher digestion and respiration when structures were present. When we explored the effects of the different complexity levels, we found that the amount of structure added explained more than the fractal dimension of the structures. We give a detailed overview of the experimental design, statistical models and R codes, because our statistical analysis can be applied to other study systems (and disciplines such as restoration ecology). We further make suggestions of how to optimise statistical power when artificially assembling, and analysing, 'habitat complexity' by not confounding complexity with the amount of structure added. In summary, this study highlights the importance of habitat complexity for energy flow and the maintenance of ecosystem processes in aquatic ecosystems.
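The fractal dimension used above to quantify spatial configuration is commonly estimated by box counting. The following is a generic Python sketch (the authors supply R code, which is not reproduced here); it assumes 2-D points scaled into the unit square:

```python
import numpy as np

def box_counting_dimension(points, sizes=(1/2, 1/4, 1/8, 1/16)):
    """Estimate the fractal (box-counting) dimension of 2-D points in the
    unit square: the slope of log(occupied boxes) vs. log(1/box size)."""
    points = np.asarray(points)
    counts = []
    for s in sizes:
        # bin each point into a grid of boxes of side s and count unique boxes
        boxes = set(map(tuple, np.floor(points / s).astype(int)))
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# sanity check: a densely filled square should come out near dimension 2
grid = np.stack(np.meshgrid(np.linspace(0, 0.99, 64),
                            np.linspace(0, 0.99, 64)), -1).reshape(-1, 2)
```

A space-filling arrangement approaches dimension 2, a line of structures approaches 1, and intermediate values describe how thoroughly the plastic plants occupy the microcosm's space.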

  12. Quantum teleportation for continuous variables and related quantum information processing

    International Nuclear Information System (INIS)

    Furusawa, Akira; Takei, Nobuyuki

    2007-01-01

    Quantum teleportation is one of the most important subjects in quantum information science. This is because quantum teleportation can be regarded as not only quantum information transfer but also a building block for universal quantum information processing. Furthermore, deterministic quantum information processing is very important for efficient processing and it can be realized with continuous-variable quantum information processing. In this review, quantum teleportation for continuous variables and related quantum information processing are reviewed from these points of view

  13. Role of Information Anxiety and Information Load on Processing of Prescription Drug Information Leaflets.

    Science.gov (United States)

    Bapat, Shweta S; Patel, Harshali K; Sansgiry, Sujit S

    2017-10-16

In this study, we evaluate the role of information anxiety and information load on the intention to read information from prescription drug information leaflets (PILs). These PILs were developed based on the principles of information load and consumer information processing. This was an experimental prospective repeated measures study conducted in the United States in which 360 (62% response rate) university students (>18 years old) participated. Participants were presented with a scenario followed by exposure to the three drug product information sources used to operationalize information load. The three sources were: (i) current practice; (ii) pre-existing one-page text only; and (iii) interventional one-page prototype PILs designed for the study. Information anxiety was measured as the anxiety experienced by the individual when encountering information. The outcome variable of intention to read PILs was defined as the likelihood that the patient will read the information provided in the leaflets. A survey questionnaire was used to capture the data and the objectives were analyzed by performing a repeated measures MANOVA using SAS version 9.3. When compared to current practice and one-page text-only leaflets, one-page PILs had significantly lower scores on information anxiety and information load. Information anxiety and information load significantly impacted intention to read (p < 0.001). Newly developed PILs increased patients' intention to read and can help in improving the counseling services provided by pharmacists.

  14. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

Full Text Available Abstract Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a

  15. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is
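Gillespie's direct method, one of the simulation targets the abstract says Narrator can map models to, can be sketched in a few lines. The birth-death reaction system below is an illustrative example, not taken from the paper:

```python
import math
import random

def gillespie_direct(x, propensities, stoich, t_end, seed=1):
    """Gillespie's direct method for a well-mixed reaction system.
    `propensities(x)` returns the rate of each reaction in state x;
    `stoich[i]` is the state change when reaction i fires."""
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, x)]
    while t < t_end:
        a = propensities(x)
        a0 = sum(a)
        if a0 == 0.0:                               # no reaction can fire
            break
        t += -math.log(1.0 - rng.random()) / a0     # exponential waiting time
        r, acc = rng.random() * a0, 0.0
        for i, ai in enumerate(a):                  # pick reaction i w.p. a_i/a0
            acc += ai
            if r < acc:
                x += stoich[i]
                break
        traj.append((t, x))
    return traj

# hypothetical birth-death process: 0 -> X at rate 5, X -> 0 at rate 0.5*x
traj = gillespie_direct(0, lambda x: [5.0, 0.5 * x], [+1, -1], t_end=50.0)
```

The same state-update loop generalizes to any reaction network once the propensity function and stoichiometry vectors are derived from the graphical model.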

  16. Defining information need in health - assimilating complex theories derived from information science.

    Science.gov (United States)

    Ormandy, Paula

    2011-03-01

Key policy drivers worldwide include optimizing patients' roles in managing their care; focusing services around patients' needs and preferences; and providing information to support patients' contributions and choices. The term information need penetrates many policy documents. Information need is espoused as the foundation from which to develop patient-centred or patient-led services. Yet there is no clear definition as to what the term means or how patients' information needs inform and shape information provision and patient care. The assimilation of complex theories originating from information science has much to offer considerations of patient information need within the context of health care. Health-related research often focuses on the content of information patients prefer, not why they need information. This paper extends and applies knowledge of information behaviour to considerations of information need in health, exposing a working definition for patient information need that reiterates the importance of considering the patient's goals and understanding the patient's context/situation. A patient information need is defined as 'recognition that their knowledge is inadequate to satisfy a goal, within the context/situation that they find themselves in at a specific point in time'. This typifies the key concepts of national/international health policy, the centrality and importance of the patient. The proposed definition of patient information need provides a conceptual framework to guide health-care practitioners on what to consider and why when meeting the information needs of patients in practice. This creates a solid foundation from which to inform future research. © 2010 The Author. Health Expectations © 2010 Blackwell Publishing Ltd.

  17. Intersubject information mapping: revealing canonical representations of complex natural stimuli

    Directory of Open Access Journals (Sweden)

    Nikolaus Kriegeskorte

    2015-03-01

    Real-world time-continuous stimuli such as video promise greater naturalism for studies of brain function. However, modeling the stimulus variation is challenging and introduces a bias in favor of particular descriptive dimensions. Alternatively, we can look for brain regions whose signal is correlated between subjects, essentially using one subject to model another. Intersubject correlation mapping (ICM) allows us to find brain regions driven in a canonical manner across subjects by a complex natural stimulus. However, it requires a direct voxel-to-voxel match between the spatiotemporal activity patterns and is thus only sensitive to common activations sufficiently extended to match up in Talairach space (or in an alternative, e.g. cortical-surface-based, common brain space). Here we introduce the more general approach of intersubject information mapping (IIM). For each brain region, IIM determines how much information is shared between the subjects' local spatiotemporal activity patterns. We estimate the intersubject mutual information using canonical correlation analysis applied to voxels within a spherical searchlight centered on each voxel in turn. The intersubject information estimate is invariant to linear transforms including spatial rearrangement of the voxels within the searchlight. This invariance to local encoding will be crucial in exploring fine-grained brain representations, which cannot be matched up in a common space and, more fundamentally, might be unique to each individual – like fingerprints. IIM yields a continuous brain map, which reflects intersubject information in fine-grained patterns. Performed on data from functional magnetic resonance imaging (fMRI) of subjects viewing the same television show, IIM and ICM both highlighted sensory representations, including primary visual and auditory cortices. However, IIM revealed additional regions in higher association cortices, namely temporal pole and orbitofrontal cortex. These …
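
    The searchlight estimate described above can be sketched numerically. The following is a minimal illustration, not the authors' implementation: canonical correlations between two subjects' time-by-voxel patterns, turned into a Gaussian mutual-information estimate that is invariant to linear transforms of either pattern. All names and the toy data are hypothetical.

    ```python
    import numpy as np

    def canonical_correlations(X, Y, reg=1e-6):
        """Canonical correlations between two (time x voxel) pattern matrices.

        Whitens each data set, then takes the singular values of the
        cross-covariance of the whitened variables."""
        X = X - X.mean(axis=0)
        Y = Y - Y.mean(axis=0)

        def whiten(Z):
            # Regularized covariance for numerical stability
            C = Z.T @ Z / (len(Z) - 1) + reg * np.eye(Z.shape[1])
            vals, vecs = np.linalg.eigh(C)
            return Z @ vecs @ np.diag(vals ** -0.5) @ vecs.T

        Xw, Yw = whiten(X), whiten(Y)
        Cxy = Xw.T @ Yw / (len(X) - 1)
        rho = np.linalg.svd(Cxy, compute_uv=False)
        return np.clip(rho, 0.0, 1.0 - 1e-12)

    def gaussian_mutual_information(X, Y):
        """Intersubject MI estimate (nats) under a Gaussian assumption:
        I = -1/2 * sum(log(1 - rho_i^2)) over canonical correlations."""
        rho = canonical_correlations(X, Y)
        return -0.5 * np.sum(np.log1p(-rho ** 2))

    rng = np.random.default_rng(0)
    shared = rng.normal(size=(200, 1))  # common stimulus-driven signal
    A = shared @ rng.normal(size=(1, 5)) + 0.5 * rng.normal(size=(200, 5))
    B = shared @ rng.normal(size=(1, 5)) + 0.5 * rng.normal(size=(200, 5))
    noise = rng.normal(size=(200, 5))
    # Shared signal gives a much larger estimate than unrelated patterns
    print(gaussian_mutual_information(A, B) > gaussian_mutual_information(A, noise))
    ```

    Because only the canonical correlations enter the estimate, mixing the columns of A or B by any invertible matrix leaves the result unchanged, which is the invariance property the abstract emphasizes.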

  18. The informed consent process in randomised controlled trials: a nurse-led process.

    Science.gov (United States)

    Cresswell, Pip; Gilmour, Jean

    2014-03-01

    Clinical trials are carried out with human participants to answer questions about the best way to diagnose, treat and prevent illness. Participants must give informed consent to take part in clinical trials, which requires an understanding of how clinical trials work and of their purpose. Randomised controlled trials provide strong evidence but their complex design is difficult for both clinicians and participants to understand. Increasingly, ensuring informed consent in randomised controlled trials has become part of the clinical research nurse role. The aim of this study was to explore in depth the clinical research nurse role in the informed consent process using a qualitative descriptive approach. Three clinical research nurses were interviewed and the data analysed using a thematic analysis approach. Three themes were identified to describe the process of ensuring informed consent. The first theme, Preparatory partnerships, canvassed the relationships required prior to initiation of the informed consent process. The second theme, Partnering the participant, emphasises the need for ensuring voluntariness and understanding, along with patient advocacy. The third theme, Partnership with the project, highlights the clinical research nurse contribution to the capacity of the trial to answer the research question through appropriate recruiting and follow-up of participants. Gaining informed consent in randomised controlled trials was complex and required multiple partnerships. A wide variety of skills was used to protect the safety of trial participants and promote quality research. The information from this study contributes to a greater understanding of the clinical research nurse role, and suggests the informed consent process in trials can be a nurse-led one. In order to gain collegial, employer and industry recognition, it is important that this aspect of the nursing role is acknowledged.

  19. Development of the operational information processing platform

    International Nuclear Information System (INIS)

    Shin, Hyun Kook; Park, Jeong Seok; Baek, Seung Min; Kim, Young Jin; Joo, Jae Yoon; Lee, Sang Mok; Jeong, Young Woo; Seo, Ho Jun; Kim, Do Youn; Lee, Tae Hoon

    1996-02-01

    The Operational Information Processing Platform (OIPP) is a platform system designed to provide development and operation environments for plant operation and plant monitoring. It is based on the Plant Computer Systems (PCS) of the Yonggwang 3 and 4, Ulchin 3 and 4, and Yonggwang 5 and 6 Nuclear Power Plants (NPP). The UNIX-based workstation, real-time kernel and graphics design tool were selected and installed after reviewing the functions of the PCS. In order to construct a development environment for an open system architecture and a distributed computer system, an open computer system architecture was adopted both in hardware and software. For verification of the system design and evaluation of technical methodologies, the PCS running under the OIPP is being designed and implemented. In this system, the man-machine interface and system functions are being designed and implemented to evaluate the differences between the UCN 3, 4 PCS and the OIPP. 15 tabs., 32 figs., 11 refs. (Author)

  20. Quantum information processing with optical vortices

    Energy Technology Data Exchange (ETDEWEB)

    Khoury, Antonio Z. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil)

    2012-07-01

    Full text: In this work we discuss several proposals for quantum information processing using the transverse structure of paraxial beams. Different techniques for production and manipulation of optical vortices have been employed and combined with polarization transformations in order to investigate fundamental properties of quantum entanglement as well as to propose new tools for quantum information processing. As an example, we have recently proposed and demonstrated a controlled NOT (CNOT) gate based on a Michelson interferometer in which the photon polarization is the control bit and the first order transverse mode is the target. The device is based on a single lens design for an astigmatic mode converter that transforms the transverse mode of paraxial optical beams. In analogy with Bell's inequality for two-qubit quantum states, we propose an inequality criterion for the non-separability of the spin-orbit degrees of freedom of a laser beam. A definition of separable and non-separable spin-orbit modes is used in consonance with the one presented in Phys. Rev. Lett. 99, 2007. As the usual Bell's inequality can be violated for entangled two-qubit quantum states, we show both theoretically and experimentally that the proposed spin-orbit inequality criterion can be violated for non-separable modes. The inequality is discussed both in the classical and quantum domains. We propose a polarization to orbital angular momentum teleportation scheme using entangled photon pairs generated by spontaneous parametric down conversion. By making a joint detection of the polarization and angular momentum parity of a single photon, we are able to detect all the Bell-states and perform, in principle, perfect teleportation from a discrete to a continuous system using minimal resources. The proposed protocol implementation demands experimental resources that are currently available in quantum optics laboratories. (author)
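
    The spin-orbit CNOT described above acts on two photonic qubits: polarization as control, first-order transverse mode as target. As a generic numerical check (standard quantum-information arithmetic, not the interferometric implementation itself), the gate applied after a Hadamard on the control yields a maximally non-separable spin-orbit state:

    ```python
    import numpy as np

    # Basis ordering |control, target> = |00>, |01>, |10>, |11>;
    # control = photon polarization (H/V), target = first-order transverse mode.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard on the control qubit

    # Start in |00>, put the control into superposition, then apply CNOT
    state = np.kron(H, np.eye(2)) @ np.array([1, 0, 0, 0], dtype=complex)
    bell = CNOT @ state  # (|00> + |11>)/sqrt(2): maximally non-separable

    print(np.allclose(bell, np.array([1, 0, 0, 1]) / np.sqrt(2)))  # True
    ```

    This is exactly the state whose non-separability the proposed spin-orbit Bell-type inequality is designed to detect.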

  1. Triangle network motifs predict complexes by complementing high-error interactomes with structural information.

    Science.gov (United States)

    Andreopoulos, Bill; Winter, Christof; Labudde, Dirk; Schroeder, Michael

    2009-06-27

    Many high-throughput studies produce protein-protein interaction networks (PPINs) with many errors and missing information. Even for genome-wide approaches, there is often a low overlap between PPINs produced by different studies. Second-level neighbors separated by two protein-protein interactions (PPIs) were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second-level neighbors in PPINs, and complement these with structural domain-domain interactions (SDDIs) representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from the Munich Information center for Protein Sequences (MIPS). PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than second-level neighbors in PPINs without SDDIs. The biological interpretation for triangles is that an SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components, and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbors in PPINs. We estimate that relatively little structural information would be sufficient …
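
    The PPI-SDDI-PPI motif search can be sketched in a few lines. The toy networks below are hypothetical; the function finds protein pairs joined by an SDDI that also share a common PPI partner, which is the triangle motif of the abstract:

    ```python
    # Hypothetical toy networks: PPI edges (high-throughput, noisy) and
    # SDDI edges (structural domain-domain binding evidence).
    ppi = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "E")}
    sddi = {("A", "C"), ("D", "E")}

    def neighbors(edges):
        adj = {}
        for u, v in edges:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
        return adj

    def ppi_sddi_ppi_triangles(ppi_edges, sddi_edges):
        """Triangles where two SDDI-linked proteins share a common PPI partner.

        The SDDI edge (u, v) plus PPI edges (u, w) and (v, w) form the
        PPI-SDDI-PPI motif; u and v are second-level PPI neighbors whose
        link is backed by structural binding evidence."""
        adj = neighbors(ppi_edges)
        triangles = set()
        for u, v in sddi_edges:
            for w in adj.get(u, set()) & adj.get(v, set()):
                triangles.add(frozenset((u, v, w)))
        return triangles

    # A and C share the PPI partner B, and A-C is an SDDI edge,
    # so one triangle {A, B, C} is reported.
    print(ppi_sddi_ppi_triangles(ppi, sddi))
    ```

    Swapping the SDDI set for any other complementary edge type (e.g. PubMed co-occurrences) reuses the same function, mirroring the mixed-datatype triangles discussed in the abstract.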

  3. Information Support of Processes in Warehouse Logistics

    Directory of Open Access Journals (Sweden)

    Gordei Kirill

    2013-11-01

    In the conditions of globalization of world economic relations, the role of information support for business processes is increasing in various branches and fields of activity, and warehouse activity is no exception. Such information support is realized in warehouse logistic systems. In relation to a territorial administrative entity, the warehouse logistic system takes the format of a complex social and economic structure which controls the economic flows covering the intermediary, trade and transport organizations and the enterprises of other branches and spheres. Spatial movement of inventory items places new demands on the participants of merchandising. Warehousing (in the sense of storage) is one of the operations entering into logistic activity, concerning the organization of a material flow. Therefore, treating warehousing as "management of the spatial movement of stocks" is justified. Warehousing, in this understanding, sheds its perception as mere containment of stocks – a business expense. This aspiration finds reflection in the logistic systems working by the principles of "just in time", "lean production" and others. Therefore, the role of warehouses as places of storage is transformed into an understanding of warehousing as an innovative logistic system.

  4. Natural language processing and advanced information management

    Science.gov (United States)

    Hoard, James E.

    1989-01-01

    Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides a natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object-oriented database (OODB). OODBs have, apparently, an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system and an inherent capacity to use them in an effective way.

  5. Ferric and cobaltous hydroacid complexes for forward osmosis (FO) processes

    KAUST Repository

    Ge, Qingchun; Fu, Fengjiang; Chung, Neal Tai-Shung

    2014-01-01

    Cupric and ferric hydroacid complexes have proven their advantages as draw solutes in forward osmosis in terms of high water fluxes, negligible reverse solute fluxes and easy recovery (Ge and Chung, 2013. Hydroacid complexes: A new class of draw solutes to promote forward osmosis (FO) processes. Chemical Communications 49, 8471-8473.). In this study, cobaltous hydroacid complexes were explored as draw solutes and compared with the ferric hydroacid complex to study the factors influencing their FO performance. The solutions of the cobaltous complexes produce high osmotic pressures due to the presence of abundant hydrophilic groups. These solutes are able to dissociate and form a multi-charged anion and Na+ cations in water. In addition, these complexes have expanded structures which lead to negligible reverse solute fluxes and provide relatively easy approaches to regeneration. These characteristics make the newly synthesized cobaltous complexes appropriate as draw solutes. The FO performance of the cobaltous and ferric-citric acid (Fe-CA) complexes was evaluated through cellulose acetate membranes, thin-film composite membranes fabricated on polyethersulfone supports (referred to as TFC-PES), and polybenzimidazole and PES dual-layer (referred to as PBI/PES) hollow fiber membranes. Under the conditions of DI water as the feed and facing the support layer of TFC-PES FO membranes (PRO mode), draw solutions at 2.0 M produced relatively high water fluxes of 39-48 LMH (L m-2 h-1) with negligible reverse solute fluxes. A water flux of 17.4 LMH was achieved when model seawater of 3.5 wt.% NaCl replaced DI water as the feed and 2.0 M Fe-CA was used as the draw solution under the same conditions. The performance of these hydroacid complexes surpasses that of the synthetic draw solutes developed in recent years. This observation, along with the relatively easy regeneration, makes these complexes very promising as a novel class of draw solutes. © 2014 Elsevier Ltd.

  7. Electrospray ionization mass spectrometry for the hydrolysis complexes of cisplatin: implications for the hydrolysis process of platinum complexes.

    Science.gov (United States)

    Feifan, Xie; Pieter, Colin; Jan, Van Bocxlaer

    2017-07-01

    Non-enzyme-dependent hydrolysis of the drug cisplatin is important for its mode of action and toxicity. However, up until today, the hydrolysis process of cisplatin is still not completely understood. In the present study, the hydrolysis of cisplatin in an aqueous solution was systematically investigated by using electrospray ionization mass spectrometry coupled to liquid chromatography. A variety of previously unreported hydrolysis complexes corresponding to monomeric, dimeric and trimeric species were detected and identified. The characteristics of the Pt-containing complexes were investigated by using collision-induced dissociation (CID). The hydrolysis complexes demonstrate distinctive and correlative CID characteristics, which provides tools for an informative identification. The most frequently observed dissociation mechanism was sequential loss of NH3, H2O and HCl. Loss of the Pt atom was observed as the final step during the CID process. The formation mechanisms of the observed complexes were explored and experimentally examined. The strongly bound dimeric species, which existed in solution, are assumed to be formed from the clustering of the parent compound and its monohydrated or dihydrated complexes. The role of the electrospray process in the formation of some of the observed ions was also evaluated, and the electrospray ionization-related cold clusters were identified. The previously reported hydrolysis equilibria were tested and subsequently refined via a hydrolysis study, resulting in a renewed mechanistic equilibrium system of cisplatin as proposed from our results. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Analytic information processing style in epilepsy patients.

    Science.gov (United States)

    Buonfiglio, Marzia; Di Sabato, Francesco; Mandillo, Silvia; Albini, Mariarita; Di Bonaventura, Carlo; Giallonardo, Annateresa; Avanzini, Giuliano

    2017-08-01

    Relevant to the study of epileptogenesis is learning processing, given the pivotal role that neuroplasticity assumes in both mechanisms. Recently, evoked potential analyses showed a link between analytic cognitive style and altered neural excitability in both migraine and healthy subjects, regardless of cognitive impairment or psychological disorders. In this study we evaluated the analytic/global and visual/auditory perceptual dimensions of cognitive style in patients with epilepsy. Twenty-five cryptogenic temporal lobe epilepsy (TLE) patients matched with 25 idiopathic generalized epilepsy (IGE) sufferers and 25 healthy volunteers were recruited and participated in three cognitive style tests: the "Sternberg-Wagner Self-Assessment Inventory", the C. Cornoldi test series called AMOS, and the Mariani Learning Style Questionnaire. Our results demonstrate a significant association between analytic cognitive style and both IGE and TLE, with a predominantly auditory and visual analytic style respectively (ANOVA: p values <0.0001). These findings should encourage further research to investigate information processing style and its neurophysiological correlates in epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. The Concept of Information Sharing Behaviors in Complex Organizations: Research in Latvian Enterprises

    Directory of Open Access Journals (Sweden)

    Andrejs Cekuls

    2016-12-01

    The purpose of this paper is to explore the factors influencing information-sharing behaviors in complex organizations. Evaluation of previous studies on the information turnover process and the role of organizational culture in competitive intelligence in the Latvian business environment indicated that employees of Latvian enterprises lack incentives to share information. The tasks of the study were to review the scientific literature and to study the aspects influencing information-sharing habits in complex organizations. For this particular study, the focus group was selected as the most appropriate data collection method for high-quality research. To find out individuals' opinions and attitudes, two focus group discussions were carried out. Members from various industries and with different employment periods were included in the discussion groups. In aggregate, the opinions of employees from 41 different companies were summarized regarding the aspects affecting the process of information sharing in organizations. The results show that the factors influencing the sharing of information are closely related to values: interpersonal trust, organizational trust, organizational identification, support, fairness, etc. The results of the discussions showed that it is important for a manager to be aware of the factors affecting the performance of the organization. To identify the need for changes, a manager should follow events in the environment and analyze the extent to which they affect the performance of the organization. Complexity science suggests that readiness for change emerges when the system is far from equilibrium, and it is this tension that drives the acceptance of change.

  10. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision process model with predicted states is proposed …

  11. Can complex cellular processes be governed by simple linear rules?

    Science.gov (United States)

    Selvarajoo, Kumar; Tomita, Masaru; Tsuchiya, Masa

    2009-02-01

    Complex living systems have shown remarkably well-orchestrated, self-organized, robust, and stable behavior under a wide range of perturbations. However, despite the recent generation of high-throughput experimental datasets, basic cellular processes such as division, differentiation, and apoptosis still remain elusive. One of the key reasons is the lack of understanding of the governing principles of complex living systems. Here, we have reviewed the success of perturbation-response approaches, where, without the requirement of detailed in vivo physiological parameters, the analysis of temporal concentration or activation responses unravels biological network features such as causal relationships of reactant species, regulatory motifs, etc. Our review shows that simple linear rules govern the response behavior of biological networks in an ensemble of cells. It remains puzzling why such simplicity holds in a complex heterogeneous environment. Provided physical reasons can be found for these phenomena, major advances in the understanding of basic cellular processes could be achieved.

  12. Increase in Complexity and Information through Molecular Evolution

    Directory of Open Access Journals (Sweden)

    Peter Schuster

    2016-11-01

    Biological evolution progresses by essentially three different mechanisms: (I) optimization of properties through natural selection in a population of competitors; (II) development of new capabilities through cooperation of competitors caused by catalyzed reproduction; and (III) variation of genetic information through mutation or recombination. Simplified evolutionary processes combine two out of the three mechanisms: Darwinian evolution combines competition (I) and variation (III) and is represented by the quasispecies model, major transitions involve cooperation (II) of competitors (I), and the third combination, cooperation (II) and variation (III), provides new insights into the role of mutations in evolution. A minimal kinetic model based on simple molecular mechanisms for reproduction, catalyzed reproduction and mutation is introduced, cast into ordinary differential equations (ODEs), and analyzed mathematically in the form of its implementation in a flow reactor. Stochastic aspects are investigated through computer simulation of trajectories of the corresponding chemical master equations. The competition-cooperation model, mechanisms (I) and (II), gives rise to selection at low levels of resources and leads to symbiotic cooperation in case the material required is abundant. Accordingly, it provides a kind of minimal system that can undergo a (major) transition. Stochastic effects leading to extinction of the population through self-enhancing oscillations destabilize symbioses of four or more partners. Mutations (III) are not only the basis of change in phenotypic properties but can also prevent extinction provided the mutation rates are sufficiently large. Threshold phenomena are observed for all three combinations: the quasispecies model leads to an error threshold, the competition-cooperation model allows for an identification of a resource-triggered bifurcation with the transition, and for the cooperation-mutation model a kind of stochastic threshold for …
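
    Mechanism (I) alone, selection among competitors in a flow reactor, can be sketched with replicator-style ODEs (a minimal illustration with hypothetical rates; the paper's full model adds catalyzed reproduction and mutation):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Minimal competition model (mechanism I): replication in a flow reactor.
    # dx_i/dt = f_i * x_i - x_i * phi, with phi = sum_j f_j x_j acting as the
    # dilution flux that keeps the total concentration sum(x) = 1.
    f = np.array([1.0, 1.5, 2.0])  # hypothetical replication rate constants

    def replicator(t, x):
        phi = f @ x  # mean fitness = outflow (dilution) flux
        return x * (f - phi)

    x0 = np.array([0.9, 0.09, 0.01])  # the fittest species starts rare
    sol = solve_ivp(replicator, (0, 50), x0, rtol=1e-8)
    print(np.round(sol.y[:, -1], 3))  # selection: fittest wins, approx [0. 0. 1.]
    ```

    Natural selection appears here as the survival of the species with the largest f_i; adding catalytic coupling terms x_i * x_j would turn the same system into the cooperation model discussed in the abstract.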

  13. Some considerations on Bible translation as complex process | Van ...

    African Journals Online (AJOL)

    It is argued that translation is a complex process: meaning is "created" by decoding the source text on several levels (for instance, grammatical, structural, literary, and socio-cultural levels). This "meaning" must then be encoded into the target language by means of the linguistic, literary, and cultural conventions of the target ...

  14. Managing complexity in process digitalisation with dynamic condition response graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Debois, Søren; Slaats, Tijs

    2017-01-01

    … Sadly, it is also witnessed by a number of expensive failed digitalisation projects. In this paper we point to two key problems in state-of-the-art BPM technologies: 1) the use of rigid flow diagrams as the "source code" of process digitalisation is not suitable for managing the complexity of knowledge …

  15. Cueing Complex Animations: Does Direction of Attention Foster Learning Processes?

    Science.gov (United States)

    Lowe, Richard; Boucheix, Jean-Michel

    2011-01-01

    The time course of learners' processing of a complex animation was studied using a dynamic diagram of a piano mechanism. Over successive repetitions of the material, two forms of cueing (standard colour cueing and anti-cueing) were administered either before or during the animated segment of the presentation. An uncued group and two other control…

  16. Methods of Complex Data Processing from Technical Means of Monitoring

    Directory of Open Access Journals (Sweden)

    Serhii Tymchuk

    2017-03-01

    The problem of processing information from different types of monitoring equipment was examined. As a possible solution, the use of generalized methods of information processing was proposed, based on clustering of combined territorial information sources for monitoring and on a frame model of the knowledge base for identification of monitoring objects. The clustering methods were formed on the basis of the Lance-Williams hierarchical agglomerative procedure using the Ward metric. The frame model of the knowledge base was built using the tools of object-oriented modeling.
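
    The clustering step can be sketched with SciPy, whose `linkage` routine implements the Lance-Williams agglomerative scheme; `method="ward"` merges, at each step, the pair of clusters giving the smallest increase in within-cluster variance. The sensor coordinates below are hypothetical:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    # Hypothetical territorially grouped monitoring sources:
    # two spatial clusters of sensors in (x, y) coordinates.
    rng = np.random.default_rng(1)
    sources = np.vstack([rng.normal(0.0, 0.1, size=(5, 2)),
                         rng.normal(3.0, 0.1, size=(5, 2))])

    # Ward linkage via the Lance-Williams update, then cut into 2 clusters
    Z = linkage(sources, method="ward")
    labels = fcluster(Z, t=2, criterion="maxclust")
    print(labels)  # two territorial clusters, e.g. [1 1 1 1 1 2 2 2 2 2]
    ```

    Each merged cluster would then be matched against the frame model of the knowledge base to identify the monitored object it represents.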

  17. Patterns of patient safety culture: a complexity and arts-informed project of knowledge translation.

    Science.gov (United States)

    Mitchell, Gail J; Tregunno, Deborah; Gray, Julia; Ginsberg, Liane

    2011-01-01

    The purpose of this paper is to describe patterns of patient safety culture that emerged from an innovative collaboration among health services researchers and fine arts colleagues. The group engaged in an arts-informed knowledge translation project to produce a dramatic expression of patient safety culture research for inclusion in a symposium. Scholars have called for a deeper understanding of the complex interrelationships among structure, process and outcomes relating to patient safety. Four patterns of patient safety culture--blinding familiarity, unyielding determination, illusion of control and dismissive urgency--are described with respect to how they informed creation of an arts-informed project for knowledge translation.

  18. A tool for filtering information in complex systems

    Science.gov (United States)

    Tumminello, M.; Aste, T.; Di Matteo, T.; Mantegna, R. N.

    2005-07-01

    We introduce a technique to filter out complex data sets by extracting a subgraph of representative links. Such a filtering can be tuned up to any desired level by controlling the genus of the resulting graph. We show that this technique is especially suitable for correlation-based graphs, giving filtered graphs that preserve the hierarchical organization of the minimum spanning tree but containing a larger amount of information in their internal structure. In particular, in the case of planar filtered graphs (genus equal to 0), triangular loops and four-element cliques are formed. The application of this filtering procedure to 100 stocks in the U.S. equity markets shows that such loops and cliques have important and significant relationships with the market structure and properties. This paper was submitted directly (Track II) to the PNAS office. Abbreviations: MST, minimum spanning tree; PMFG, Planar Maximally Filtered Graph; r-clique, clique of r elements.
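
    The first stage of such filtering, extracting the MST from a correlation matrix, can be sketched as follows (toy data; the PMFG extension additionally re-inserts links in order of decreasing correlation while keeping the graph planar):

    ```python
    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree

    # Hypothetical toy return series for 4 assets; assets 0 and 1 share a factor.
    rng = np.random.default_rng(2)
    common = rng.normal(size=500)
    returns = np.column_stack([
        common + 0.3 * rng.normal(size=500),
        common + 0.3 * rng.normal(size=500),
        rng.normal(size=500),
        rng.normal(size=500),
    ])

    corr = np.corrcoef(returns, rowvar=False)
    dist = np.sqrt(2.0 * (1.0 - corr))  # standard correlation-to-distance map
    mst = minimum_spanning_tree(dist)   # keeps only the n-1 strongest links
    edges = np.transpose(mst.nonzero())
    print(edges)  # the strongly correlated 0-1 link survives the filtering
    ```

    The MST here plays the role of the genus-0 backbone the abstract describes; raising the allowed genus admits the extra loops and cliques that carry the additional structural information.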

  19. Information Processing Features Can Detect Behavioral Regimes of Dynamical Systems

    Directory of Open Access Journals (Sweden)

    Rick Quax

    2018-01-01

    In dynamical systems, local interactions between dynamical units generate correlations which are stored and transmitted throughout the system, generating the macroscopic behavior. However, a framework to quantify exactly how these correlations are stored, transmitted, and combined at the microscopic scale is missing. Here we propose to characterize the notion of "information processing" based on all possible Shannon mutual information quantities between a future state and all possible sets of initial states. We apply it to the 256 elementary cellular automata (ECA), which are the simplest possible dynamical systems, exhibiting behaviors ranging from simple to complex. Our main finding is that only a few information features are needed for full predictability of the systemic behavior and that the "information synergy" feature is always the most predictive. Finally, we apply the idea to foreign exchange (FX) and interest-rate swap (IRS) time-series data. We find an effective "slowing down" leading indicator for the 2008 financial crisis in all three markets when applied to the information features, as opposed to using the data itself directly. Our work suggests that the proposed characterization of the local information processing of units may be a promising direction for predicting emergent systemic behaviors.
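
    The two ingredients, ECA dynamics and mutual information between sets of initial cells and a future state, can be sketched directly. This is a plug-in estimator on rule 110 for illustration only; the paper's full feature set and synergy measure go beyond it:

    ```python
    import numpy as np
    from collections import Counter
    from math import log2

    def eca_step(state, rule):
        """One update of an elementary cellular automaton (periodic boundary)."""
        left, right = np.roll(state, 1), np.roll(state, -1)
        idx = 4 * left + 2 * state + right          # Wolfram neighborhood code
        table = np.array([(rule >> k) & 1 for k in range(8)])
        return table[idx]

    def mutual_information(xs, ys):
        """Plug-in Shannon MI (bits) between two discrete sample sequences."""
        joint = Counter(zip(xs, ys))
        px, py, n = Counter(xs), Counter(ys), len(xs)
        return sum(c / n * log2((c / n) / (px[x] / n * py[y] / n))
                   for (x, y), c in joint.items())

    rng = np.random.default_rng(3)
    samples = [tuple(rng.integers(0, 2, 7)) for _ in range(4000)]
    future = [eca_step(np.array(s), 110)[3] for s in samples]  # center cell, t+1

    # MI between the future center cell and its own initial neighborhood,
    # versus a distant cell that cannot influence it in a single step:
    print(mutual_information([s[2:5] for s in samples], future))  # substantial
    print(mutual_information([s[0] for s in samples], future))    # near zero
    ```

    Extending the second argument to all subsets of initial cells yields the full family of information features from which the synergy measure is derived.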

  20. Impact of delayed information in sub-second complex systems

    Science.gov (United States)

    Manrique, Pedro D.; Zheng, Minzhang; Johnson Restrepo, D. Dylan; Hui, Pak Ming; Johnson, Neil F.

    What happens when you slow down the delivery of information in large-scale complex systems that operate faster than the blink of an eye? This question just adopted immediate commercial, legal and political importance following U.S. regulators' decision to allow an intentional 350 microsecond delay to be added in the ultrafast network of financial exchanges. However there is still no scientific understanding available to policymakers of the potential system-wide impact of such delays. Here we take a first step in addressing this question using a minimal model of a population of competing, heterogeneous, adaptive agents which has previously been shown to produce similar statistical features to real markets. We find that while certain extreme system-level behaviors can be prevented by such delays, the duration of others is increased. This leads to a highly non-trivial relationship between delays and system-wide instabilities which warrants deeper empirical investigation. The generic nature of our model suggests there should be a fairly wide class of complex systems where such delay-driven extreme behaviors can arise, e.g. sub-second delays in brain function possibly impacting individuals' behavior, and sub-second delays in navigational systems potentially impacting the safety of driverless vehicles.

  2. Visual perception of complex shape-transforming processes.

    Science.gov (United States)

    Schmidt, Filipp; Fleming, Roland W

    2016-11-01

    Morphogenesis, or the origin of complex natural form, has long fascinated researchers from practically every branch of science. However, we know practically nothing about how we perceive and understand such processes. Here, we measured how observers visually infer shape-transforming processes. Participants viewed pairs of objects ('before' and 'after' a transformation) and identified points that corresponded across the transformation. This allowed us to map out in spatial detail how perceived shape and space were affected by the transformations. Participants' responses were strikingly accurate and mutually consistent for a wide range of non-rigid transformations including complex growth-like processes. A zero-free-parameter model based on matching and interpolating/extrapolating the positions of high-salience contour features predicts the data surprisingly well, suggesting observers infer spatial correspondences relative to key landmarks. Together, our findings reveal the operation of specific perceptual organization processes that make us remarkably adept at identifying correspondences across complex shape-transforming processes by using salient object features. We suggest that these abilities, which allow us to parse and interpret the causally significant features of shapes, are invaluable for many tasks that involve 'making sense' of shape. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Ethnographic methods for process evaluations of complex health behaviour interventions.

    Science.gov (United States)

    Morgan-Trimmer, Sarah; Wood, Fiona

    2016-05-04

    This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.

  4. [Complex automatic data processing in multi-profile hospitals].

    Science.gov (United States)

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is a key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process, and improving the use of resources. Even the limited experience with complex computerization at the Botkin Hospital indicates that, owing to the automated system, the quality of data processing is being improved, a high level of patient examination is being provided, young specialists are being trained more quickly, and conditions are being created for the continuing education of physicians through the analysis of their own activity. At large hospitals, a complex solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide display network and a hospital-wide data bank is the most promising form of computerization.

  5. Complex processing of rubber waste through energy recovery

    Directory of Open Access Journals (Sweden)

    Roman Smelík

    2015-12-01

    Full Text Available This article deals with applied solutions for the complex processing of rubber waste through energy recovery. It deals specifically with a solution that could maximize the use of all rubber waste while creating no additional waste whose disposal would be expensive and dangerous for the environment. The project is economically viable and energy self-sufficient. The outputs of the process could replace natural gas and crude oil products. Another part of the process is the separation of metals, which can be returned to secondary metallurgical production.

  6. Advances in intelligent process-aware information systems concepts, methods, and technologies

    CERN Document Server

    Oberhauser, Roy; Reichert, Manfred

    2017-01-01

    This book provides a state-of-the-art perspective on intelligent process-aware information systems and presents chapters on specific facets and approaches applicable to such systems. Further, it highlights novel advances and developments in various aspects of intelligent process-aware information systems and business process management systems. Intelligence capabilities are increasingly being integrated into or created in many of today’s software products and services. Process-aware information systems provide critical computing infrastructure to support the various processes involved in the creation and delivery of business products and services. Yet the integration of intelligence capabilities into process-aware information systems is a non-trivial yet necessary evolution of these complex systems. The book’s individual chapters address adaptive process management, case management processes, autonomically-capable processes, process-oriented information logistics, process recommendations, reasoning over ...

  7. Processing of Complex Auditory Patterns in Musicians and Nonmusicians

    OpenAIRE

    Boh, Bastiaan; Herholz, Sibylle C.; Lappe, Claudia; Pantev, Christo

    2011-01-01

    In the present study we investigated the capacity of the memory store underlying the mismatch negativity (MMN) response in musicians and nonmusicians for complex tone patterns. While previous studies have focused either on the kind of information that can be encoded or on the decay of the memory trace over time, we studied capacity in terms of the length of tone sequences, i.e., the number of individual tones that can be fully encoded and maintained. By means of magnetoencephalography (MEG) w...

  8. Mathematics Education as a Proving-Ground for Information-Processing Theories.

    Science.gov (United States)

    Greer, Brian, Ed.; Verschaffel, Lieven, Ed.

    1990-01-01

    Five papers discuss the current and potential contributions of information-processing theory to our understanding of mathematical thinking as those contributions affect the practice of mathematics education. It is concluded that information-processing theories need to be supplemented in various ways to more adequately reflect the complexity of…

  9. Aligning Business Process Quality and Information System Quality

    OpenAIRE

    Heinrich, Robert

    2013-01-01

    Business processes and information systems mutually affect each other in non-trivial ways. Frequently, the business process design and the information system design are not well aligned. This means that business processes are designed without taking the information system impact into account, and vice versa. Missing alignment at design time often results in quality problems at runtime, such as large response times of information systems, large process execution times, overloaded information s...

  10. Statistical physics of networks, information and complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum, from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development, and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.

  11. Acquisition of Computers That Process Corporate Information

    National Research Council Canada - National Science Library

    Gimble, Thomas

    1999-01-01

    The Secretary of Defense announced the Corporate Information Management initiative on November 16, 1990, to establish a DoD-wide concept for managing computer, communications, and information management functions...

  12. Hyperbolic mapping of complex networks based on community information

    Science.gov (United States)

    Wang, Zuxi; Li, Qingguang; Jin, Fengdong; Xiong, Wei; Wu, Yao

    2016-08-01

    To improve the hyperbolic mapping methods in terms of both accuracy and running time, a novel mapping method called Community and Hyperbolic Mapping (CHM) is proposed in this paper, based on community information. First, an index called Community Intimacy (CI) is presented to measure the adjacency relationship between the communities, based on which a community ordering algorithm is introduced. According to the proposed Community-Sector hypothesis, which supposes that most nodes of one community gather in the same sector of hyperbolic space, CHM maps the ordered communities into hyperbolic space, and the angular coordinates of nodes are then randomly initialized within the sector they belong to. At this point all the network nodes have been mapped to hyperbolic space, and the initialized angular coordinates can be optimized by employing the information of all nodes, which greatly improves the algorithm's precision. By applying the proposed dual-layer angle sampling method in the optimization procedure, CHM reduces the time complexity to O(n²). The experiments show that our algorithm outperforms the state-of-the-art methods.
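
    Methods of this kind place each node at polar coordinates (r, θ) in the hyperbolic disk. As a minimal illustration (this is the standard native-representation distance formula used by hyperbolic embedding methods, not code from the paper), the quantity such embeddings optimize is given by the hyperbolic law of cosines:

```python
from math import cosh, sinh, acosh, cos, pi

def hyperbolic_distance(r1, theta1, r2, theta2):
    """Hyperbolic law of cosines in the native (polar) representation:
    cosh d = cosh r1 cosh r2 - sinh r1 sinh r2 cos(dtheta)."""
    # angular separation folded into [0, pi]
    dtheta = pi - abs(pi - abs(theta1 - theta2) % (2 * pi))
    if dtheta == 0.0:
        return abs(r1 - r2)  # radially aligned nodes
    arg = cosh(r1) * cosh(r2) - sinh(r1) * sinh(r2) * cos(dtheta)
    return acosh(max(arg, 1.0))  # guard against rounding below 1

d_aligned = hyperbolic_distance(1.0, 0.0, 3.0, 0.0)   # same angle
d_opposite = hyperbolic_distance(1.0, 0.0, 3.0, pi)   # opposite angles
```

    Nodes at the same angle are separated by |r1 - r2|, while angularly opposite nodes at radii 1 and 3 end up at distance r1 + r2 = 4, since cosh r1 cosh r2 + sinh r1 sinh r2 = cosh(r1 + r2).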

  13. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    Science.gov (United States)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes; for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics; and finally for multinomial mixture processes.
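
    The degenerate functional and one of the history-dependent examples are easy to make concrete. The sketch below computes H(p) = -∑_i p_i log p_i (base-2 logarithm chosen by us) and the exact visiting probabilities of a sample-space-reducing process, which follow Zipf's law v(i) = 1/i, a well-known property of SSR processes and the source of their power-law statistics:

```python
from math import log2

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2(p_i): the functional shared by the three
    entropy concepts for ergodic/multinomial processes (in bits)."""
    assert abs(sum(p) - 1.0) < 1e-9
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def ssr_visit_probs(N):
    """Exact visiting probabilities of a sample-space-reducing process
    started in state N: from state k it jumps uniformly to one of the
    states 1..k-1 until state 1 is reached. Returns [v(1), ..., v(N-1)]."""
    v = [0.0] * (N + 1)
    v[N] = 1.0
    for k in range(N, 1, -1):
        for j in range(1, k):
            v[j] += v[k] / (k - 1)  # spread k's visit mass uniformly below
    return v[1:N]  # v(i) = 1/i: Zipf's law

uniform_H = shannon_entropy([0.25, 0.25, 0.25, 0.25])  # maximal for 4 outcomes
probs = ssr_visit_probs(6)
```

    The entropy of the uniform four-outcome distribution is 2 bits, and the SSR visiting probabilities come out as 1, 1/2, 1/3, 1/4, 1/5.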

  14. The process of urban regeneration in context of information society

    Directory of Open Access Journals (Sweden)

    Bazik Dragana

    2006-01-01

    Full Text Available This paper deals with the concept of innovating the urban regeneration process in the context of transformations generated by information-communication technologies. On one hand, Serbia has exceptional human potential: 13,000 graduates each year, and a 42% share of the population speaking English, the largest among all Eastern and Central European countries. This forms a basis for formulating strategies for information society development in Serbia, for economic adjustments based upon knowledge, and for tracing the way to a future knowledge society, i.e. eEurope 2020. On the other hand, we are witnessing the intensive development of huge complexes of mega- and hypermarkets as the presently dominant way of regenerating our city spaces. At the same time, experiences from other locations point to the deterioration of cities' urban identity as a consequence of the infiltration of global capital and of the development within the urban tissue of huge complexes of multi-national companies. Aiming to overcome the mistakes portrayed by international experience, as well as potential oversights that may occur because of routine and mismatch between certain phases of the sustainable development process, this paper emphasizes the importance of an integral evaluation of information society development trends and the spatial aspects of urban regeneration. It is essential to adjust devastated urban spaces, as artifacts of one technological era, to the actual information era with an eye to the future digital knowledge era, i.e. to plan, design and develop according to new technological requirements and possibilities, for new working places and a new quality of living.

  15. Development of technical information processing system (VII)

    International Nuclear Information System (INIS)

    Kim, Tae Whan; Choi, Kwang; Oh, Jeong Hoon; Jeong, Hyun Sook; Keum, Jong Yong

    1995-12-01

    The goal of this project is to establish an integrated environment focused on enhanced information services to researchers through the provision of acquisition information, a key-phrase retrieval function, and journal content information linked with the various subsystems already developed. The results of the project are as follows. 1. It is possible to serve information on unreceivable materials among required materials through the system. 2. Retrieval efficiency is increased by the addition of a key-phrase retrieval function. 3. The speed of information service is enhanced by providing the journal contents of each issue received, and the performance of the contents service is improved. 4. It is possible to acquire, store, and serve the technical information needed in R and D comprehensively and systematically through the development of a total system linked with the various subsystems required for technical information management and service. 21 refs. (Author)

  16. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow; and one in the context of on-the-fly gesture recognition.
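
    The core CEP idea above, recognizing a pattern over a potentially infinite stream of (model change) events, can be sketched without any of the Viatra machinery. The sketch below is a generic illustration only (the event names and the time window are invented, and this is not the Viatra pattern language): it detects every occurrence of a `first` event followed by a `then` event within a time window.

```python
def detect_sequence(stream, first, then, window):
    """Minimal complex-event-processing sketch: emit a match whenever an
    event of type `then` follows an event of type `first` within `window`
    time units. Events are (timestamp, type) pairs in timestamp order."""
    pending = []   # timestamps of open partial matches awaiting `then`
    matches = []
    for ts, kind in stream:
        # discard partial matches whose window has expired
        pending = [t0 for t0 in pending if ts - t0 <= window]
        if kind == then:
            matches.extend((t0, ts) for t0 in pending)
            pending = []
        if kind == first:
            pending.append(ts)
    return matches

stream = [(1, "created"), (2, "modified"), (6, "created"),
          (7, "deleted"), (30, "deleted")]
matches = detect_sequence(stream, "created", "deleted", window=10)
```

    A real CEP engine compiles such patterns into automata and handles out-of-order events; the point here is only the shape of the computation a reactive rule engine reacts to.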

  17. NEA, Nuclear law and information processing

    International Nuclear Information System (INIS)

    Reyners, P.

    1977-01-01

    NEA has for many years now been collating information on, and analysing, laws and regulations on the peaceful uses of nuclear energy, and this work has resulted in a series of publications. However, as seen by the multiplication of computer-based legal information centres, both at national and international level, conventional information systems are no longer adequate to deal with the increasing volume of information and with users' needs. In view of the particular aspects of nuclear law and of its own available resources, NEA has endeavoured to make the best possible use of existing structures by opting for participation in the IAEA International Nuclear Information System rather than by creating a specialised centre. Before becoming operational, the arrangements concluded between NEA and IAEA required that the INIS rules be altered somewhat to take account of the specific problems raised by the treatment of legal literature and also to improve the quality of information provided to users. (auth.) [fr

  18. Complex service recovery processes: how to avoid triple deviation

    OpenAIRE

    Edvardsson, Bo; Tronvoll, Bård; Höykinpuro, Ritva

    2011-01-01

    Purpose – This article seeks to develop a new framework to outline factors that influence the resolution of unfavourable service experiences as a result of double deviation. The focus is on understanding and managing complex service recovery processes. Design/methodology/approach – An inductive, explorative and narrative approach was selected. Data were collected in the form of narratives from the field through interviews with actors at various levels in organisations as well as with custo...

  19. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

    Previously described algorithms for the automated processing of complex α-spectra of actinides, using the Ehlektronika D3-28 computer line connected to an ICA-070 multichannel amplitude pulse analyzer, were implemented. The developed program makes it possible to calculate peak intensities and the relative isotope content, to conduct energy calibration of spectra, to calculate the peak center of gravity and energy resolution, and to perform integral counting in a particular part of the spectrum. The error of the automated processing method depends on the degree of spectrum complexity and lies within the limits of 1-12%. 8 refs.; 4 figs.; 2 tabs
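
    The individual processing steps named in the abstract are standard spectrometry computations. The sketch below is illustrative only (toy channel counts and invented calibration points, not the original D3-28 program): a peak center of gravity, integral counting over a region of interest, and a two-point linear energy calibration.

```python
def peak_centroid(channels, counts):
    """Center of gravity of a peak: count-weighted mean channel."""
    return sum(c * n for c, n in zip(channels, counts)) / sum(counts)

def integral_counts(channels, counts, lo, hi):
    """Integral counting over the region of interest [lo, hi] (inclusive)."""
    return sum(n for c, n in zip(channels, counts) if lo <= c <= hi)

def energy_calibration(ch1, e1, ch2, e2):
    """Linear energy calibration E = a*ch + b from two reference peaks."""
    a = (e2 - e1) / (ch2 - ch1)
    b = e1 - a * ch1
    return lambda ch: a * ch + b

channels = [100, 101, 102, 103, 104]
counts = [0, 10, 20, 10, 0]                            # a symmetric toy peak
centroid = peak_centroid(channels, counts)             # 102.0
roi_sum = integral_counts(channels, counts, 101, 103)  # 40 counts in the ROI
cal = energy_calibration(100, 5.0, 200, 6.0)           # two hypothetical lines, MeV
```

    Energy resolution would follow from the peak's full width at half maximum expressed through the same calibration.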

  20. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

    The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time is confirmed for producing the maximum amount of the target product. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal quenching mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
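
    The abstract does not give the equation system, so as a stand-in, here is the generic numerical-integration machinery applied to a toy quenching law (Newton cooling, with an assumed rate k and ambient temperature; both values are invented). A classical fourth-order Runge-Kutta step illustrates how a quenching-rate parameter enters such a model:

```python
from math import exp

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, h, steps):
    t, y = t0, y0
    for _ in range(steps):
        y = rk4_step(f, t, y, h)
        t += h
    return y

# toy quenching law: Newton cooling dT/dt = -k (T - T_env), assumed values
k, T_env, T0 = 2.0, 300.0, 3000.0

def cooling(t, T):
    return -k * (T - T_env)

T_final = integrate(cooling, 0.0, T0, h=0.01, steps=100)   # temperature at t = 1.0
T_exact = T_env + (T0 - T_env) * exp(-k * 1.0)             # analytic solution
```

    With h = 0.01 the numerical result matches the analytic solution T(t) = T_env + (T0 - T_env)·exp(-kt) to high accuracy, which is the kind of check one would run before using such a model for control or prediction.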

  1. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytics, a networked distributed Markov decision process model with state prediction is proposed as the sequential decision model. A Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
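
    The paper's model (networked distributed MDPs with state prediction over real traffic data) is more elaborate, but the Q-learning core can be sketched on an invented two-state toy intersection: switching the signal phase clears congestion, while needless switching creates it. Everything below (states, rewards, hyperparameters) is our illustrative choice, not the paper's setup.

```python
import random

def q_learning(states, actions, step, reward,
               episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in states for a in actions}
    for _ in range(episodes):
        s = rng.choice(states)
        for _ in range(20):
            a = (rng.choice(actions) if rng.random() < eps
                 else max(actions, key=lambda a: Q[(s, a)]))
            s2 = step(s, a)
            # standard temporal-difference update
            Q[(s, a)] += alpha * (reward(s, a, s2)
                                  + gamma * max(Q[(s2, b)] for b in actions)
                                  - Q[(s, a)])
            s = s2
    return Q

states, actions = ["free", "congested"], ["hold", "switch"]

def step(s, a):
    if s == "congested":
        return "free" if a == "switch" else "congested"
    return "congested" if a == "switch" else "free"  # needless switching jams traffic

def reward(s, a, s2):
    return 1.0 if s2 == "free" else -1.0

Q = q_learning(states, actions, step, reward)
policy = {s: max(actions, key=lambda a: Q[(s, a)]) for s in states}
```

    The learned greedy policy holds the phase when traffic is free and switches when it is congested; the proactive element of the paper would additionally feed predicted (not just observed) states into the decision.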

  2. Emotional Picture and Word Processing: An fMRI Study on Effects of Stimulus Complexity

    Science.gov (United States)

    Schlochtermeier, Lorna H.; Kuchinke, Lars; Pehrs, Corinna; Urton, Karolina; Kappelhoff, Hermann; Jacobs, Arthur M.

    2013-01-01

    Neuroscientific investigations regarding aspects of emotional experiences usually focus on one stimulus modality (e.g., pictorial or verbal). Similarities and differences in the processing between the different modalities have rarely been studied directly. The comparison of verbal and pictorial emotional stimuli often reveals a processing advantage of emotional pictures in terms of larger or more pronounced emotion effects evoked by pictorial stimuli. In this study, we examined whether this picture advantage refers to general processing differences or whether it might partly be attributed to differences in visual complexity between pictures and words. We first developed a new stimulus database comprising valence and arousal ratings for more than 200 concrete objects representable in different modalities including different levels of complexity: words, phrases, pictograms, and photographs. Using fMRI we then studied the neural correlates of the processing of these emotional stimuli in a valence judgment task, in which the stimulus material was controlled for differences in emotional arousal. No superiority for the pictorial stimuli was found in terms of emotional information processing with differences between modalities being revealed mainly in perceptual processing regions. While visual complexity might partly account for previously found differences in emotional stimulus processing, the main existing processing differences are probably due to enhanced processing in modality specific perceptual regions. We would suggest that both pictures and words elicit emotional responses with no general superiority for either stimulus modality, while emotional responses to pictures are modulated by perceptual stimulus features, such as picture complexity. PMID:23409009

  3. Imperfect Information in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.

    2007-01-01

    The process of designing high-quality software systems is one of the major issues in software engineering research. Over the years, this has resulted in numerous design methods, each with specific qualities and drawbacks. For example, the Rational Unified Process is a comprehensive design process,

  4. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

    Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information and automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  5. Essays on Imperfect Information Processing in Economics

    NARCIS (Netherlands)

    S.S. Ficco (Stefano)

    2007-01-01

    Economic agents generally operate in uncertain environments and, prior to making decisions, invest time and resources to collect useful information. Consumers compare the prices charged by different firms before purchasing a product. Politicians gather information from different

  6. Analytic Hierarchy Process for Personalising Environmental Information

    Science.gov (United States)

    Kabassi, Katerina

    2014-01-01

    This paper presents how a Geographical Information System (GIS) can be incorporated in an intelligent learning software system for environmental matters. The system is called ALGIS and incorporates the GIS in order to present effectively information about the physical and anthropogenic environment of Greece in a more interactive way. The system…

  7. The effects of mild and severe traumatic brain injury on speed of information processing as measured by the computerized tests of information processing (CTIP).

    Science.gov (United States)

    Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra

    2007-01-01

    In spite of the fact that reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information that is processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect), and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, findings suggest that the CTIP is an easy to administer and sensitive measure of information processing speed.

  8. Novel Complexity Indicator of Manufacturing Process Chains and Its Relations to Indirect Complexity Indicators

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2017-01-01

    Manufacturing systems can be considered as networks of machines/workstations in which parts are produced in a flow-shop or job-shop environment. Such a network of machines/workstations can be depicted as a graph, with machines as nodes and material flows between the nodes as links. The aim of this paper is to use sequences of operations and the machine network to measure the static complexity of manufacturing processes. To this end, existing approaches to measuring the static complexity of manufacturing systems are analyzed and compared. The competing complexity indicators were then tested on two different manufacturing layout examples. A subsequent analysis showed the relevant potential of the proposed method.
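
    The abstract does not spell out the indicator itself, but one common static-complexity proxy for such machine networks is the Shannon entropy of the degree distribution of the graph. A minimal Python sketch (the flow-shop chain and job-shop hub below are hypothetical example networks, not the paper's test layouts):

    ```python
    import math
    from collections import Counter

    def degree_entropy(edges):
        """Shannon entropy of the degree distribution of an undirected
        machine network -- one common proxy for static structural complexity."""
        degree = Counter()
        for u, v in edges:
            degree[u] += 1
            degree[v] += 1
        counts = Counter(degree.values())      # degree value -> number of nodes
        n = sum(counts.values())
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Hypothetical flow-shop: a simple chain M1 -> M2 -> M3 -> M4
    chain = [("M1", "M2"), ("M2", "M3"), ("M3", "M4")]
    # Hypothetical job-shop: one hub machine feeding three part routes
    hub = [("M1", "M2"), ("M1", "M3"), ("M1", "M4")]

    print(degree_entropy(chain))  # 1.0 (two degree values, equally frequent)
    print(degree_entropy(hub))
    ```

    Comparing such entropy values across alternative layouts is one way to rank their structural complexity, in the spirit of the indicators the paper surveys.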

  9. New Product Development (Npd) Process In Subsidiary: Information Perspectives

    OpenAIRE

    Firmanzah

    2008-01-01

    Information is an important resource for the new product development (NPD) process in subsidiaries. However, research analyzing the NPD process from an information perspective in the subsidiary context is still lacking. This exploratory research examined 8 cases of NPD processes in consumer-goods subsidiaries operating in the Indonesian market. Three types of information were identified and analyzed in the NPD process: global, regional and local information. The result of this research ...

  10. Influence Business Process On The Quality Of Accounting Information System

    OpenAIRE

    Meiryani; Muhammad Syaifullah

    2015-01-01

    Abstract The purpose of this study was to determine the influence of business processes on the quality of the accounting information system. This theoretical study considered the role of business processes in the quality of the accounting information system, using secondary data collection. The results showed that business processes have a signifi...

  11. Integrating technology into complex intervention trial processes: a case study.

    Science.gov (United States)

    Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica

    2016-11-17

    Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. 
A combination of commercially available hardware and a bespoke online database

  12. Pure sources and efficient detectors for optical quantum information processing

    Science.gov (United States)

    Zielnicki, Kevin

    Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on

  13. Process control using modern systems of information processing

    International Nuclear Information System (INIS)

    Baldeweg, F.

    1984-01-01

    Modern digital automation techniques allow the application of demanding types of process control. These types of process control are characterized by their belonging to higher levels in a multilevel model. Functional and technical aspects of the performance of digital automation plants are presented and explained. A modern automation system is described considering special procedures of process control (e.g. real time diagnosis)

  14. Information Systems to Support a Decision Process at Stanford.

    Science.gov (United States)

    Chaffee, Ellen Earle

    1982-01-01

    When a rational decision process is desired, information specialists can contribute information and also contribute to the process in which that information is used, thereby promoting rational decision-making. The contribution of Stanford's information specialists to rational decision-making is described. (MLW)

  15. Cognitive Structures in Vocational Information Processing and Decision Making.

    Science.gov (United States)

    Nevill, Dorothy D.; And Others

    1986-01-01

    Tested the assumptions that the structural features of vocational schemas affect vocational information processing and career self-efficacy. Results indicated that effective vocational information processing was facilitated by well-integrated systems that processed information along fewer dimensions. The importance of schematic organization on the…

  16. Dynamics of analyst forecasts and emergence of complexity: Role of information disparity.

    Directory of Open Access Journals (Sweden)

    Chansoo Kim

    We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important for a better understanding of the financial market. Carrying out a big-data analysis of analyst forecast data from I/B/E/S spanning nearly thirty years, we find skew distributions as evidence of emerging complexity, and show how information asymmetry or disparity affects how financial analysts form their forecasts. Here regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. We find that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts that incorporates interactions between analysts. Explaining the empirical data on analyst forecasts well, the model provides an appealing instance of understanding social phenomena from the perspective of complex systems.
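
    The authors' Yule-based model is not specified in the abstract; the sketch below is a generic Yule-type process (innovation plus size-proportional growth), which is the standard mechanism producing the heavy-tailed size distributions mentioned. The parameter rho and the update rule are illustrative assumptions, not the paper's calibration:

    ```python
    import random

    def yule_simulation(steps, rho=2.0, seed=42):
        """Minimal Yule-type process: at each step either a new group is born
        (probability proportional to rho) or an existing group grows with
        probability proportional to its current size (preferential growth)."""
        rng = random.Random(seed)
        sizes = [1]                           # start with one group of size 1
        for _ in range(steps):
            total = sum(sizes)
            if rng.random() < rho / (rho + total):
                sizes.append(1)               # innovation: new group
            else:
                # pick an existing group with probability proportional to size
                r = rng.uniform(0, total)
                acc = 0.0
                for i, s in enumerate(sizes):
                    acc += s
                    if r <= acc:
                        sizes[i] += 1
                        break
        return sizes

    sizes = yule_simulation(5000)
    sizes.sort(reverse=True)
    print(len(sizes), sizes[:5])  # a few large groups dominate (heavy tail)
    ```

    Each step adds exactly one unit, so the total mass is steps + 1; the skewness of the resulting sizes is the qualitative signature the paper fits with log-normal and power-law forms.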

  17. Can Intrinsic Fluctuations Increase Efficiency in Neural Information Processing?

    Science.gov (United States)

    Liljenström, Hans

    2003-05-01

    All natural processes are accompanied by fluctuations, characterized as noise or chaos. Biological systems, which have evolved during billions of years, are likely to have adapted, not only to cope with such fluctuations, but also to make use of them. We investigate how the complex dynamics of the brain, including oscillations, chaos and noise, can affect the efficiency of neural information processing. In particular, we consider the amplification and functional role of internal fluctuations. Using computer simulations of a neural network model of the olfactory cortex and hippocampus, we demonstrate how microscopic fluctuations can result in global effects at the network level. We show that the rate of information processing in associative memory tasks can be maximized for optimal noise levels, analogous to stochastic resonance phenomena. Noise can also induce transitions between different dynamical states, which could be of significance for learning and memory. A chaotic-like behavior, induced by noise or by an increase in neuronal excitability, can enhance system performance if it is transient and converges to a limit cycle memory state. We speculate whether this dynamical behavior perhaps could be related to (creative) thinking.
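
    The noise-optimal processing reported here is analogous to stochastic resonance, which can be illustrated independently of the authors' cortex model with a bare threshold unit: a subthreshold periodic input is never detected without noise, is tracked best at moderate noise, and is washed out again at high noise. All parameters below are illustrative, not taken from the paper:

    ```python
    import math
    import random

    def detection_score(noise_amp, n=20000, seed=1):
        """Correlation between a subthreshold periodic input and the spike
        output of a simple threshold unit, as a function of noise amplitude."""
        rng = random.Random(seed)
        theta = 1.0                                       # firing threshold
        score = 0.0
        for t in range(n):
            signal = 0.5 * math.sin(2 * math.pi * t / 100)  # subthreshold
            spike = 1.0 if signal + rng.gauss(0, noise_amp) > theta else 0.0
            score += spike * signal
        return score / n

    quiet = detection_score(0.0)        # no noise: signal never crosses threshold
    noisy = detection_score(0.5)        # moderate noise: crossings track the signal
    very_noisy = detection_score(10.0)  # excess noise washes the correlation out
    print(quiet, noisy, very_noisy)
    ```

    The non-monotonic dependence of the score on noise amplitude is the stochastic-resonance signature the abstract refers to.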

  18. RATING MODELS AND INFORMATION TECHNOLOGIES APPLICATION FOR MANAGEMENT OF ADMINISTRATIVE-TERRITORIAL COMPLEXES

    Directory of Open Access Journals (Sweden)

    O. M. Pshinko

    2016-12-01

    Purpose. The paper aims to develop rating models and related information technologies designed to resolve the tasks of strategic planning of the development of administrative and territorial units, as well as the tasks of multi-criteria control of the operation of inhomogeneous multiparameter objects. Methodology. When solving problems of strategic planning of administrative and territorial development and of managing heterogeneous classes of controlled objects, a set of coordinated methods is used: analysis of the multi-criteria properties of the objects of planning and management, diagnostics of state parameters, and forecasting and management of complex systems of different classes. The states of these systems are estimated by sets of quality indicators of differing kinds, and each system is represented by an individual model of its operation process. A new information technology is proposed and created to implement the strategic planning and management tasks. This technology uses procedures for solving typical tasks that are implemented in MS SQL Server. Findings. A new approach to developing models for the analysis and management of classes of complex systems based on ratings has been proposed. Rating models for the analysis of multi-criteria, multiparameter systems have been developed; these systems are managed on the basis of current and predicted state parameters under a non-uniform distribution of resources. A procedure for the sensitivity analysis of changes in the rating model to an inhomogeneous distribution of resource parameters has been developed. An information technology for strategic planning and management of heterogeneous classes of objects based on the rating model has been created. Originality. This article proposes a new approach that uses rating indicators as a general model for strategic planning of the development and management of heterogeneous objects that can be characterized by sets of parameters measured on different scales

  19. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  20. Overall analysis of meteorological information in the Daeduk nuclear complex

    International Nuclear Information System (INIS)

    Kim, Eun Han; Lee, Yung Bok; Han, Moon Heui; Suh, Kyung Suk; Hwang Won Tae

    1995-01-01

    Inspection and repair of the tower structure and lift, and instrument calibration, have been done. Wireless data transmission to MIPS (Meteorological Information Processing System) has been done after collection in the DAS, where environmental assessment can be done by the developed simulation programs in both cases of normal operation and emergency. Wind direction, wind speed, temperature and humidity at 67 m, 27 m, and 10 m height, and temperature, humidity, atmospheric pressure, solar radiation, precipitation, and visibility at the surface, have been measured and analyzed with statistical methods. At the site, the prevailing wind directions were SW in spring and summer, and N and NW in the autumn and winter seasons. Calm conditions accounted for 13.6% at 67 m, 24.5% at 27 m, and 40.8% at 10 m height. 4 figs, 9 tabs, 6 refs. (Author)

  1. Overall analysis of meteorological information in the Daeduk nuclear complex

    Energy Technology Data Exchange (ETDEWEB)

    Han, Moon Hee; Lee, Young Bok; Kim, Eun Han; Seo, Kyung Seok; Hwang, Wan Tae [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-12-01

    Inspection and repair of the tower structure and lift, and instrument calibration, have been done. Wireless data transmission to MIPS (Meteorological Information Processing System) has been done after collection in the DAS, where environmental assessment can be done by the developed simulation programs in both cases of normal operation and emergency. Wind direction, wind speed, temperature and humidity at 67 m, 27 m, and 10 m height, and temperature, humidity, atmospheric pressure, solar radiation, precipitation, and visibility at the surface, have been measured and analyzed with statistical methods. At the site, the prevailing wind directions were SW in spring and summer, and NNW in the winter season. Calm conditions accounted for 28.6% at 67 m, 20.5% at 27 m, and 39.2% at 10 m height. 9 tabs., 4 figs., 6 refs. (Author).

  3. Overall analysis of meteorological information in the daeduk nuclear complex

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byung Woo; Lee, Young Bok; Han, Moon Hee; Kim, Eun Han; Suh, Kyung Suk; Hwang, Won Tae; Hong, Suk Boong [Korea Atomic Energy Res. Inst., Taejon (Korea, Republic of)

    1992-12-01

    Troubleshooting of the tower structure, sensor installation, earthing, and cabling has been done, together with an integrated field test, establishment of the data acquisition system, and instrument calibration, since the completion of the main tower construction this year. A procedure guide was also prepared for effective management, covering instrument operation, calibration and repair. Real measurements were taken during the two months from this October, after full integration of the equipment. The analysis of the measured data, which well represented the seasonal and regional characteristics of the site, showed the occurrence of a nocturnal inversion layer, fogging, and frequently stable atmospheric conditions. Wireless data transmission to MIPS (Meteorological Information Processing System) has been done after collection in the DAS (data acquisition system), where environmental assessment can be done by the developed simulation programs in both cases of normal operation and emergency. (Author).

  4. Overall analysis of meteorological information in the Daeduk nuclear complex

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byung Woo; Lee, Young Bok; Han, Moon Hee; Kim, Eun Han; Suh, Kyung Suk; Hwang, Won Tae [Korea Atomic Energy Res. Inst., Taejon (Korea, Republic of)

    1994-01-01

    Inspection and repair of the tower structure and lift, and instrument calibration, have been done, with DAS (data acquisition system) updating. Wind direction, wind speed, temperature and humidity at 67 m, 27 m, and 10 m height, and temperature, humidity, atmospheric pressure, solar radiation, precipitation, and visibility at the surface, have been measured and analyzed with statistical methods. Wireless data transmission to MIPS (Meteorological Information Processing System) has been done after collection in the DAS, where environmental assessment can be done by the developed simulation programs in both cases of normal operation and emergency. The meteorological data resulting from this project have been used in the reports 'Environmental Impact Assessment of the Korean Multi-purpose Research Reactor' and 'Site Selection of Meteorological Tower and Environment Impact Assessment of the Cooling Tower of the Korean Multi-purpose Research Reactor'. (Author).

  5. PUBLIC RELATIONS AS AN INFORMATION PROCESS PHENOMENON

    Directory of Open Access Journals (Sweden)

    TKACH L. M.

    2016-06-01

    Formulation of the problem. When public relations are examined as a phenomenon of information management, we deal with questions of the content of knowledge and the nature of the relationship of PR with its environment, of the ability to manage people's perception of and attitude toward events in the environment, and of ensuring the priority of information over other resources. Goal. To investigate the concept of "public relations" as treated by foreign and domestic experts; to consider the typology of the public and the "laws" of public opinion; to define the basic principles according to which relations with the public should be built; and to identify PR activity as a kind of social communication. Conclusions. Public relations built on advanced information and communication technologies create fundamentally new opportunities for information control and influence on public consciousness.

  6. Entanglement and optimal quantum information processing

    International Nuclear Information System (INIS)

    Siomau, Michael

    2011-01-01

    Today we are standing on the verge of a new, enigmatic era of quantum technologies. In spite of the significant progress that has been achieved over the last three decades in the experimental generation and manipulation, as well as in the theoretical description of the evolution, of single quantum systems, there are many open problems in understanding the behavior and properties of complex multiparticle quantum systems. In this thesis, we investigate theoretically a number of problems related to the description of entanglement - the nonlocal feature of complex quantum systems - of multiparticle states of finite-dimensional quantum systems. We also consider optimal ways of manipulating such systems. The focus is, in particular, on optimal quantum transformations that provide a desired operation independently of the initial state of the given system. The first part of this thesis is devoted to the detailed analysis of the evolution of entanglement of complex quantum systems subjected to general non-unitary dynamics. In the second part of the thesis we construct several optimal state-independent transformations, analyze their properties and suggest their applications in quantum communication and quantum computing. (orig.)

  7. Crystallization process of a three-dimensional complex plasma

    Science.gov (United States)

    Steinmüller, Benjamin; Dietz, Christopher; Kretschmer, Michael; Thoma, Markus H.

    2018-05-01

    Characteristic timescales and length scales for phase transitions of real materials are in ranges where a direct visualization is unfeasible. Therefore, model systems can be useful. Here, the crystallization process of a three-dimensional complex plasma is considered under gravity conditions, where the system extends to a large extent into the bulk plasma. Time-resolved measurements exhibit the process down to the single-particle level. Primary clusters, consisting of particles in the solid state, grow vertically and, secondarily, horizontally. The box-counting method shows a fractal dimension of df≈2.72 for the clusters. This value suggests that the formation process is a combination of local epitaxial and diffusion-limited growth. The particle density and the interparticle distance to the nearest neighbor remain constant within the clusters during crystallization. All results are in good agreement with former observations of a single-particle layer.
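
    The box-counting estimate of the fractal dimension d_f can be sketched generically (this is not the authors' code; the scales and the flat-plane test set below are illustrative): cover the point set with boxes of side eps and fit the slope of log N(eps) against log(1/eps).

    ```python
    import math

    def box_counting_dimension(points, scales=(2, 4, 8, 16)):
        """Estimate the box-counting dimension of a 3-D point set in the unit
        cube: slope of log N(eps) versus log(1/eps), with eps = 1/k."""
        xs, ys = [], []
        for k in scales:
            boxes = {(int(x * k), int(y * k), int(z * k)) for x, y, z in points}
            xs.append(math.log(k))            # log(1/eps)
            ys.append(math.log(len(boxes)))   # log N(eps)
        # least-squares slope of ys against xs
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                / sum((x - mx) ** 2 for x in xs))

    # Sanity check: points filling a flat plane (z = 0) should give d close to 2
    plane = [(i / 100, j / 100, 0.0) for i in range(100) for j in range(100)]
    print(box_counting_dimension(plane))
    ```

    Applied to the measured cluster coordinates, the same fit would yield a value such as the d_f of roughly 2.72 reported in the abstract.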

  8. Information and analytical data system on radioecological impact of the Russian nuclear complex

    International Nuclear Information System (INIS)

    Iskra, A. A.; Serezhnikov, D. A.

    2006-01-01

    The information and analytical system contains data on enterprises of the Russian nuclear complex, from the mining and processing of uranium ores to the processing of spent nuclear fuel (SNF) and ionizing radiation sources (IRS). Radioecological information is presented on radiation-hazardous civil-mission facilities of the Federal Agency for Atomic Energy (Rosatom): underground leaching sites, radioactive waste (RW) storage facilities, tailing dumps, burials, reactors and critical facilities, etc. Radioecological impact is examined using information and regulatory-methodical documents of the Federal Agency on Hydrometeorology and Environmental Monitoring, the Federal Agency for Atomic Energy, the Federal Agency on Ecological, Technological and Atomic Control, and the Federal Agency on Geodesy and Cartography, concerning: radionuclide discharges from the enterprises; radionuclide releases from the enterprises under routine and accidental conditions; contaminated lands; and radioecological consequences of RW dumped in the Arctic and Far-East seas. The report is accompanied by a demonstration of the operating database

  9. Modelling of the quenching process in complex superconducting magnet systems

    International Nuclear Information System (INIS)

    Hagedorn, D.; Rodriguez-Mateos, F.

    1992-01-01

    This paper reports that the superconducting twin-bore dipole magnet for the proposed Large Hadron Collider (LHC) at CERN shows a complex winding structure consisting of eight compact layers, each of them electromagnetically and thermally coupled with the others. This magnet is only one part of an electrical circuit; test and operation conditions are characterized by different circuits. In order to study the quenching process in this complex system, design adequate protection schemes, and provide a basis for the dimensioning of protection devices such as heaters, current breakers and dump resistors, a general simulation tool called QUABER has been developed using the analog system analysis program SABER. A complete set of electro-thermal models has been created for the propagation of normal regions. Any network extension or modification is easy to implement without rewriting the whole set of differential equations

  10. Improved motion contrast and processing efficiency in OCT angiography using complex-correlation algorithm

    International Nuclear Information System (INIS)

    Guo, Li; Li, Pei; Pan, Cong; Cheng, Yuxuan; Ding, Zhihua; Li, Peng; Liao, Rujia; Hu, Weiwei; Chen, Zhong

    2016-01-01

    The complex-based OCT angiography (Angio-OCT) offers high motion contrast by combining both the intensity and phase information. However, due to involuntary bulk tissue motions, complex-valued OCT raw data are processed sequentially with different algorithms for correcting bulk image shifts (BISs), compensating global phase fluctuations (GPFs) and extracting flow signals. Such a complicated procedure results in massive computational load. To mitigate such a problem, in this work, we present an inter-frame complex-correlation (CC) algorithm. The CC algorithm is suitable for parallel processing of both flow signal extraction and BIS correction, and it does not need GPF compensation. This method provides high processing efficiency and shows superiority in motion contrast. The feasibility and performance of the proposed CC algorithm is demonstrated using both flow phantom and live animal experiments. (paper)
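
    The abstract's claim that the complex-correlation (CC) approach needs no global-phase-fluctuation compensation can be illustrated with a minimal sketch: the magnitude of a normalized inter-frame complex correlation is invariant under a global phase shift (bulk motion) but drops under random phase decorrelation (flow). The estimator below is a generic normalized correlation over whole frames, not necessarily the paper's exact windowed version:

    ```python
    import cmath
    import random

    def complex_correlation(frame_a, frame_b):
        """Magnitude of the normalized inter-frame complex correlation of two
        OCT frames (sequences of complex samples). Static tissue gives a value
        near 1; flow decorrelates the frames, so 1 - |CC| can serve as the
        flow signal."""
        num = sum(a * b.conjugate() for a, b in zip(frame_a, frame_b))
        den = (sum(abs(a) ** 2 for a in frame_a)
               * sum(abs(b) ** 2 for b in frame_b)) ** 0.5
        return abs(num) / den

    rng = random.Random(0)
    static = [cmath.rect(1.0, rng.uniform(-3, 3)) for _ in range(256)]
    # A global phase shift (bulk motion) leaves |CC| unchanged...
    shifted = [s * cmath.rect(1.0, 1.2) for s in static]
    # ...whereas random phase decorrelation (flow) suppresses it.
    flow = [cmath.rect(1.0, rng.uniform(-3, 3)) for _ in range(256)]

    print(complex_correlation(static, shifted))  # ~1.0
    print(complex_correlation(static, flow))     # well below 1
    ```

    Because the global phase factors out of |CC|, no separate GPF-compensation step is needed, which is the processing-efficiency point the abstract makes.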

  11. On the fragmentation of process information : challenges, solutions, and outlook

    NARCIS (Netherlands)

    Aa, van der J.H.; Leopold, H.; Mannhardt, F.; Reijers, H.A.; Gaaloul, K.; Schmidt, R.; Nurcan, S.; Guerreiro, S.; Ma, Q.

    2015-01-01

    An organization’s knowledge on its business processes represents valuable corporate knowledge because it can be used to enhance the performance of these processes. In many organizations, documentation of process knowledge is scattered around various process information sources. Such information

  12. 40 CFR 68.65 - Process safety information.

    Science.gov (United States)

    2010-07-01

    40 CFR § 68.65, Process safety information (Protection of Environment, Chemical Accident Prevention Provisions, Program 3 Prevention Program; 2010-07-01): ... compilation of written process safety information before conducting any process hazard analysis required by...

  13. Development of technical information processing systems

    International Nuclear Information System (INIS)

    Lee, Ji Ho; Kim, Tae Whan; Kim, Sun Ja; Kim, Young Min; Choi, Kwang; Oh, Joung Hun; Choung, Hyun Suk; Keum, Jong Yong; Yoo, An Na; Harn, Deuck Haing; Choun, Young Chun

    1993-12-01

    The major goal of this project is to develop a more efficient information management system by connecting the KAERI serials database, which enables users to access it from their own laboratory facilities through KAERI-NET. The importance of this project lies in making the serials information of KAERI easily accessible to users as a valuable resource for R and D activities. The results of the project are as follows. 1) Development of the serials database and retrieval system enabled us to access the serials holding information through KAERI-NET. 2) The database construction establishes a foundation for the management of 1,600 serials held in KAERI. 3) The system can be applied not only to KAERI but also to similar medium-level libraries. (Author)

  14. Complex Ornament Machining Process on a CNC Router

    Directory of Open Access Journals (Sweden)

    Camelia COŞEREANU

    2014-03-01

    The paper investigates the CNC routering possibilities for three species of wood, namely ash (Fraxinus excelsior), lime wood (Tilia cordata) and fir wood (Abies alba), in order to obtain accurate surfaces for Art Nouveau sculptured ornaments. Given the complexity of the CNC tool path for obtaining the wavy shapes of Art Nouveau decorations, the choice of processing parameters for each species of wood requires laborious research to correlate these parameters. Two Art Nouveau ornaments are proposed for the investigation. They are CNC routered using two types of cutting tools. The processing parameters, namely spindle speed, feed speed and depth of cut, were the three variables of the machining process for the three species of wood, and were combined so as to provide good surface finish as a quality attribute. In total, forty-six combinations of the processing parameters were applied in CNC routering the samples made of the three species of wood. Finally, an optimum combination of the processing parameters is recommended for each species of wood.

  15. Internet-based intelligent information processing systems

    CERN Document Server

    Tonfoni, G; Ichalkaranje, N S

    2003-01-01

    The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap

  16. Information visualization for the Structural Complexity Management Approach

    OpenAIRE

    Maurer, Maik;Braun, Thomas;Lindemann, Udo

    2017-01-01

    The handling of complexity poses an important challenge and a success factor for product design. A considerable percentage of complexity results from dependencies between system elements – as adaptations to single system elements can cause far-reaching consequences. The Structural Complexity Management (SCM) approach provides a five-step procedure that supports users in the identification, acquisition, analysis and optimization of system dependencies. The approach covers the handling of multi...

  17. Uncertainty Reduction for Stochastic Processes on Complex Networks

    Science.gov (United States)

    Radicchi, Filippo; Castellano, Claudio

    2018-05-01

    Many real-world systems are characterized by stochastic dynamical rules where a complex network of interactions among individual elements probabilistically determines their state. Even with full knowledge of the network structure and of the stochastic rules, the ability to predict system configurations is generally characterized by a large uncertainty. Selecting a fraction of the nodes and observing their state may help to reduce the uncertainty about the unobserved nodes. However, choosing these points of observation in an optimal way is a highly nontrivial task, depending on the nature of the stochastic process and on the structure of the underlying interaction pattern. In this paper, we introduce a computationally efficient algorithm to determine quasioptimal solutions to the problem. The method leverages network sparsity to reduce computational complexity from exponential to almost quadratic, thus allowing the straightforward application of the method to mid-to-large-size systems. Although the method is exact only for equilibrium stochastic processes defined on trees, it turns out to be effective also for out-of-equilibrium processes on sparse loopy networks.

  18. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e. the problems of single-channel approaches, we first develop an information processing model having multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, an information-theoretic approach

  19. Moral Judgment as Information Processing: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Steve eGuglielmo

    2015-10-01

    Full Text Available This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.

  20. Processing data base information having nonwhite noise

    Science.gov (United States)

    Gross, Kenneth C.; Morreale, Patricia

    1995-01-01

    A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can process either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency-domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
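The pipeline this record describes (difference function → Fourier composite → residual → statistical test) can be sketched in a few lines. The signals and the number of retained Fourier modes below are hypothetical stand-ins, and the patent's sequential probability ratio test is replaced here by a simple lag-1 autocorrelation whiteness check:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two data sets tied to one physical variable: a "real" sensor signal and
# an ARMA-style surrogate (both hypothetical stand-ins for plant data).
n = 1024
real = np.sin(0.05 * np.arange(n)) + 0.1 * rng.standard_normal(n)
surrogate = np.sin(0.05 * np.arange(n)) + 0.1 * rng.standard_normal(n)

# Difference function: pointwise arithmetic difference over time.
diff = real - surrogate

# Frequency-domain transform; keep the dominant Fourier modes as the
# "composite function" capturing serially correlated (nonwhite) structure.
spectrum = np.fft.rfft(diff)
power = np.abs(spectrum) ** 2
keep = power >= np.sort(power)[-8]          # 8 strongest modes (arbitrary)
composite = np.fft.irfft(np.where(keep, spectrum, 0), n=n)

# Residual function: difference minus composite, ideally white noise.
residual = diff - composite

# Whiteness check via lag-1 autocorrelation (a crude proxy for the
# statistical probability ratio test applied in the patent).
r1 = np.corrcoef(residual[:-1], residual[1:])[0, 1]
print(f"lag-1 autocorrelation of residual: {r1:.3f}")
```

A residual autocorrelation near zero indicates the remaining signal is plausibly white, so the data base can be treated as validated in this toy setting.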

  1. Information Processing Approaches to Cognitive Development

    Science.gov (United States)

    1988-07-01

    Craik, F.I.M., & Lockhart, R.S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11...task at both levels of performance, then one would, in both cases, postulate systems that had the ability to process symbols at the microscopic level...late 1960s and early 70s (cf. Atkinson & Shiffrin, 1968; Craik & Lockhart, 1972; Norman, Rumelhart, & LNR, 1975). This architecture is comprised of several

  2. Numerical support, information processing and attitude change

    OpenAIRE

    de Dreu, C.K.W.; de Vries, N.K.

    1993-01-01

    In two experiments we studied the prediction that majority support induces stronger convergent processing than minority support for a persuasive message, the more so when recipients are explicitly forced to pay attention to the source's point of view; this in turn affects the amount of attitude change on related issues. Convergent processing is the systematic elaboration of the source's position, but with a stronger focus on verification and justification rather than falsification. In Exp 1 wi...

  3. Event-related potentials and information processing

    NARCIS (Netherlands)

    Brookhuis, Karel Anton

    1989-01-01

    We set out to test the hypotheses generated by Shiffrin & Schneider’s model of information processing with our new tool, the ERP. The experiments were devised to test hypotheses that were originally based on performance data alone, i.e. reaction time and errors. Although the overt behaviour was

  4. Visual Motion Perception and Visual Information Processing

    Science.gov (United States)

    1993-12-31

    traditionally called the "span of apprehension" (Külpe, 1904; Wundt, 1899). However, a partial-report procedure demonstrates...Gehrig, P. (1992). On the time course of perceptual information that results from a...Wundt, W. (1899). Zur Kritik tachistoskopischer Versuche [A critique of tachistoscopic experiments]

  5. Information theory and signal transduction systems: from molecular information processing to network inference.

    Science.gov (United States)

    Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H

    2014-11-01

    Sensing and responding to the environment are two essential functions that all biological organisms must master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design.
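A minimal illustration of the kind of calculation this review covers: estimating the mutual information between a noisy input-output pair of a signalling channel. The histogram (plug-in) estimator, the bin count, and the simulated ligand/response data are all assumptions made for illustration, not taken from the paper:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate of I(X; Y) in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
ligand = rng.uniform(0, 1, 50_000)                      # input concentration
response = ligand + 0.2 * rng.standard_normal(50_000)   # noisy readout

print(f"I(input; output) ≈ {mutual_information(ligand, response):.2f} bits")
```

With an independent output the same estimator returns a value near zero (up to a small positive bias of the plug-in estimate), which is how channel "information processing efficiency" comparisons are typically grounded.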

  6. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model having multiple stages, which contains information flow. Then the uncertainty of the information is quantified using Conant's model, an information-theoretic approach. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload

  7. Photonic single nonlinear-delay dynamical node for information processing

    Science.gov (United States)

    Ortín, Silvia; San-Martín, Daniel; Pesquera, Luis; Gutiérrez, José Manuel

    2012-06-01

    An electro-optical system with a delay loop based on semiconductor lasers is investigated for information processing by performing numerical simulations. This system can replace a complex network of many nonlinear elements for the implementation of Reservoir Computing. We show that a single nonlinear-delay dynamical system has the basic properties required to perform as a reservoir: short-term memory and the separation property. The computing performance of this system is evaluated for two prediction tasks: the Lorenz chaotic time series and the nonlinear auto-regressive moving average (NARMA) model. We sweep the parameters of the system to find the best performance. The results achieved for the Lorenz and the NARMA-10 tasks are comparable to those obtained by other machine learning methods.
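The NARMA-10 benchmark mentioned here has a standard published recurrence, sketched below; the resulting series is the target that a reservoir's linear readout would be trained to reproduce from the input stream. The sample size and seed are arbitrary choices:

```python
import numpy as np

def narma10(n_steps, seed=0):
    """Generate the NARMA-10 benchmark series used to evaluate reservoirs.

    y[t+1] = 0.3*y[t] + 0.05*y[t]*sum(y[t-9:t+1]) + 1.5*u[t-9]*u[t] + 0.1,
    with inputs u drawn uniformly from [0, 0.5].
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 0.5, n_steps)
    y = np.zeros(n_steps)
    for t in range(9, n_steps - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * y[t - 9:t + 1].sum()
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    return u, y

u, y = narma10(2000)
print(f"NARMA-10 sample: mean={y.mean():.3f}, std={y.std():.3f}")
```

Because y[t+1] depends on the last ten inputs and outputs, predicting it probes exactly the short-term memory property the abstract attributes to the delay system.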

  8. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and the optimal results barely achievable in the manual calibration; thus an automated approach is a must. We will discuss an information theory based metric for evaluation of algorithm adaptive characteristics ("adaptivity criterion") using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of a physical "information restoration" rather than perceived image quality, it helps to reduce the set of the filter parameters to a smaller subset that is easier for a human operator to tune and achieve a better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).

  9. Dystrophic Cardiomyopathy: Complex Pathobiological Processes to Generate Clinical Phenotype

    Directory of Open Access Journals (Sweden)

    Takeshi Tsuda

    2017-09-01

    Full Text Available Duchenne muscular dystrophy (DMD), Becker muscular dystrophy (BMD), and X-linked dilated cardiomyopathy (XL-DCM) constitute a unique clinical entity, the dystrophinopathies, which are due to variable mutations in the dystrophin gene. Dilated cardiomyopathy (DCM) is a common complication of dystrophinopathies, but the onset, progression, and severity of heart disease differ among these subgroups. Extensive molecular genetic studies have been conducted to assess genotype-phenotype correlation in DMD, BMD, and XL-DCM to understand the underlying mechanisms of these diseases, but the results are not always conclusive, suggesting the involvement of complex multi-layered pathological processes that generate the final clinical phenotype. Dystrophin protein is a part of the dystrophin-glycoprotein complex (DGC) that is localized in skeletal muscles, myocardium, smooth muscles, and neuronal tissues. Diversity of cardiac phenotype in dystrophinopathies suggests multiple layers of pathogenetic mechanisms in forming dystrophic cardiomyopathy. In this review article, we review the complex molecular interactions involved in the pathogenesis of dystrophic cardiomyopathy, including primary gene mutations and loss of structural integrity, secondary cellular responses, and certain epigenetic and other factors that modulate gene expression. Involvement of epigenetic gene regulation appears to lead to specific cardiac phenotypes in dystrophic hearts.

  10. Symposium on Information Processing in Organizations.

    Science.gov (United States)

    1982-04-01

    Mental Hygiene. Bourdieu, Pierre (1977). Outline of a Theory of Practice. Richard Nice, trans. Cambridge: Cambridge University Press. Cain, Leo F. and Samuel...hospital posed a unique evacuation problem. When a fire occurs in a hospital, information is typically communicated to doctors, nurses, and other...bore no relation whatsoever to the emergencies they announced, and they differed from institution to institution. Thus doctors, nurses and other staff

  11. Strategic Information Processing from Behavioural Data in Iterated Games

    Directory of Open Access Journals (Sweden)

    Michael S. Harré

    2018-01-01

    Full Text Available Iterated games are an important framework of economic theory and application, at least since the original work of Axelrod’s computational tournaments of the early 1980s. Recent theoretical results have shown that games (the economic context) and game theory (the decision-making process) are both formally equivalent to computational logic gates. Here these results are extended to behavioural data obtained from an experiment in which rhesus monkeys sequentially played thousands of rounds of the “matching pennies” game, an empirical example similar to Axelrod’s tournaments in which algorithms played against one another. The results show that the monkeys exhibit a rich variety of behaviours, both between and within subjects, when playing opponents of varying complexity. Despite earlier suggestions, there is no clear evidence that the win-stay, lose-switch strategy is used; however, there is evidence of non-linear strategy-based interactions between the predictors of future choices. It is also shown that there is consistent evidence, across protocols and across individuals, that the monkeys extract non-Markovian information, i.e., information from more than just the most recent state of the game. This work shows that the use of information theory in game theory can test important hypotheses that would otherwise be more difficult to extract using traditional statistical methods.
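The claim that play carries more than the most recent game state can be illustrated with conditional entropies of increasing history length: if H(next | last 2 moves) falls clearly below H(next | last move), the behaviour carries non-Markovian information. The synthetic order-2 move sequence below is a hypothetical stand-in for the behavioural data, not the monkeys' records:

```python
from collections import Counter
import math
import random

def cond_entropy(seq, k):
    """H(next symbol | previous k symbols), in bits, from empirical counts."""
    ctx = Counter()    # counts of length-k contexts
    joint = Counter()  # counts of context + next symbol
    for i in range(k, len(seq)):
        c = tuple(seq[i - k:i])
        ctx[c] += 1
        joint[c + (seq[i],)] += 1
    n = sum(ctx.values())
    return -sum(p / n * math.log2(p / ctx[full[:k]])
                for full, p in joint.items())

# Hypothetical player whose choice depends on its move two rounds back
# (an order-2, hence non-Markovian-in-the-last-move, rule).
random.seed(2)
moves = [0, 1]
for _ in range(20_000):
    moves.append(moves[-2] if random.random() < 0.9 else 1 - moves[-2])

h1, h2 = cond_entropy(moves, 1), cond_entropy(moves, 2)
print(f"H(next|1 move) = {h1:.3f} bits, H(next|2 moves) = {h2:.3f} bits")
```

Here the one-move history explains almost nothing (entropy near 1 bit) while the two-move history explains a great deal, which is the signature the paper's information-theoretic analysis looks for.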

  12. Multiple-predators-based capture process on complex networks

    International Nuclear Information System (INIS)

    Sharafat, Rajput Ramiz; Pu Cunlai; Li Jie; Chen Rongbin; Xu Zhongqi

    2017-01-01

    The predator/prey (capture) problem is a prototype of many network-related applications. We study the capture process on complex networks by considering multiple predators from multiple sources. In our model, some lions start from multiple sources simultaneously to capture the lamb by biased random walks, which are controlled with a free parameter α. We derive the distribution of the lamb’s lifetime and the expected lifetime ⟨T⟩. Through simulation, we find that the expected lifetime drops substantially with the increasing number of lions. Moreover, we study how the underlying topological structure affects the capture process, and obtain that locating on small-degree nodes is better than on large-degree nodes to prolong the lifetime of the lamb. Dense or homogeneous network structures work against the survival of the lamb. We also discuss how to improve the capture efficiency in our model. (paper)
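A toy version of this capture model is easy to simulate: several lions perform degree-biased random walks (bias exponent α) toward a stationary lamb, and the lamb's lifetime is the first step at which any lion lands on its node. The ring-with-chords graph, lamb position, trial count, and the assumption that the lamb does not move are all illustrative choices, not details from the paper:

```python
import random

def biased_step(node, adj, degree, alpha):
    """Move to a neighbour chosen with probability proportional to degree**alpha."""
    nbrs = adj[node]
    weights = [degree[v] ** alpha for v in nbrs]
    return random.choices(nbrs, weights=weights)[0]

def capture_time(adj, lamb, n_lions, alpha=0.0, max_t=10_000):
    """Steps until any lion first lands on the lamb's (stationary) node."""
    degree = {v: len(nb) for v, nb in adj.items()}
    lions = [random.choice(list(adj)) for _ in range(n_lions)]
    for t in range(1, max_t + 1):
        lions = [biased_step(v, adj, degree, alpha) for v in lions]
        if lamb in lions:
            return t
    return max_t

# Hypothetical example graph: a 20-node ring with two extra chords.
adj = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
for a, b in [(0, 10), (5, 15)]:
    adj[a].append(b)
    adj[b].append(a)

random.seed(3)
avg = lambda k: sum(capture_time(adj, lamb=7, n_lions=k) for _ in range(300)) / 300
print(f"mean lifetime with 1 lion: {avg(1):.0f} steps; with 4 lions: {avg(4):.0f} steps")
```

In this toy setting the mean lifetime drops as the number of lions grows, in line with the paper's central finding.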

  13. An information theory based approach for quantitative evaluation of man-machine interface complexity

    International Nuclear Information System (INIS)

    Kang, Hyun Gook

    1999-02-01

    In complex and high-risk work conditions, such as in nuclear power plants, human understanding of the plant is highly cognitive and thus largely dependent on the effectiveness of the man-machine interface system. In order to provide more effective and reliable operating conditions for future nuclear power plants, developing more credible and easy-to-use evaluation methods will afford great help in designing interface systems in a more efficient manner. In this study, in order to analyze the human-machine interactions, I propose the Human-processor Communication (HPC) model, which is based on the information flow concept. It identifies the information flow around a human-processor. Information flow has two aspects: appearance and content. Based on the HPC model, I propose two kinds of measures for evaluating a user interface from the viewpoint of these two aspects of information flow. They measure the communicative complexity of each aspect. For the evaluation of the aspect of appearance, I propose three complexity measures: Operation Complexity, Transition Complexity, and Screen Complexity. Each of these measures has its own physical meaning. Two experiments carried out in this work support the utility of these measures. The result of the quiz game experiment shows that as the complexity of the task context increases, the usage of the interface system becomes more complex. The experimental results of the three example systems (digital view, LDP style view and hierarchy view) show the utility of the proposed complexity measures. For the evaluation of the aspect of content, I propose the degree of informational coincidence, R(K, P), as a measure for the usefulness of an alarm-processing system. It is designed to perform user-oriented evaluation based on the informational entropy concept.
    It will be especially useful in the early design phase because designers can estimate the usefulness of an alarm system by short calculations instead

  14. An information theory based approach for quantitative evaluation of man-machine interface complexity

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hyun Gook

    1999-02-15

    In complex and high-risk work conditions, such as in nuclear power plants, human understanding of the plant is highly cognitive and thus largely dependent on the effectiveness of the man-machine interface system. In order to provide more effective and reliable operating conditions for future nuclear power plants, developing more credible and easy-to-use evaluation methods will afford great help in designing interface systems in a more efficient manner. In this study, in order to analyze the human-machine interactions, I propose the Human-processor Communication (HPC) model, which is based on the information flow concept. It identifies the information flow around a human-processor. Information flow has two aspects: appearance and content. Based on the HPC model, I propose two kinds of measures for evaluating a user interface from the viewpoint of these two aspects of information flow. They measure the communicative complexity of each aspect. For the evaluation of the aspect of appearance, I propose three complexity measures: Operation Complexity, Transition Complexity, and Screen Complexity. Each of these measures has its own physical meaning. Two experiments carried out in this work support the utility of these measures. The result of the quiz game experiment shows that as the complexity of the task context increases, the usage of the interface system becomes more complex. The experimental results of the three example systems (digital view, LDP style view and hierarchy view) show the utility of the proposed complexity measures. For the evaluation of the aspect of content, I propose the degree of informational coincidence, R(K, P), as a measure for the usefulness of an alarm-processing system. It is designed to perform user-oriented evaluation based on the informational entropy concept.
    It will be especially useful in the early design phase because designers can estimate the usefulness of an alarm system by short calculations instead

  15. Influence Business Process On The Quality Of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Meiryani

    2015-01-01

    Full Text Available Abstract The purpose of this study was to determine the influence of business processes on the quality of the accounting information system. The study was theoretical research that considered the role of business processes in the quality of the accounting information system, using secondary data collection. The results showed that business processes have a significant effect on the quality of accounting information systems.

  16. Tailored information for cancer patients on the Internet: effects of visual cues and language complexity on information recall and satisfaction.

    NARCIS (Netherlands)

    Weert, J.C.M. van; Noort, G. van; Bol, N.; Dijk, L. van; Tates, K.; Jansen, J.

    2011-01-01

    Objective: This study was designed to investigate the effects of visual cues and language complexity on satisfaction and information recall using a personalised website for lung cancer patients. In addition, age effects were investigated. Methods: An experiment using a 2 (complex vs. non-complex

  17. Tailored information for cancer patients on the Internet: effects of visual cues and language complexity on information recall and satisfaction

    NARCIS (Netherlands)

    van Weert, J.C.M.; van Noort, G.; Bol, N.; van Dijk, L.; Tates, K.; Jansen, J.

    2011-01-01

    Objective This study was designed to investigate the effects of visual cues and language complexity on satisfaction and information recall using a personalised website for lung cancer patients. In addition, age effects were investigated. Methods An experiment using a 2 (complex vs. non-complex

  18. A Study on Improving Information Processing Abilities Based on PBL

    Science.gov (United States)

    Kim, Du Gyu; Lee, JaeMu

    2014-01-01

    This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…

  19. Passive Polarimetric Information Processing for Target Classification

    Science.gov (United States)

    Sadjadi, Firooz; Sadjadi, Farzad

    Polarimetric sensing is an area of active research in a variety of applications. In particular, the use of polarization diversity has been shown to improve performance in automatic target detection and recognition. Within the diverse scope of polarimetric sensing, the field of passive polarimetric sensing is of particular interest. This chapter presents several new methods for gathering information using such passive techniques. One method extracts three-dimensional (3D) information and surface properties using one or more sensors. Another method extracts scene-specific algebraic expressions that remain unchanged under polarization transformations (such as along the transmission path to the sensor).

  20. Information paths within the new product development process

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    2007-01-01

    collection platform to obtain measurements from within the NPD process. 42 large, international companies participated in the data collecting simulation. Results revealed five different information paths that were not connecting all stages of the NPD process. Moreover, results show that the front-end is not driving the information acquisition through the stages of the NPD process, and that environmental turbulence disconnects stages from the information paths in the NPD process. This implies that information is at the same time a key to success and a key to entrapment in the NPD process.

  1. Default mode contributions to automated information processing.

    Science.gov (United States)

    Vatansever, Deniz; Menon, David K; Stamatakis, Emmanuel A

    2017-11-28

    Concurrent with mental processes that require rigorous computation and control, a series of automated decisions and actions govern our daily lives, providing efficient and adaptive responses to environmental demands. Using a cognitive flexibility task, we show that a set of brain regions collectively known as the default mode network plays a crucial role in such "autopilot" behavior, i.e., when rapidly selecting appropriate responses under predictable behavioral contexts. While applying learned rules, the default mode network shows both greater activity and connectivity. Furthermore, functional interactions between this network and hippocampal and parahippocampal areas as well as primary visual cortex correlate with the speed of accurate responses. These findings indicate a memory-based "autopilot role" for the default mode network, which may have important implications for our current understanding of healthy and adaptive brain processing.

  2. Environmental information document defense waste processing facility

    International Nuclear Information System (INIS)

    1981-07-01

    This report documents the impact analysis of a proposed Defense Waste Processing Facility (DWPF) for immobilizing high-level waste currently being stored on an interim basis at the Savannah River Plant (SRP). The DWPF will process the waste into a form suitable for shipment to and disposal in a federal repository. The DWPF will convert the high-level waste into: a leach-resistant form containing above 99.9% of all the radioactivity, and a residue of slightly contaminated salt. The document describes the SRP site and environs, including population, land and water uses; surface and subsurface soils and waters; meteorology; and ecology. A conceptual integrated facility for concurrently producing glass waste and saltcrete is described, and the environmental effects of constructing and operating the facility are presented. Alternative sites and waste disposal options are addressed. Also environmental consultations and permits are discussed

  3. Musical beauty and information compression: Complex to the ear but simple to the mind?

    Science.gov (United States)

    Hudson, Nicholas J

    2011-01-20

    The biological origin of music, its universal appeal across human cultures and the cause of its beauty remain mysteries. For example, why is Ludwig van Beethoven considered a musical genius but Kylie Minogue is not? Possible answers to these questions will be framed in the context of Information Theory. The entire life-long sensory data stream of a human is enormous. The adaptive solution to this problem of scale is information compression, thought to have evolved to better handle, interpret and store sensory data. In modern humans highly sophisticated information compression is clearly manifest in philosophical, mathematical and scientific insights. For example, the Laws of Physics explain apparently complex observations with simple rules. Deep cognitive insights are reported as intrinsically satisfying, implying that at some point in evolution, the practice of successful information compression became linked to the physiological reward system. I hypothesise that the establishment of this "compression and pleasure" connection paved the way for musical appreciation, which subsequently became free (perhaps even inevitable) to emerge once audio compression had become intrinsically pleasurable in its own right. For a range of compositions, empirically determine the relationship between the listener's pleasure and "lossless" audio compression. I hypothesise that enduring musical masterpieces will possess an interesting objective property: despite apparent complexity, they will also exhibit high compressibility. Artistic masterpieces and deep Scientific insights share the common process of data compression. Musical appreciation is a parasite on a much deeper information processing capacity. The coalescence of mathematical and musical talent in exceptional individuals has a parsimonious explanation. Musical geniuses are skilled in composing music that appears highly complex to the ear yet transpires to be highly simple to the mind.
The listener's pleasure is influenced
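The proposed test, that masterpieces sound complex yet compress well, reduces to computing a lossless compression ratio. A crude sketch using DEFLATE on byte streams (hypothetical stand-ins for audio data, not actual recordings) shows how a structured signal separates from an unstructured one:

```python
import random
import zlib

def compressibility(data: bytes) -> float:
    """Fraction of bytes removed by lossless (DEFLATE) compression."""
    return 1.0 - len(zlib.compress(data, level=9)) / len(data)

# Hypothetical audio stand-ins: a repeated motif (crudely mimicking musical
# structure) versus pure noise of the same length.
random.seed(4)
motif = bytes(random.randrange(256) for _ in range(64))
patterned = (motif * 512)[:32_768]
noise = bytes(random.randrange(256) for _ in range(32_768))

print(f"patterned stream: {compressibility(patterned):.1%} compressible")
print(f"random stream:    {compressibility(noise):.1%} compressible")
```

On real audio one would compare a perceptual complexity rating against the lossless compression ratio of the waveform, which is the empirical relationship the hypothesis asks to be measured.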

  4. Musical beauty and information compression: Complex to the ear but simple to the mind?

    Directory of Open Access Journals (Sweden)

    Hudson Nicholas J

    2011-01-01

    Full Text Available Abstract Background The biological origin of music, its universal appeal across human cultures and the cause of its beauty remain mysteries. For example, why is Ludwig van Beethoven considered a musical genius but Kylie Minogue is not? Possible answers to these questions will be framed in the context of Information Theory. Presentation of the Hypothesis The entire life-long sensory data stream of a human is enormous. The adaptive solution to this problem of scale is information compression, thought to have evolved to better handle, interpret and store sensory data. In modern humans highly sophisticated information compression is clearly manifest in philosophical, mathematical and scientific insights. For example, the Laws of Physics explain apparently complex observations with simple rules. Deep cognitive insights are reported as intrinsically satisfying, implying that at some point in evolution, the practice of successful information compression became linked to the physiological reward system. I hypothesise that the establishment of this "compression and pleasure" connection paved the way for musical appreciation, which subsequently became free (perhaps even inevitable) to emerge once audio compression had become intrinsically pleasurable in its own right. Testing the Hypothesis For a range of compositions, empirically determine the relationship between the listener's pleasure and "lossless" audio compression. I hypothesise that enduring musical masterpieces will possess an interesting objective property: despite apparent complexity, they will also exhibit high compressibility. Implications of the Hypothesis Artistic masterpieces and deep Scientific insights share the common process of data compression. Musical appreciation is a parasite on a much deeper information processing capacity. The coalescence of mathematical and musical talent in exceptional individuals has a parsimonious explanation. Musical geniuses are skilled in composing music

  5. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  6. Application of Fisher Information to Complex Dynamic Systems (Tucson)

    Science.gov (United States)

    Fisher information was developed by the statistician Ronald Fisher as a measure of the information obtainable from data being used to fit a related parameter. Starting from the work of Ronald Fisher [1] and B. Roy Frieden [2], we have developed Fisher information as a measure of order ...
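    In the Frieden tradition, Fisher information serves as an order metric: sharply peaked (ordered) distributions score high, uniform (disordered) ones score zero. A discrete amplitude-form estimate, sketched under the assumption of equally spaced bins (the exact convention used in this work may differ):

```python
import math

def fisher_information(p, dx=1.0):
    """Discrete Fisher information of a distribution p over equally
    spaced bins, using the amplitude form I = 4 * sum (dq)^2 / dx
    with q_i = sqrt(p_i)."""
    assert abs(sum(p) - 1.0) < 1e-9, "p must be normalized"
    q = [math.sqrt(x) for x in p]
    return 4.0 * sum((q[i + 1] - q[i]) ** 2 for i in range(len(q) - 1)) / dx

uniform = [0.25, 0.25, 0.25, 0.25]  # maximal disorder
peaked = [0.0, 1.0, 0.0, 0.0]       # sharply ordered state

assert fisher_information(uniform) == 0.0  # flat amplitude, zero information
assert fisher_information(peaked) == 8.0   # steep gradients, high information
```

    Tracking this quantity over a system's state distribution is one way such an order measure can be applied to dynamic systems.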

  8. Quantum-Classical Hybrid for Information Processing

    Science.gov (United States)

    Zak, Michail

    2011-01-01

    Based upon quantum-inspired entanglement in quantum-classical hybrids, a simple algorithm for instantaneous transmissions of non-intentional messages (chosen at random) to remote distances is proposed. The idea is to implement instantaneous transmission of conditional information on remote distances via a quantum-classical hybrid that preserves superposition of random solutions, while allowing one to measure its state variables using classical methods. Such a hybrid system reinforces the advantages, and minimizes the limitations, of both quantum and classical characteristics. Consider n observers, and assume that each of them gets a copy of the system and runs it separately. Although they run identical systems, the outcomes of even synchronized runs may be different because the solutions of these systems are random. However, the global constraint must be satisfied. Therefore, if observer #1 (the sender) made a measurement of the acceleration v(sub 1) at t = T, then the receiver, by measuring the corresponding acceleration v(sub 1) at t = T, may get a wrong value because the accelerations are random, and only their ratios are deterministic. Obviously, the transmission of this knowledge is instantaneous as soon as the measurements have been performed. In addition, the distance between the observers is irrelevant because the x-coordinate does not enter the governing equations. However, the Shannon information transmitted is zero. None of the senders can control the outcomes of their measurements because they are random. The senders cannot transmit intentional messages. Nevertheless, based on the transmitted knowledge, they can coordinate their actions based on conditional information. If observer #1 knows his own measurements, the measurements of the others can be fully determined. It is important to emphasize that the origin of the entanglement of all the observers is the joint probability density that couples their actions. There is no centralized source

  9. The eyes have it: Using eye tracking to inform information processing strategies in multi-attributes choices.

    Science.gov (United States)

    Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke

    2018-04-01

    Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors (whether attributes are defined as "best" or "worst," choice task complexity, and attribute ordering) also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Levels of Information Processing in a Fitts law task (LIPFitts)

    Science.gov (United States)

    Mosier, K. L.; Hart, S. G.

    1986-01-01

    State-of-the-art flight technology has restructured the task of human operators, decreasing the need for physical and sensory resources, and increasing the quantity of cognitive effort required, changing it qualitatively. Recent technological advances have the most potential for impacting a pilot in two areas: performance and mental workload. In an environment in which timing is critical, additional cognitive processing can cause performance decrements, and increase a pilot's perception of the mental workload involved. The effects of stimulus processing demands on motor response performance and subjective mental workload are examined, using different combinations of response selection and target acquisition tasks. The information processing demands of the response selection were varied (e.g., Sternberg memory set tasks, math equations, pattern matching), as was the difficulty of the response execution. Response latency as well as subjective workload ratings varied in accordance with the cognitive complexity of the task. Movement times varied according to the difficulty of the response execution task. Implications in terms of real-world flight situations are discussed.
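    The target-acquisition component of such a task is conventionally scored with Fitts' law, which predicts movement time from an index of difficulty. A minimal sketch using the Shannon formulation (the intercept and slope coefficients below are illustrative assumptions, not values fitted in this study):

```python
import math

def movement_time(distance: float, width: float,
                  a: float = 0.1, b: float = 0.15) -> float:
    """Fitts' law, MT = a + b * ID, with the Shannon formulation
    ID = log2(D/W + 1) in bits. a (s) and b (s/bit) are illustrative."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# Farther and smaller targets carry a higher index of difficulty,
# so predicted movement time grows.
easy = movement_time(distance=100, width=50)  # ID = log2(3)  ~ 1.58 bits
hard = movement_time(distance=400, width=10)  # ID = log2(41) ~ 5.36 bits
assert hard > easy
```

    In the study's design, varying response-execution difficulty corresponds to varying this index while the response-selection task loads cognition separately.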

  11. Supporting risk-informed decisions during business process execution

    NARCIS (Netherlands)

    Conforti, R.; Leoni, de M.; La Rosa, M.; Aalst, van der W.M.P.; Salinesi, C.; Norrie, M.C.; Pastor, O.

    2013-01-01

    This paper proposes a technique that supports process participants in making risk-informed decisions, with the aim to reduce the process risks. Risk reduction involves decreasing the likelihood and severity of a process fault from occurring. Given a process exposed to risks, e.g. a financial process

  12. Summer School Mathematical Foundations of Complex Networked Information Systems

    CERN Document Server

    Fosson, Sophie; Ravazzi, Chiara

    2015-01-01

    Introducing the reader to the mathematics beyond complex networked systems, these lecture notes investigate graph theory, graphical models, and methods from statistical physics. Complex networked systems play a fundamental role in our society, both in everyday life and in scientific research, with applications ranging from physics and biology to economics and finance. The book is self-contained, and requires only an undergraduate mathematical background.

  13. COMPLEX PROCESSING OF CELLULOSE WASTE FROM POULTRY AND SUGAR PRODUCTION

    Directory of Open Access Journals (Sweden)

    E. V. Sklyadnev

    2015-01-01

    Full Text Available Summary. To solve the problem of disposing of the huge volumes of cellulose waste from sugar production (beet pulp) and from poultry farms (poultry manure), the joint use of two methods of thermal waste processing, pyrolysis and gasification, is proposed. The feasibility of applying pyrolysis to these wastes is confirmed by experimental results. Based on the results of laboratory studies of the properties of the by-products obtained from thermal processing of the feedstock, a complex processing scheme is proposed that produces useful products to be sold as marketable goods while meeting the process's own energy needs. The developed flow diagram for integrated processing of these wastes comprises three sections, in which the following are carried out in sequence: pyrolytic decomposition of the feedstock into secondary products in the form of solid, liquid and gas fractions; gasification of the solids to obtain combustible gas; and separation of the liquid fraction by distillation to obtain valuable products. The main equipment in the first section is the pyrolysis reactor and a cascade of condensers; in the second, gasifiers of the layer (fixed-bed) and stream (entrained-flow) types; in the third, one or more distillation columns with the necessary piping. The installation's own power supply is organized by using the heat produced during combustion of the synthesis gas for heating and for the gasification reactor. Heat balance calculations for the developed scheme are presented, supporting the energy efficiency of the proposed disposal process. The work was carried out within the framework of a project that won the Youth Prize Competition of the Government of Voronezh region in support of youth programs in 2014-2015.

  14. Effects of individual popularity on information spreading in complex networks

    Science.gov (United States)

    Gao, Lei; Li, Ruiqi; Shu, Panpan; Wang, Wei; Gao, Hui; Cai, Shimin

    2018-01-01

    In the real world, human activities often exhibit a preferential selection mechanism based on the popularity of individuals. However, this mechanism is seldom taken into account by previous studies of spreading dynamics on networks. Thus in this work, an information spreading model is proposed by considering preferential selection based on individuals' current popularity, which is defined as the number of an individual's cumulative contacts with informed neighbors. A mean-field theory is developed to analyze the spreading model. Through systematically studying the information spreading dynamics on uncorrelated configuration networks as well as real-world networks, we find that the popularity preference has a great impact on information spreading. On the one hand, information spreading is facilitated, i.e., a larger final prevalence of information and a smaller outbreak threshold, if nodes with low popularity are preferentially selected. In this situation, the effective contacts between informed nodes and susceptible nodes are increased, and nodes have almost uniform probabilities of obtaining the information. On the other hand, if nodes with high popularity are preferentially selected, the final prevalence of information is reduced, the outbreak threshold is increased, and the information may even fail to break out. In addition, the heterogeneity of the degree distribution and the structure of real-world networks do not qualitatively affect the results. Our research can provide theoretical support for the promotion of spreading of information, health-related behaviors, new products, etc.
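    The model's core mechanism, informed nodes contacting neighbors with probability weighted by current popularity, can be sketched as a small simulation (the random-graph construction, the weighting function, and all parameter values are illustrative assumptions, not the paper's exact setup):

```python
import random

def spread(n=200, k=6, alpha=-1.0, steps=50, seed=1):
    """SI-style spreading: each informed node contacts one neighbor per
    step, chosen with probability proportional to (popularity + 1)**alpha,
    where popularity counts a node's cumulative contacts with informed
    neighbors. alpha < 0 prefers low-popularity targets, alpha > 0 high.
    The random graph and the weighting function are illustrative."""
    rng = random.Random(seed)
    neigh = [set() for _ in range(n)]
    for u in range(n):                      # roughly k-regular random graph
        while len(neigh[u]) < k:
            v = rng.randrange(n)
            if v != u:
                neigh[u].add(v)
                neigh[v].add(u)
    informed, popularity = {0}, [0] * n
    for _ in range(steps):
        newly = set()
        for u in informed:
            targets = list(neigh[u])
            weights = [(popularity[v] + 1) ** alpha for v in targets]
            v = rng.choices(targets, weights=weights)[0]
            popularity[v] += 1              # cumulative contact count
            if v not in informed:
                newly.add(v)
        informed |= newly
    return len(informed) / n                # final prevalence

prevalence = spread(alpha=-1.0)  # prefer low-popularity neighbors
assert 0.0 < prevalence <= 1.0
```

    Comparing runs with negative versus positive alpha illustrates the paper's contrast between low- and high-popularity preferential selection.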

  15. Information Integration; The process of integration, evolution and versioning

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time consuming job. Automating this process would assist the user in its task. Integration

  16. Process Knowledge Summary Report for Materials and Fuels Complex Contact-Handled Transuranic Debris Waste

    Energy Technology Data Exchange (ETDEWEB)

    R. P. Grant; P. J. Crane; S. Butler; M. A. Henry

    2010-02-01

    This Process Knowledge Summary Report summarizes the information collected to satisfy the transportation and waste acceptance requirements for the transfer of transuranic (TRU) waste between the Materials and Fuels Complex (MFC) and the Advanced Mixed Waste Treatment Project (AMWTP). The information collected includes documentation that addresses the requirements for AMWTP and the applicable portion of their Resource Conservation and Recovery Act permits for receipt and treatment of TRU debris waste in AMWTP. This report has been prepared for contact-handled TRU debris waste generated by the Idaho National Laboratory at MFC. The TRU debris waste will be shipped to AMWTP for purposes of supercompaction. This Process Knowledge Summary Report includes information regarding, but not limited to, the generation process, the physical form, radiological characteristics, and chemical contaminants of the TRU debris waste, prohibited items, and packaging configuration. This report, along with the referenced supporting documents, will create a defensible and auditable record for waste originating from MFC.

  17. Why genetic information processing could have a quantum basis

    Indian Academy of Sciences (India)

    Unknown

    Centre for Theoretical Studies and Supercomputer Education and Research Centre, ... the parent to the offspring, sensory information conveyed by the sense organ to the .... The task involved in genetic information processing is. ASSEMBLY.

  18. Process-aware information system development for the healthcare domain : consistency, reliability and effectiveness

    NARCIS (Netherlands)

    Mans, R.S.; Aalst, van der W.M.P.; Russell, N.C.; Bakker, P.J.M.; Moleman, A.J.; Rinderle-Ma, S.; Sadiq, S.; Leymann, F.

    2010-01-01

    Optimal support for complex healthcare processes cannot be provided by a single out-of-the-box Process-Aware Information System and necessitates the construction of customized applications based on these systems. In order to allow for the seamless integration of the new technology into the existing

  19. Information search and decision making: effects of age and complexity on strategy use.

    Science.gov (United States)

    Queen, Tara L; Hess, Thomas M; Ennis, Gilda E; Dowd, Keith; Grühn, Daniel

    2012-12-01

    The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults' performance. Participants utilized 2 decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants' preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and that this ability may benefit from accrued knowledge and experience. 2013 APA, all rights reserved

  20. Usage of information safety requirements in improving tube bending process

    Science.gov (United States)

    Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.

    2018-05-01

    This article is devoted to improving the analysis of a technological process through the implementation of information security requirements. The aim of this research is to analyse how implementing information technology can increase the competitiveness of aircraft industry enterprises, using the tube bending technological process as an example. The article analyzes the kinds of tube bending and current techniques. In addition, an analysis of potential risks in the tube bending technological process is carried out in terms of information security.

  1. Living is information processing: from molecules to global systems

    OpenAIRE

    Farnsworth, Keith D.; Nelson, John; Gershenson, Carlos

    2012-01-01

    We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function - to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, info...

  2. Using evaluation to adapt health information outreach to the complex environments of community-based organizations.

    Science.gov (United States)

    Olney, Cynthia A

    2005-10-01

    After arguing that most community-based organizations (CBOs) function as complex adaptive systems, this white paper describes the evaluation goals, questions, indicators, and methods most important at different stages of community-based health information outreach. This paper presents the basic characteristics of complex adaptive systems and argues that the typical CBO can be considered this type of system. It then presents evaluation as a tool for helping outreach teams adapt their outreach efforts to the CBO environment and thus maximize success. Finally, it describes the goals, questions, indicators, and methods most important or helpful at each stage of evaluation (community assessment, needs assessment and planning, process evaluation, and outcomes assessment). Literature from complex adaptive systems as applied to health care, business, and evaluation settings is presented. Evaluation models and applications, particularly those based on participatory approaches, are presented as methods for maximizing the effectiveness of evaluation in dynamic CBO environments. If one accepts that CBOs function as complex adaptive systems, characterized by dynamic relationships among many agents, influences, and forces, then effective evaluation at the stages of community assessment, needs assessment and planning, process evaluation, and outcomes assessment is critical to outreach success.

  3. Usefulness of surgical complexity classification index in cataract surgery process.

    Science.gov (United States)

    Salazar Méndez, R; Cuesta García, M; Llaneza Velasco, M E; Rodríguez Villa, S; Cubillas Martín, M; Alonso Álvarez, C M

    2016-06-01

    To evaluate the usefulness of the surgical complexity classification index (SCCI) to predict the degree of surgical difficulty in cataract surgery. This retrospective study includes data collected between January 2013 and December 2014 from patients who underwent cataract extraction by phacoemulsification at our hospital. A sample size of 159 patients was obtained by simple random sampling (P=.5, 10% accuracy, 95% confidence). The main variables were: recording and value of SCCI in the electronic medical record (EMR), presence of exfoliation syndrome (XFS), criteria for inclusion in the surgical waiting list (SWL), and functional results. SCCI was classified into 7 categories (range: 1-4) according to predictors of technical difficulty, which was indirectly estimated in terms of surgical time (ST). All statistical analyses were performed using SPSS v15.0 statistical software. Prevalence of XFS was 18.2% (95%CI: 11.9-24.5). In terms of quality indicators in the cataract surgery process, 96.8% of patients met at least one of the criteria to be included in the SWL, and 98.1% gained ≥2 Snellen lines. The SCCI was recorded in the EMR of 98.1% of patients, and it was grouped for study into 2 categories: high and low surgical complexity. Statistically significant differences in the distribution of ST were found depending on the assigned SCCI (P<…). © Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.

  4. [Effect of the microencapsulation process parameters piroxicam by complex coacervation].

    Science.gov (United States)

    Lamoudi, L; Chaumeil, J-C; Daoud, K

    2015-01-01

    The gelatin-acacia system is used for the microencapsulation of piroxicam by complex coacervation. The effects of some formulation and process parameters, namely the gelatin/gum acacia ratio, the core/wall ratio, the concentration of crosslinking agent and the crosslinking time, are studied. The properties of the microcapsules are evaluated. The results showed that the microcapsules have a spherical shape, a coacervation efficiency greater than 70%, an average diameter less than 250 microns and good stability; the best values are obtained for a gelatin/acacia ratio of 5/3, a core/wall ratio of 1/4, an amount of 2 mL of crosslinking agent and a crosslinking time of 60 minutes. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  5. Temporal Information Partitioning Networks (TIPNets): Characterizing emergent behavior in complex ecohydrologic systems

    Science.gov (United States)

    Goodwell, Allison; Kumar, Praveen

    2017-04-01

    Within an ecosystem, components of the atmosphere, vegetation, and the root-soil system participate in forcing and feedback reactions at varying time scales and intensities. These interactions constitute a complex network that exhibits behavioral shifts due to perturbations ranging from weather events to long-term drought or land use change. However, it is challenging to characterize this shifting network due to multiple drivers, non-linear interactions, and synchronization due to feedback. To overcome these issues, we implement a process network approach where eco-hydrologic time-series variables are nodes and information measures are links. We introduce a Temporal Information Partition Network (TIPNet) framework in which multivariate lagged mutual information between source and target nodes is decomposed into synergistic, redundant, and unique components, each of which reveals different aspects of interactions within the network. We use methods to compute information measures given as few as 200 data points to construct TIPNets based on 1-minute weather station data (radiation Rg, air temperature Ta, wind speed WS, relative humidity RH, precipitation PPT, and leaf wetness LWet) from Central Illinois during the growing season of 2015. We assess temporal shifts in network behavior for various weather conditions and over the growing season. We find that wet time periods are associated with complex and synergistic network structures compared to dry conditions, and that seasonal network patterns reveal responses to vegetation growth and rainfall trends. This framework is applicable to study a broad range of complex systems composed of multiple interacting components, and may aid process understanding, model improvement, and resilience and vulnerability assessments.
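    The links of such a network are information measures between lagged time-series nodes. A plug-in estimate of lagged mutual information between two binned series can be sketched as follows (the estimator and toy series are simplifying assumptions; the synergistic/redundant/unique decomposition itself is not shown):

```python
import math
from collections import Counter

def lagged_mutual_info(source, target, lag=1):
    """Mutual information I(source_{t-lag}; target_t) in bits between
    two discretized (binned) series, via a plug-in count estimate."""
    pairs = list(zip(source[:len(source) - lag], target[lag:]))
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(p[0] for p in pairs)
    py = Counter(p[1] for p in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# A target that copies its source one step later shares nearly a full
# bit of information at lag 1, and far less at lag 0.
src = [0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0] * 10
tgt = src[:1] + src[:-1]  # tgt[t] = src[t-1]
assert lagged_mutual_info(src, tgt, lag=1) > lagged_mutual_info(src, tgt, lag=0)
```

    In a TIPNet-style analysis, measures of this kind would be computed between many variable pairs and lags, then decomposed into synergistic, redundant, and unique components.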

  6. Quantum information processing with graph states

    International Nuclear Information System (INIS)

    Schlingemann, Dirk-Michael

    2005-04-01

    Graph states are multiparticle states which are associated with graphs. Each vertex of the graph corresponds to a single system or particle. The links describe quantum correlations (entanglement) between pairs of connected particles. Graph states were initiated independently by two research groups: On the one hand, graph states were introduced by Briegel and Raussendorf as a resource for a new model of one-way quantum computing, where algorithms are implemented by a sequence of measurements at single particles. On the other hand, graph states were developed by the author of this thesis and Reinhard Werner in Braunschweig, as a tool to build quantum error correcting codes, called graph codes. The connection between the two approaches was fully realized in close cooperation of both research groups. This habilitation thesis provides a survey of the theory of graph codes, focussing mainly, but not exclusively, on the author's own research work. We present the theoretical and mathematical background for the analysis of graph codes. The concept of one-way quantum computing for general graph states is discussed. We explicitly show how to realize the encoding and decoding device of a graph code on a one-way quantum computer. This kind of implementation is to be seen as a mathematical description of a quantum memory device. In addition to that, we investigate interaction processes, which enable the creation of graph states on very large systems. Particular graph states can be created, for instance, by an Ising type interaction between next-neighbor particles which sit at the points of an infinitely extended cubic lattice. Based on the theory of quantum cellular automata, we give a constructive characterization of general interactions which create a translationally invariant graph state. (orig.)

  7. Structural Information Inference from Lanthanoid Complexing Systems: Photoluminescence Studies on Isolated Ions

    Science.gov (United States)

    Greisch, Jean Francois; Harding, Michael E.; Chmela, Jiri; Klopper, Willem M.; Schooss, Detlef; Kappes, Manfred M.

    2016-06-01

    The application of lanthanoid complexes ranges from photovoltaics and light-emitting diodes to quantum memories and biological assays. Rationalization of their design requires a thorough understanding of intramolecular processes such as energy transfer, charge transfer, and non-radiative decay involving their subunits. Characterization of the excited states of such complexes considerably benefits from mass spectrometric methods since the associated optical transitions and processes are strongly affected by stoichiometry, symmetry, and overall charge state. We report herein spectroscopic measurements on ensembles of ions trapped in the gas phase and soft-landed in neon matrices. Their interpretation is considerably facilitated by direct comparison with computations. The combination of energy- and time-resolved measurements on isolated species with density functional as well as ligand-field and Franck-Condon computations enables us to infer structural as well as dynamical information about the species studied. The approach is first illustrated for sets of model lanthanoid complexes whose structure and electronic properties are systematically varied via the substitution of one component (lanthanoid, alkali, or alkaline-earth ion): (i) systematic dependence of ligand-centered phosphorescence on the lanthanoid(III) promotion energy and its impact on sensitization, and (ii) structural changes induced by the substitution of alkali or alkaline-earth ions in relation to structures inferred using ion mobility spectroscopy. The temperature dependence of sensitization is briefly discussed. The focus is then shifted to measurements involving europium complexes with doxycycline, an antibiotic of the tetracycline family. Besides discussing the complexes' structural and electronic features, we report on their use to monitor enzymatic processes involving hydrogen peroxide or biologically relevant molecules such as adenosine triphosphate (ATP).

  8. Supporting Information Palladium Complexes of a New Type of N ...

    Indian Academy of Sciences (India)

    Prasenjit Ghosh

    Palladium Complexes of a New Type of N-heterocyclic Carbene Ligand Derived From a Tricyclic Triazolooxazine Framework. Manoj Kumar Gangwar, Alok Ch. Kalita and Prasenjit Ghosh*. Department of Chemistry, Indian Institute of Technology Bombay, ... Figure S1. 1H NMR spectrum of the compound 1a in CDCl3.

  9. Photonic Architecture for Scalable Quantum Information Processing in Diamond

    Directory of Open Access Journals (Sweden)

    Kae Nemoto

    2014-08-01

    Full Text Available Physics and information are intimately connected, and the ultimate information processing devices will be those that harness the principles of quantum mechanics. Many physical systems have been identified as candidates for quantum information processing, but none of them are immune from errors. The challenge remains to find a path from the experiments of today to a reliable and scalable quantum computer. Here, we develop an architecture based on a simple module comprising an optical cavity containing a single negatively charged nitrogen vacancy center in diamond. Modules are connected by photons propagating in a fiber-optical network and collectively used to generate a topological cluster state, a robust substrate for quantum information processing. In principle, all processes in the architecture can be deterministic, but current limitations lead to processes that are probabilistic but heralded. We find that the architecture enables large-scale quantum information processing with existing technology.

  10. Holledge gauge failure testing using concurrent information processing algorithm

    International Nuclear Information System (INIS)

    Weeks, G.E.; Daniel, W.E.; Edwards, R.E.; Jannarone, R.J.; Joshi, S.N.; Palakodety, S.S.; Qian, D.

    1996-01-01

    For several decades, computerized information processing systems and human information processing models have developed with a good deal of mutual influence. Any comprehensive psychology text in this decade uses terms that originated in the computer industry, such as "cache" and "memory", to describe human information processing. Likewise, many engineers today are using "artificial intelligence" and "artificial neural network" computing tools that originated as models of human thought to solve industrial problems. This paper concerns a recently developed human information processing model, called "concurrent information processing" (CIP), and a related set of computing tools for solving industrial problems. The problem of focus is adaptive gauge monitoring; the application is pneumatic pressure repeaters (Holledge gauges) used to measure liquid level and density in the Defense Waste Processing Facility and the Integrated DWPF Melter System

  11. Shifts in information processing level: the speed theory of intelligence revisited.

    Science.gov (United States)

    Sircar, S S

    2000-06-01

    A hypothesis is proposed here to reconcile the inconsistencies observed in the IQ-P3 latency relation. The hypothesis stems from the observation that task-induced increase in P3 latency correlates positively with IQ scores. It is hypothesised that: (a) there are several parallel information processing pathways of varying complexity which are associated with the generation of P3 waves of varying latencies; (b) with increasing workload, there is a shift in the 'information processing level' through progressive recruitment of more complex polysynaptic pathways with greater processing power and inhibition of the oligosynaptic pathways; (c) high-IQ subjects have a greater reserve of higher level processing pathways; (d) a given 'task-load' imposes a greater 'mental workload' in subjects with lower IQ than in those with higher IQ. According to this hypothesis, a meaningful comparison of the P3 correlates of IQ is possible only when the information processing level is pushed to its limits.

  12. Creativity, Complexity, and Precision: Information Visualization for (Landscape) Architecture

    DEFF Research Database (Denmark)

    Buscher, Monika; Christensen, Michael; Mogensen, Preben Holst

    2000-01-01

    Drawing on ethnographic studies of (landscape) architects at work, this paper presents a human-centered approach to information visualization. A 3D collaborative electronic workspace allows people to configure, save and browse arrangements of heterogeneous work materials. Spatial arrangements and links are created and maintained as an integral part of ongoing work with `live' documents and objects. The result is an extension of the physical information space of the architects' studio that utilizes the potential of electronic data storage, visualization and network technologies to support work with information in context.

  13. When you talk about "Information processing" what actually do you have in mind?

    OpenAIRE

    Diamant, Emanuel

    2012-01-01

    "Information Processing" is a recently launched buzzword whose meaning is vague and obscure even for the majority of its users. The reason for this is the lack of a suitable definition for the term "information". In my attempt to amend this bizarre situation, I have realized that, following the insights of Kolmogorov's Complexity theory, information can be defined as a description of structures observable in a given data set. Two types of structures could be easily distinguished in every data...

  14. Information Distribution in Complex Systems to Improve Team Performance

    National Research Council Canada - National Science Library

    Sperling, Brian K; Pritchett, Amy; Estrada, Arthur; Adam, Gina E

    2006-01-01

    .... Specifically, this study hypothesizes that providing task specific information to individual team members will improve coordination and decision-making, and therefore team performance, at time-critical tasks...

  15. Dynamics of information diffusion and its applications on complex networks

    Science.gov (United States)

    Zhang, Zi-Ke; Liu, Chuang; Zhan, Xiu-Xiu; Lu, Xin; Zhang, Chu-Xu; Zhang, Yi-Cheng

    2016-09-01

    The ongoing rapid expansion of the World Wide Web (WWW) greatly increases the volume of information transmitted from heterogeneous individuals to various systems. Extensive research on information diffusion has been conducted by a broad range of communities, including social and computer scientists, physicists, and interdisciplinary researchers. Despite substantial theoretical and empirical studies, unification and comparison of different theories and approaches are lacking, which impedes further advances. In this article, we review recent developments in information diffusion and discuss the major challenges. We compare and evaluate available models and algorithms to investigate their physical roles and optimization designs, respectively. Potential impacts and future directions are discussed. We emphasize that information diffusion has great scientific depth and combines diverse research fields, which makes it interesting for physicists as well as interdisciplinary researchers.
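
    The spreading models compared in surveys like this one are often epidemic-style; as a toy illustration (the network, seed node, and rates below are invented for the sketch, not taken from the review), an SIR-type information cascade can be simulated in a few lines:

```python
import random

def simulate_sir(adj, seed, beta=0.3, gamma=0.2, steps=50, rng=None):
    """Simulate SIR-style information spreading on an undirected graph.

    adj:   dict mapping node -> list of neighbours
    seed:  initially informed ("infected") node
    beta:  per-contact transmission probability
    gamma: per-step probability of losing interest ("recovering")
    Returns the set of nodes that were ever informed.
    """
    rng = rng or random.Random(42)
    infected = {seed}
    recovered = set()
    ever_informed = {seed}
    for _ in range(steps):
        if not infected:
            break
        new_infected = set()
        for node in infected:
            for nb in adj[node]:
                if nb not in infected and nb not in recovered:
                    if rng.random() < beta:
                        new_infected.add(nb)
        # Recovery applies to nodes that were already informed this step
        newly_recovered = {n for n in infected if rng.random() < gamma}
        infected = (infected | new_infected) - newly_recovered
        recovered |= newly_recovered
        ever_informed |= new_infected
    return ever_informed

# Toy network: a ring of 6 nodes plus one chord (0-3)
adj = {0: [1, 5, 3], 1: [0, 2], 2: [1, 3], 3: [2, 4, 0], 4: [3, 5], 5: [4, 0]}
reached = simulate_sir(adj, seed=0)
print(f"{len(reached)} of {len(adj)} nodes ever informed")
```

    Sweeping beta and gamma over such a toy graph is enough to reproduce the qualitative threshold behaviour these diffusion models are used to study.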

  16. Process-aware information systems : lessons to be learned from process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Jensen, K.; Aalst, van der W.M.P.

    2009-01-01

    A Process-Aware Information System (PAIS) is a software system that manages and executes operational processes involving people, applications, and/or information sources on the basis of process models. Example PAISs are workflow management systems, case-handling systems, enterprise information

  17. Supporting change processes in design: Complexity, prediction and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Eckert, Claudia M. [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: cme26@cam.ac.uk; Keller, Rene [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: rk313@cam.ac.uk; Earl, Chris [Open University, Department of Design and Innovation, Walton Hall, Milton Keynes MK7 6AA (United Kingdom)]. E-mail: C.F.Earl@open.ac.uk; Clarkson, P. John [Engineering Design Centre, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ (United Kingdom)]. E-mail: pjc10@cam.ac.uk

    2006-12-15

    Change to existing products is fundamental to design processes. New products are often designed through change or modification to existing products. Specific parts or subsystems are changed to similar ones whilst others are directly reused. Design by modification applies particularly to safety critical products where the reuse of existing working parts and subsystems can reduce cost and risk. However change is rarely a matter of just reusing or modifying parts. Changing one part can propagate through the entire design leading to costly rework or jeopardising the integrity of the whole product. This paper characterises product change based on studies in the aerospace and automotive industry and introduces tools to aid designers in understanding the potential effects of change. Two ways of supporting designers are described: probabilistic prediction of the effects of change and visualisation of change propagation through product connectivities. Change propagation has uncertainties which are amplified by the choices designers make in practice as they implement change. Change prediction and visualisation is discussed with reference to complexity in three areas of product development: the structural backcloth of connectivities in the existing product (and its processes), the descriptions of the product used in design and the actions taken to carry out changes.
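
    Probabilistic change prediction in this family of methods (e.g. the Change Prediction Method associated with Clarkson and colleagues) combines the likelihoods of all propagation paths through the product's connectivity structure; a minimal sketch, with an invented three-component direct-change matrix:

```python
def propagation_likelihood(direct, src, dst, max_depth=3):
    """Probability that a change to `src` eventually reaches `dst`,
    combining all simple paths up to max_depth and treating path
    probabilities as independent (a common approximation in change
    prediction methods).

    direct[i][j]: probability that a change in component i directly
    forces a change in component j.
    """
    n = len(direct)

    def paths_from(node, target, depth, visited):
        if depth > max_depth:
            return []
        probs = []
        for nxt in range(n):
            p = direct[node][nxt]
            if p == 0 or nxt in visited:
                continue
            if nxt == target:
                probs.append(p)
            else:
                probs.extend(p * q for q in
                             paths_from(nxt, target, depth + 1, visited | {nxt}))
        return probs

    # Combine independent paths: 1 - product of (1 - p_path)
    risk = 1.0
    for p in paths_from(src, dst, 1, {src}):
        risk *= (1.0 - p)
    return 1.0 - risk

# Hypothetical 3-component product: strong link 0 -> 1 -> 2, weak direct 0 -> 2
direct = [[0.0, 0.5, 0.1],
          [0.0, 0.0, 0.4],
          [0.0, 0.0, 0.0]]
# Paths: 0->2 (0.1) and 0->1->2 (0.5 * 0.4 = 0.2), combined 1 - 0.9*0.8 = 0.28
print(round(propagation_likelihood(direct, 0, 2), 3))  # → 0.28
```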

  18. Motivated information processing in organizational teams: Progress, puzzles, and prospects

    NARCIS (Netherlands)

    Nijstad, B.A.; de Dreu, C.K.W.

    2012-01-01

    Much of the research into group and team functioning looks at groups that perform cognitive tasks, such as decision making, problem solving, and innovation. The Motivated Information Processing in Groups Model (MIP-G; De Dreu, Nijstad, & Van Knippenberg, 2008) conjectures that information processing

  19. Virtual HRD and National Culture: An Information Processing Perspective

    Science.gov (United States)

    Chung, Chih-Hung; Angnakoon, Putthachat; Li, Jessica; Allen, Jeff

    2016-01-01

    Purpose: The purpose of this study is to provide researchers with a better understanding of the cultural impact on information processing in virtual learning environment. Design/methodology/approach: This study uses a causal loop diagram to depict the cultural impact on information processing in the virtual human resource development (VHRD)…

  20. Affect and Persuasion: Effects on Motivation for Information Processing.

    Science.gov (United States)

    Leach, Mark M; Stoltenberg, Cal D.

    The relationship between mood and information processing, particularly when reviewing the Elaboration Likelihood Model of persuasion, lacks conclusive evidence. This study was designed to investigate the hypothesis that information processing would be greater for mood-topic congruence than non mood-topic congruence. Undergraduate students (N=216)…

  1. Attachment in Middle Childhood: Associations with Information Processing

    Science.gov (United States)

    Zimmermann, Peter; Iwanski, Alexandra

    2015-01-01

    Attachment theory suggests that internal working models of self and significant others influence adjustment during development by controlling information processing and self-regulation. We provide a conceptual overview on possible mechanisms linking attachment and information processing and review the current literature in middle childhood.…

  2. Team confidence, motivated information processing, and dynamic group decision making

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Beersma, B.

    2010-01-01

    According to the Motivated Information Processing in Groups (MIP-G) model, groups should perform ambiguous (non-ambiguous) tasks better when they have high (low) epistemic motivation and concomitant tendencies to engage in systematic (heuristic) information processing and exchange. The authors

  3. An analytical approach to managing complex process problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramstad, Kari; Andersen, Espen; Rohde, Hans Christian; Tydal, Trine

    2006-03-15

    The oil companies are continuously investing time and money to ensure optimum regularity on their production facilities. High regularity increases profitability, reduces workload on the offshore organisation and, most importantly, reduces discharges to air and sea. There are a number of mechanisms and tools available in order to achieve high regularity. Most of these are related to maintenance, system integrity, well operations and process conditions. However, all of these tools will only be effective if quick and proper analysis of fluids and deposits is carried out. In fact, analytical backup is a powerful tool for maintaining optimised oil production, and should as such be given high priority. The present Operator (Hydro Oil and Energy) and the Chemical Supplier (MI Production Chemicals) have developed a cooperation to ensure that analytical backup is provided efficiently to the offshore installations. The Operator's Research and Development (R and D) departments and the Chemical Supplier have complementary specialties in both personnel and equipment, and this is utilized to give the best possible service when required by production technologists or operations. In order for the Operator's Research departments, Health, Safety and Environment (HSE) departments and Operations to approve analytical work performed by the Chemical Supplier, a number of analytical tests are carried out following procedures agreed by both companies. In the present paper, three field case examples of analytical cooperation for managing process problems are presented: 1) deposition in a complex platform processing system; 2) contaminated production chemicals; 3) improved monitoring of scale inhibitor, suspended solids and ions. In each case the Research Centre, Operations and the Chemical Supplier have worked closely together to achieve fast solutions and Best Practice. (author) (tk)

  4. Cost information in succeeding stages of the design process

    NARCIS (Netherlands)

    Tempelmans Plat, H.; Deiman, E.P.; Beheshti, M.R.; Zreik, K.

    1993-01-01

    Adequate decision making in the design process needs information about cost consequences over the life of the designed object. In succeeding stages the types of decisions change; as a consequence the type of cost information will differ as well. For each stage cost information about realized

  5. Informative providing of processes of development on industrial enterprises

    OpenAIRE

    Kalinichenko, L.

    2010-01-01

    Information is identified as the basic resource of enterprise activity. Suggestions are offered concerning the selection of information subsystems for strategic, tactical, and operative management. A list of indicators for assessing the information support of an enterprise's functional processes is proposed.

  6. Motivated information processing in group judgement and decision making

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Nijstad, B.A.; van Knippenberg, D.

    2008-01-01

    This article expands the view of groups as information processors into a motivated information processing in groups (MIP-G) model by emphasizing, first, the mixedmotive structure of many group tasks and, second, the idea that individuals engage in more or less deliberate information search and

  7. Motivated information processing in group judgment and decision making

    NARCIS (Netherlands)

    De Dreu, Carsten K. W.; Nijstad, Bernard A.; van Knippenberg, Daan

    This article expands the view of groups as information processors into a motivated information processing in groups (MIP-G) model by emphasizing, first, the mixed-motive structure of many group tasks and, second, the idea that individuals engage in more or less deliberate information search and

  8. 1st International Conference on Cognitive Systems and Information Processing

    CERN Document Server

    Hu, Dewen; Liu, Huaping

    2014-01-01

    "Foundations and Practical Applications of Cognitive Systems and Information Processing" presents selected papers from the First International Conference on Cognitive Systems and Information Processing, held in Beijing, China on December 15-17, 2012 (CSIP2012). The aim of this conference is to bring together experts from different fields of expertise to discuss the state-of-the-art in artificial cognitive systems and advanced information processing, and to present new findings and perspectives on future development. This book introduces multidisciplinary perspectives on the subject areas of Cognitive Systems and Information Processing, including cognitive sciences and technology, autonomous vehicles, cognitive psychology, cognitive metrics, information fusion, image/video understanding, brain-computer interfaces, visual cognitive processing, neural computation, bioinformatics, etc. The book will be beneficial for both researchers and practitioners in the fields of Cognitive Science, Computer Science and Cogni...

  9. Conjoint Management of Business Processes and Information Technologies

    DEFF Research Database (Denmark)

    Siurdyban, Artur

    and improve business processes. As a consequence, there is a growing need to address managerial aspects of the relationships between information technologies and business processes. The aim of this PhD study is to investigate how the practice of conjoint management of business processes and information...... technologies can be supported and improved. The study is organized into five research papers and this summary. Each paper addresses a different aspect of conjoint management of business processes and information technologies, i.e. problem development and managerial practices on software...... and information technologies in a project environment. It states that both elements are intrinsically related and should be designed and considered together. The second case examines the relationships between information technology management and business process management. It discusses the multi-faceted role...

  10. High-Dimensional Quantum Information Processing with Linear Optics

    Science.gov (United States)

    Fitzpatrick, Casey A.

    Quantum information processing (QIP) is an interdisciplinary field concerned with the development of computers and information processing systems that utilize quantum mechanical properties of nature to carry out their function. QIP systems have become vastly more practical since the turn of the century. Today, QIP applications span imaging, cryptographic security, computation, and simulation (quantum systems that mimic other quantum systems). Many important strategies improve quantum versions of classical information system hardware, such as single photon detectors and quantum repeaters. Another more abstract strategy engineers high-dimensional quantum state spaces, so that each successful event carries more information than traditional two-level systems allow. Photonic states in particular bring the added advantages of weak environmental coupling and data transmission near the speed of light, allowing for simpler control and lower system design complexity. In this dissertation, numerous novel, scalable designs for practical high-dimensional linear-optical QIP systems are presented. First, a correlated photon imaging scheme using orbital angular momentum (OAM) states to detect rotational symmetries in objects using measurements, as well as building images out of those interactions is reported. Then, a statistical detection method using chains of OAM superpositions distributed according to the Fibonacci sequence is established and expanded upon. It is shown that the approach gives rise to schemes for sorting, detecting, and generating the recursively defined high-dimensional states on which some quantum cryptographic protocols depend. Finally, an ongoing study based on a generalization of the standard optical multiport for applications in quantum computation and simulation is reported upon. The architecture allows photons to reverse momentum inside the device. This in turn enables realistic implementation of controllable linear-optical scattering vertices for

  11. Toward a Process View in Adoption of Interorganizational Information Systems

    DEFF Research Database (Denmark)

    Brandt, Charlotte J.

    2014-01-01

    , and despite the apparent reason to come to terms with IOIS, the utilization rate is still low. Adoption of IOIS is an interesting process to study, because of the high complexity in successful adoption of IOIS created by the increased number of organizations involved in the adoption process, and because...

  12. Proteomic amino-termini profiling reveals targeting information for protein import into complex plastids.

    Directory of Open Access Journals (Sweden)

    Pitter F Huesgen

    In organisms with complex plastids acquired by secondary endosymbiosis from a photosynthetic eukaryote, the majority of plastid proteins are nuclear-encoded, translated on cytoplasmic ribosomes, and guided across four membranes by a bipartite targeting sequence. In-depth understanding of this vital import process has been impeded by a lack of information about the transit peptide part of this sequence, which mediates transport across the inner three membranes. We determined the mature N-termini of hundreds of proteins from the model diatom Thalassiosira pseudonana, revealing extensive N-terminal modification by acetylation and proteolytic processing in both cytosol and plastid. We identified 63 mature N-termini of nucleus-encoded plastid proteins, deduced their complete transit peptide sequences, determined a consensus motif for their cleavage by the stromal processing peptidase, and found evidence for subsequent processing by a plastid methionine aminopeptidase. The cleavage motif differs from that of higher plants, but is shared with other eukaryotes with complex plastids.

  13. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR) and Information Entropy

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2015-04-01

    Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become the key to project management. This paper first analyzes the current state of research on software complexity systematically and points out problems in existing work. It then proposes a WSR framework of software complexity, which divides software complexity into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may better understand complexity. People are the main source of complexity, but current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors that compose RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of personnel organization hierarchy and the complexity of personnel communication information, and analyzes and validates the soundness and rationality of this measurement method through a large number of cases.
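
    The entropy-based measurement mentioned above can be sketched with Shannon's formula applied to a distribution of communication traffic; the team, channels, and message counts below are hypothetical:

```python
import math
from collections import Counter

def shannon_entropy(counts):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a frequency distribution."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Hypothetical message counts per communication channel in a 4-person team;
# more evenly spread traffic means higher communication complexity.
channel_messages = Counter({("dev", "lead"): 40, ("dev", "qa"): 20,
                            ("qa", "lead"): 20, ("dev", "ops"): 20})
H = shannon_entropy(channel_messages.values())
print(round(H, 3))  # entropy in bits; uniform traffic over 4 channels would give 2.0
```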

  14. Composing complex EXAFS problems with severe information constraints

    International Nuclear Information System (INIS)

    Ravel, B

    2009-01-01

    In recent work, a model for the structural environment of Hg bound to a catalytic DNA sensor was proposed on the basis of EXAFS data analysis. Although severely constrained by limited data quality and scant supporting structural data, a compelling structural model was found which agreed with a similar but less detailed model proposed on the basis of NMR data. I discuss in detail the successes and limitations of the analytical strategy that was implemented in the earlier work. I then speculate on future software requirements needed to make this and similarly complex analytical strategies more available to the wider audience of EXAFS practitioners.

  15. Principles of big data preparing, sharing, and analyzing complex information

    CERN Document Server

    Berman, Jules J

    2013-01-01

    Principles of Big Data helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are stressed throughout the book. The book demonstrates how adept analysts can find relationships among data objects held in disparate Big Data resources, when the data objects are endo

  16. CIRQuL: Complex Information Retrieval Query Language

    NARCIS (Netherlands)

    Mihajlovic, V.; Hiemstra, Djoerd; Apers, Peter M.G.

    In this paper we will present a new framework for the retrieval of XML documents. We will describe the extension for existing query languages (XPath and XQuery) geared toward ranked information retrieval and full-text search in XML documents. Furthermore we will present language models for ranked

  17. Influence of information on behavioral effects in decision processes

    OpenAIRE

    Angelarosa Longo; Viviana Ventre

    2015-01-01

    Rational models in decision processes are marked out by many anomalies, caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single or multi agent decision process each mental model is influenced by the presence, the absence or false information about the problem or about other members of the decision making group. The difficulty in modeling these effects increases because behavioral biases influence also the m...

  18. Survey of Applications of Complex Event Processing (CEP) in Health Domain

    Directory of Open Access Journals (Sweden)

    Nadeem Mahmood

    2017-12-01

    It is always difficult to manage the production of huge amounts of data coming from multiple sources and to extract meaningful information from them for appropriate decisions. When data arrive from various input sources and the required streams of events must be obtained from this complex input network, Complex Event Processing (CEP), one of the core capabilities of Business Intelligence (BI), is an appropriate solution to the above problems. Real-time processing, pattern matching, stream processing, big data management, sensor data processing and many more are among the application areas of CEP. The health domain is itself multi-dimensional, covering the hospital supply chain, OPD management, disease diagnostics, in-patient and out-patient management, emergency care, etc. In this paper, the main focus is the application of CEP in the health domain using sensor devices: how CEP handles health events coming from sensors such as blood pressure, heart rate, fall detection, sugar level, temperature or any other vital signs, and how such systems respond to these events as quickly as possible. Different existing models and applications using CEP are discussed and summarized according to their characteristics.

  19. Effects of spectral complexity and sound duration on automatic complex-sound pitch processing in humans - a mismatch negativity study.

    Science.gov (United States)

    Tervaniemi, M; Schröger, E; Saher, M; Näätänen, R

    2000-08-18

    The pitch of a spectrally rich sound is known to be more easily perceived than that of a sinusoidal tone. The present study compared the importance of spectral complexity and sound duration in facilitated pitch discrimination. The mismatch negativity (MMN), which reflects automatic neural discrimination, was recorded to a 2.5% pitch change in pure tones with only one sinusoidal frequency component (500 Hz) and in spectrally rich tones with three (500-1500 Hz) and five (500-2500 Hz) harmonic partials. During the recordings, subjects concentrated on watching a silent movie. In separate blocks, stimuli were of 100 and 250 ms in duration. The MMN amplitude was enhanced with both spectrally rich sounds when compared with pure tones. The prolonged sound duration did not significantly enhance the MMN. This suggests that increased spectral rather than temporal information facilitates pitch processing of spectrally rich sounds.
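
    Stimuli of this kind (a 500 Hz fundamental plus equal-amplitude harmonic partials) can be synthesized as a sum of sinusoids; a minimal sketch, with the amplitude and sample rate chosen arbitrarily rather than taken from the study:

```python
import math

def harmonic_tone(f0, n_partials, duration, sample_rate=44100, amp=0.2):
    """Synthesize a spectrally rich tone as a list of float samples:
    n_partials harmonics of f0 (f0, 2*f0, 3*f0, ...), equal amplitude."""
    n = int(duration * sample_rate)
    return [amp * sum(math.sin(2 * math.pi * f0 * (k + 1) * t / sample_rate)
                      for k in range(n_partials))
            for t in range(n)]

# A 100 ms standard with three partials (500, 1000, 1500 Hz),
# and a deviant whose pitch is raised by 2.5%
standard = harmonic_tone(500.0, 3, 0.100)
deviant = harmonic_tone(500.0 * 1.025, 3, 0.100)
print(len(standard))  # → 4410 samples at 44.1 kHz
```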

  20. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    CERN Document Server

    Kruger, Uwe

    2012-01-01

    The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike.  Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering.  The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applica
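
    A representative technique in this family is a control chart on Hotelling's T-squared statistic; the sketch below (synthetic two-variable data, invented for illustration and not from the book) flags a sample whose variables individually look plausible but jointly break the in-control correlation:

```python
import random

def fit(samples):
    """Estimate mean and inverse covariance from 2-D in-control training data."""
    n = len(samples)
    m = [sum(s[i] for s in samples) / n for i in (0, 1)]
    c = [[sum((s[i] - m[i]) * (s[j] - m[j]) for s in samples) / (n - 1)
          for j in (0, 1)] for i in (0, 1)]
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    inv = [[c[1][1] / det, -c[0][1] / det],
           [-c[1][0] / det, c[0][0] / det]]
    return m, inv

def t_squared(x, mean, cov_inv):
    """Hotelling's T^2 = (x - mean)^T S^-1 (x - mean) for a 2-D sample."""
    d0, d1 = x[0] - mean[0], x[1] - mean[1]
    return (d0 * (cov_inv[0][0] * d0 + cov_inv[0][1] * d1)
            + d1 * (cov_inv[1][0] * d0 + cov_inv[1][1] * d1))

# In-control process: pressure tracks temperature (hypothetical units)
rng = random.Random(0)
train = []
for _ in range(500):
    temp = rng.gauss(100.0, 1.0)
    pres = 0.5 * temp + rng.gauss(0.0, 0.5)
    train.append((temp, pres))
mean, cov_inv = fit(train)

normal = t_squared((100.5, 50.4), mean, cov_inv)  # consistent with correlation
fault = t_squared((104.0, 48.0), mean, cov_inv)   # pressure too low for its temperature
print(round(normal, 2), round(fault, 2))  # fault gives a far larger T^2
```

    In practice the chart's alarm limit comes from an F-distribution quantile; here the point is only that the joint statistic separates the faulty sample cleanly.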

  1. A modeling process to understand complex system architectures

    Science.gov (United States)

    Robinson, Santiago Balestrini

    2009-12-01

    In recent decades, several tools have been developed by the armed forces, and their contractors, to test the capability of a force. These campaign-level analysis tools, often characterized as constructive simulations, are generally expensive to create and execute, and at best they are extremely difficult to verify and validate. This central observation, that the analysts are relying more and more on constructive simulations to predict the performance of future networks of systems, leads to the two central objectives of this thesis: (1) to enable the quantitative comparison of architectures in terms of their ability to satisfy a capability without resorting to constructive simulations, and (2) when constructive simulations must be created, to quantitatively determine how to spend the modeling effort amongst the different system classes. The first objective led to Hypothesis A, the first main hypothesis, which states that by studying the relationships between the entities that compose an architecture, one can infer how well it will perform a given capability. The method used to test the hypothesis is based on two assumptions: (1) the capability can be defined as a cycle of functions, and (2) it must be possible to estimate the probability that a function-based relationship occurs between any two types of entities. If these two requirements are met, then by creating random functional networks, different architectures can be compared in terms of their ability to satisfy a capability. In order to test this hypothesis, a novel process for creating representative functional networks of large-scale system architectures was developed. The process, named the Digraph Modeling for Architectures (DiMA), was tested by comparing its results to those of complex constructive simulations. Results indicate that if the inputs assigned to DiMA are correct (in the tests they were based on time-averaged data obtained from the ABM), DiMA is able to identify which of any two

  2. Links between attachment and social information processing: examination of intergenerational processes.

    Science.gov (United States)

    Dykas, Matthew J; Ehrlich, Katherine B; Cassidy, Jude

    2011-01-01

    This chapter describes theory and research on intergenerational connections between parents' attachment and children's social information processing, as well as between parents' social information processing and children's attachment. The chapter begins with a discussion of attachment theorists' early insights into the role that social information processing plays in attachment processes. Next, current theory about the mechanisms through which cross-generational links between attachment and social information processing might emerge is presented. The central proposition is that the quality of attachment and/or the social information processing of the parent contributes to the quality of attachment and/or social information processing in the child, and these links emerge through mediating processes related to social learning, open communication, gate-keeping, emotion regulation, and joint attention. A comprehensive review of the literature is then presented. The chapter ends with the presentation of a current theoretical perspective and suggestions for future empirical and clinical endeavors.

  3. [Biohydrometallurgical technology of a complex copper concentrate process].

    Science.gov (United States)

    Murav'ev, M I; Fomchenko, N V; Kondrat'eva, T F

    2011-01-01

    Leaching of sulfide-oxidized copper concentrate of the Udokan deposit ore with a copper content of 37.4% was studied. In the course of treatment in a sulfuric acid solution with pH 1.2, the copper leaching rate was 6.9 g/kg h for 22 h, which allowed extraction of 40.6% of the copper. As a result of subsequent chemical leaching at 80 degrees C during 7 h with a solution of ferric iron sulphate obtained after bio-oxidation by an association of microorganisms, the rate of copper recovery was 52.7 g/kg h. The total copper recovery was 94.5% (over 29 h). Regeneration of the Fe3+ ions was carried out by an association of moderately thermophilic microorganisms, including bacteria of the genus Sulfobacillus and archaea of the genus Ferroplasma acidiphilum, at 1.0 g/l h at 40 degrees C in the presence of 3% solids obtained by chemical leaching of copper concentrate. A technological scheme of a complex copper concentrate process with the use of bacterial-chemical leaching is proposed.

  4. Real-time complex event processing for cloud resources

    Science.gov (United States)

    Adam, M.; Cordeiro, C.; Field, L.; Giordano, D.; Magnoni, L.

    2017-10-01

    The ongoing integration of clouds into the WLCG raises the need for detailed health and performance monitoring of the virtual resources in order to prevent problems of degraded service and interruptions due to undetected failures. When working at scale, the existing monitoring diversity can lead to a metric overflow whereby the operators need to manually collect and correlate data from several monitoring tools and frameworks, resulting in tens of different metrics to be constantly interpreted and analyzed per virtual machine. In this paper we present an ESPER-based standalone application which is able to process complex monitoring events coming from various sources and automatically interpret data in order to issue alarms upon the resources' statuses, without interfering with the actual resources and data sources. We will describe how this application has been used with both commercial and non-commercial cloud activities, allowing the operators to quickly be alarmed and react to misbehaving VMs and LHC experiments' workflows. We will present the pattern analysis mechanisms being used, as well as the surrounding Elastic and REST API interfaces where the alarms are collected and served to users.
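
    Esper rules are written in its EPL query language; purely to illustrate the idea of a time-window rule over monitoring events (not the paper's actual rules), here is a language-agnostic Python stand-in with all event fields and thresholds invented:

```python
from collections import deque

class WindowRule:
    """Raise an alarm when the average of a metric over a sliding time
    window crosses a threshold -- a simplified stand-in for a CEP engine
    rule such as an Esper EPL time-window statement."""

    def __init__(self, metric, window_seconds, threshold):
        self.metric = metric
        self.window = window_seconds
        self.threshold = threshold
        self.events = deque()  # (timestamp, value) pairs inside the window

    def push(self, event):
        """Feed one monitoring event; return an alarm dict or None."""
        if event["metric"] != self.metric:
            return None
        self.events.append((event["ts"], event["value"]))
        # Evict events that have fallen out of the sliding window
        while self.events and self.events[0][0] <= event["ts"] - self.window:
            self.events.popleft()
        avg = sum(v for _, v in self.events) / len(self.events)
        if avg > self.threshold:
            return {"vm": event["vm"], "metric": self.metric, "avg": avg}
        return None

# Hypothetical stream of per-VM metrics: alarm on 60 s average CPU load > 0.9
rule = WindowRule("cpu_load", window_seconds=60, threshold=0.9)
stream = [
    {"ts": 0,  "vm": "vm-42", "metric": "cpu_load", "value": 0.5},
    {"ts": 30, "vm": "vm-42", "metric": "cpu_load", "value": 0.95},
    {"ts": 70, "vm": "vm-42", "metric": "cpu_load", "value": 0.97},
]
alarms = [a for a in map(rule.push, stream) if a]
print(alarms)  # one alarm: the 0.5 sample has left the window by ts=70
```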

  5. Information processing capacity in psychopathy: Effects of anomalous attention.

    Science.gov (United States)

    Hamilton, Rachel K B; Newman, Joseph P

    2018-03-01

    Hamilton and colleagues (2015) recently proposed that an integrative deficit in psychopathy restricts simultaneous processing, thereby leaving fewer resources available for information encoding, narrowing the scope of attention, and undermining associative processing. The current study evaluated this parallel processing deficit proposal using the Simultaneous-Sequential paradigm. This investigation marks the first a priori test of Hamilton et al.'s theoretical framework. We predicted that psychopathy would be associated with inferior performance (as indexed by lower accuracy and longer response time) on trials requiring simultaneous processing of visual information relative to trials necessitating sequential processing. Results were consistent with these predictions, supporting the proposal that psychopathy is characterized by a reduced capacity to process multicomponent perceptual information concurrently. We discuss the potential implications of impaired simultaneous processing for the conceptualization of the psychopathic deficit. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  6. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    Science.gov (United States)

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  7. Retrieval practice enhances the ability to evaluate complex physiology information.

    Science.gov (United States)

    Dobson, John; Linderholm, Tracy; Perez, Jose

    2018-05-01

    Many investigations have shown that retrieval practice enhances the recall of different types of information, including medical and physiological information, but the effects of the strategy on higher-order thinking, such as evaluation, are less clear. The primary aim of this study was to compare how effectively retrieval practice and repeated studying (i.e. reading) strategies facilitated the evaluation of two research articles that advocated dissimilar conclusions. A secondary aim was to determine whether that comparison was affected by using those same strategies to first learn important contextual information about the articles. Participants were randomly assigned to learn three texts that provided background information about the research articles either by studying them four consecutive times (Text-S) or by studying and then retrieving them two consecutive times (Text-R). Half of both the Text-S and Text-R groups were then randomly assigned to learn two physiology research articles by studying them four consecutive times (Article-S), and the other half learned them by studying and then retrieving them two consecutive times (Article-R). Participants then completed two assessments: the first tested their ability to critique the research articles and the second tested their recall of the background texts. On the article critique assessment, the Article-R groups' mean scores of 33.7 ± 4.7% and 35.4 ± 4.5% (Text-R then Article-R group and Text-S then Article-R group, respectively) were both significantly higher than those of the Article-S groups. Retrieval practice promoted superior critical evaluation of the research articles, and the results also indicated that the strategy enhanced the recall of background information. © 2018 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  8. Thermodynamic aspects of information transfer in complex dynamical systems

    Science.gov (United States)

    Cafaro, Carlo; Ali, Sean Alan; Giffin, Adom

    2016-02-01

    From the Horowitz-Esposito stochastic thermodynamical description of information flows in dynamical systems [J. M. Horowitz and M. Esposito, Phys. Rev. X 4, 031015 (2014), 10.1103/PhysRevX.4.031015], it is known that while the second law of thermodynamics is satisfied by a joint system, the entropic balance for the subsystems is adjusted by a term related to the mutual information exchange rate between the two subsystems. In this article, we present a quantitative discussion of the conceptual link between the Horowitz-Esposito analysis and the Liang-Kleeman work on information transfer between dynamical system components [X. S. Liang and R. Kleeman, Phys. Rev. Lett. 95, 244101 (2005), 10.1103/PhysRevLett.95.244101]. In particular, the entropic balance arguments employed in the two approaches are compared. Notwithstanding all differences between the two formalisms, our work strengthens the Liang-Kleeman heuristic balance reasoning by showing its formal analogy with the recent Horowitz-Esposito thermodynamic balance arguments.
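The entropic balance referred to above can be written schematically. The form below follows the general structure of the Horowitz-Esposito bipartite result as a sketch only; exact signs and conventions should be taken from the cited papers:

```latex
% Schematic subsystem entropy balance for a bipartite system (X, Y):
% the second law for subsystem X alone acquires an information-flow term.
\sigma^{X} \;=\; \frac{\mathrm{d}S^{X}}{\mathrm{d}t}
\;+\; \beta\,\dot{Q}^{X} \;-\; \dot{I}^{X} \;\geq\; 0
% Here \dot{I}^{X} is the rate at which the dynamics of X change the mutual
% information I(X;Y) = S^{X} + S^{Y} - S^{XY} shared between the subsystems,
% and \beta\,\dot{Q}^{X} is the entropy flow to the reservoir coupled to X.
```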

  9. Motivation within the Information Processing Model of Foreign Language Learning

    Science.gov (United States)

    Manolopoulou-Sergi, Eleni

    2004-01-01

    The present article highlights the importance of the motivational construct for the foreign language learning (FLL) process. More specifically, in the present article it is argued that motivation is likely to play a significant role at all three stages of the FLL process as they are discussed within the information processing model of FLL, namely,…

  10. Auditory, Tactile, and Audiotactile Information Processing Following Visual Deprivation

    Science.gov (United States)

    Occelli, Valeria; Spence, Charles; Zampini, Massimiliano

    2013-01-01

    We highlight the results of those studies that have investigated the plastic reorganization processes that occur within the human brain as a consequence of visual deprivation, as well as how these processes give rise to behaviorally observable changes in the perceptual processing of auditory and tactile information. We review the evidence showing…

  11. Information mining in weighted complex networks with nonlinear rating projection

    Science.gov (United States)

    Liao, Hao; Zeng, An; Zhou, Mingyang; Mao, Rui; Wang, Bing-Hong

    2017-10-01

    Weighted rating networks are commonly used by e-commerce providers nowadays. In order to generate an objective ranking of online items' quality according to users' ratings, many sophisticated algorithms have been proposed in the complex networks domain. In this paper, instead of proposing new algorithms we focus on a more fundamental problem: the nonlinear rating projection. The basic idea is that even though the rating values given by users are linearly separated, the real preference of users to items between the different given values is nonlinear. We thus design an approach to project the original ratings of users to more representative values. This approach can be regarded as a data pretreatment method. Simulation in both artificial and real networks shows that the performance of the ranking algorithms can be improved when the projected ratings are used.
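One simple way to realize such a nonlinear projection is a quantile (cumulative-frequency) transform: each raw rating level is mapped to the midpoint of its empirical probability mass, so that levels users rarely distinguish end up closer together. This is an illustrative pretreatment sketch, not the exact projection proposed in the paper:

```python
from collections import Counter

def project_ratings(ratings):
    """Map linearly spaced rating levels (e.g. 1..5) to nonlinear values
    based on their empirical cumulative frequency (midpoint quantile).
    Illustrative data-pretreatment sketch, not the paper's exact scheme."""
    counts = Counter(ratings)
    total = len(ratings)
    projected, cum = {}, 0
    for level in sorted(counts):
        c = counts[level]
        projected[level] = (cum + c / 2) / total  # midpoint of this level's mass
        cum += c
    return projected

# Skewed toward high ratings, as is typical of e-commerce data.
raw = [5, 5, 5, 5, 4, 4, 3, 2, 1, 1]
mapping = project_ratings(raw)
```

The projected values remain monotone in the raw levels, but their spacing now reflects how users actually use the scale (here, the crowded top levels spread apart).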

  12. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from the information-processing view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; we then develop an intra-modular network with a modeling algorithm that maps nodes and edges, and delineate the bi-modular network in terms of intra-modular and inter-modular structure. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening with information-processing algorithms. The theoretical analysis and the simulation results show that the model accords with memory phenomena from the information-processing view. PMID:27876847

  13. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from the information-processing view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; we then develop an intra-modular network with a modeling algorithm that maps nodes and edges, and delineate the bi-modular network in terms of intra-modular and inter-modular structure. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening with information-processing algorithms. The theoretical analysis and the simulation results show that the model accords with memory phenomena from the information-processing view.

  14. Horizontal information drives the behavioural signatures of face processing

    Directory of Open Access Journals (Sweden)

    Valerie Goffaux

    2010-09-01

    Full Text Available Recent psychophysical evidence indicates that the vertical arrangement of horizontal information is particularly important for encoding facial identity. In this paper we extend this notion to examine the role that information at different (particularly cardinal) orientations might play in a number of established phenomena, each a behavioural “signature” of face processing. In particular we consider (a) the face inversion effect (FIE), (b) the facial identity after-effect, (c) face-matching across viewpoint, and (d) interactive, so-called holistic, processing of face parts. We report that filtering faces to remove all but the horizontal information largely preserves these effects but, conversely, retaining vertical information generally diminishes or abolishes them. We conclude that preferential processing of horizontal information is a central feature of human face processing that supports many of the behavioural signatures of this critical visual operation.
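The orientation filtering used in such studies can be sketched as a wedge filter in the Fourier domain: a horizontal grating varies only along the vertical image axis, so "horizontal information" lives near the fy axis of the 2-D spectrum. The bandwidth and construction below are illustrative assumptions, not the exact stimuli of the study:

```python
import numpy as np

def keep_horizontal(image, half_bandwidth_deg=20):
    """Wedge filter in the Fourier domain retaining only image structure
    carried by near-horizontal orientations. A horizontal grating varies
    along y only, so its spectral energy sits near the fy axis (fx ~ 0);
    we keep a wedge of frequencies around that axis. Illustrative sketch."""
    f = np.fft.fft2(image)
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    # Angle away from the fy axis: 0 deg for horizontal structure.
    angle = np.degrees(np.arctan2(np.abs(fx), np.abs(fy) + 1e-12))
    mask = angle <= half_bandwidth_deg
    mask[0, 0] = True                       # always keep the DC (mean) term
    return np.real(np.fft.ifft2(f * mask))

# A horizontal grating survives the filter; a vertical one is removed.
y = np.arange(64)[:, None] * np.ones((1, 64))
x = np.ones((64, 1)) * np.arange(64)[None, :]
horiz = np.sin(2 * np.pi * y / 8)   # varies along y: horizontal stripes
vert = np.sin(2 * np.pi * x / 8)    # varies along x: vertical stripes
```

Filtering a face image the same way would preserve its horizontal structure (brows, eyes, mouth) while discarding vertical content.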

  15. Moral judgment as information processing: an integrative review.

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  16. Moral judgment as information processing: an integrative review

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  17. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  18. Theoretical aspects of cellular decision-making and information-processing.

    Science.gov (United States)

    Kobayashi, Tetsuya J; Kamimura, Atsushi

    2012-01-01

    Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial because they enable us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since many cellular behaviors can be regarded as processes of taking specific actions in response to external stimuli, decision-making can cover, and has been used to explain, a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. The efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept that is crucial for understanding biological processes other than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing, focusing on the relation between these two concepts.

  19. Synergistic Information Processing Encrypts Strategic Reasoning in Poker.

    Science.gov (United States)

    Frey, Seth; Albino, Dominic K; Williams, Paul L

    2018-06-14

    There is a tendency in decision-making research to treat uncertainty only as a problem to be overcome. But it is also a feature that can be leveraged, particularly in social interaction. Comparing the behavior of profitable and unprofitable poker players, we reveal a strategic use of information processing that keeps decision makers unpredictable. To win at poker, a player must exploit public signals from others. But using public inputs makes it easier for an observer to reconstruct that player's strategy and predict his or her behavior. How should players trade off between exploiting profitable opportunities and remaining unexploitable themselves? Using a recent multivariate approach to information-theoretic data analysis and 1.75 million hands of online two-player No-Limit Texas Hold'em, we find that the important difference between winning and losing players is not in the amount of information they process, but how they process it. In particular, winning players are better at integrative information processing: creating new information from the interaction between their cards and their opponents' signals. We argue that integrative information processing does not just produce better decisions, it makes decision-making harder for others to reverse engineer, as an expert poker player's cards act like the private key in public-key cryptography. Poker players encrypt their reasoning with the way they process information. The encryption function of integrative information processing makes it possible for players to exploit others while remaining unexploitable. By recognizing the act of information processing as a strategic behavior in its own right, we offer a detailed account of how experts use endemic uncertainty to conceal their intentions in high-stakes competitive environments, and we highlight new opportunities between cognitive science, information theory, and game theory. Copyright © 2018 Cognitive Science Society, Inc.
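A simple proxy for the "integrative" (synergistic) component is interaction information: how much two sources jointly predict an action beyond their individual contributions. The paper uses a finer multivariate decomposition; the sketch below only illustrates the idea on toy distributions, with XOR as the purely synergistic extreme:

```python
from itertools import product
from math import log2

def entropy(p):
    return -sum(q * log2(q) for q in p.values() if q > 0)

def marginal(joint, idxs):
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + p
    return m

def mutual_info(joint, a_idxs, b_idxs):
    # I(A;B) = H(A) + H(B) - H(A,B)
    return (entropy(marginal(joint, a_idxs))
            + entropy(marginal(joint, b_idxs))
            - entropy(marginal(joint, a_idxs + b_idxs)))

def synergy_proxy(joint):
    """Interaction-information proxy for synergy over variables (C, O, A):
    how much C (cards) and O (opponent signal) jointly tell about A (action)
    beyond their individual contributions. A simple stand-in for the
    multivariate decomposition used in the paper."""
    return (mutual_info(joint, (0, 1), (2,))
            - mutual_info(joint, (0,), (2,))
            - mutual_info(joint, (1,), (2,)))

# XOR: neither input alone reveals the action, but together they
# determine it completely -- purely synergistic (1 bit).
xor = {(c, o, c ^ o): 0.25 for c, o in product((0, 1), repeat=2)}
```

For XOR, the proxy equals 1 bit; for an action that simply copies the cards, it is 0, since one input alone carries all the predictive information.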

  20. Risk perception and information processing: the development and validation of a questionnaire to assess self-reported information processing.

    Science.gov (United States)

    Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K

    2012-01-01

    The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition after which they completed a manipulation check and the initial 15-item questionnaire and again two weeks later. The questionnaire was subjected to factor reliability and validity analyses on both measurement times for purposes of cross-validation of the results. A two-factor solution was observed representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.
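Scale development of this kind typically reports internal-consistency reliability for each subscale. As an illustration of the standard computation (using invented scores, not the study's data), Cronbach's alpha can be computed from per-item responses:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(totals)).
    `items` is a list of per-item score lists, aligned across respondents.
    Population variance is used throughout. Illustrative reliability check
    for a questionnaire subscale; the data are hypothetical."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Two perfectly parallel items yield alpha = 1.0.
parallel = [[1, 2, 3, 4], [1, 2, 3, 4]]
alpha = cronbach_alpha(parallel)
```

Partially consistent items give an alpha between 0 and 1, which is how a systematic-processing or heuristic-processing subscale would be evaluated.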

  1. Sources of Information as Determinants of Product and Process Innovation.

    Science.gov (United States)

    Gómez, Jaime; Salazar, Idana; Vargas, Pilar

    2016-01-01

    In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent.

  2. Sources of Information as Determinants of Product and Process Innovation.

    Directory of Open Access Journals (Sweden)

    Jaime Gómez

    Full Text Available In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent.

  3. Sources of Information as Determinants of Product and Process Innovation

    Science.gov (United States)

    2016-01-01

    In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent. PMID:27035456

  4. A Process Model for Goal-Based Information Retrieval

    Directory of Open Access Journals (Sweden)

    Harvey Hyman

    2014-12-01

    Full Text Available In this paper we examine the domain of information search and propose a "goal-based" approach to study search strategy. We describe "goal-based information search" using a framework of Knowledge Discovery. We identify two Information Retrieval (IR) goals using the constructs of Knowledge Acquisition (KA) and Knowledge Explanation (KE). We classify these constructs into two specific information problems: an exploration-exploitation problem and an implicit-explicit problem. Our proposed framework is an extension of prior work in this domain, applying an IR Process Model originally developed for Legal-IR and adapted to Medical-IR. The approach in this paper is guided by the recent ACM-SIG Medical Information Retrieval (MedIR) Workshop definition: "methodologies and technologies that seek to improve access to medical information archives via a process of information retrieval."

  5. State densities and spectrum fluctuations: Information propagation in complex nuclei

    International Nuclear Information System (INIS)

    French, J.B.; Kota, V.K.B.

    1988-01-01

    At excitation energies in nuclei where the state density is unambiguously defined there is a sharp separation between the smoothed spectrum (which defines the density) and fluctuations about it which have recently been studied with a view to understanding some aspects of quantum chaos. We briefly review these two complementary subjects, paying special attention to: the role of the effective interaction in determining the density; the calculation of interacting-particle state and level densities, and of expectation values of interesting operators; the information about the effective nucleon-nucleon interaction which is carried both by the density and the fluctuations. 28 refs., 1 fig

  6. Information processing and routing in wireless sensor networks

    CERN Document Server

    Yu, Yang; Krishnamachari, Bhaskar

    2006-01-01

    This book presents state-of-the-art cross-layer optimization techniques for energy-efficient information processing and routing in wireless sensor networks. Besides providing a survey on this important research area, three specific topics are discussed in detail - information processing in a collocated cluster, information transport over a tree substrate, and information routing for computationally intensive applications. The book covers several important system knobs for cross-layer optimization, including voltage scaling, rate adaptation, and tunable compression. By exploring tradeoffs of en
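One of the cross-layer knobs mentioned, voltage scaling, trades energy against latency: to first order, dynamic energy per cycle grows roughly with V^2, while achievable clock frequency falls roughly linearly with V. The model and all constants below are illustrative assumptions, not taken from the book:

```python
def min_energy_voltage(cycles, deadline_s, f_max_hz=1.0e9, v_max=1.2, v_min=0.6):
    """Pick the lowest supply voltage (scanned in 0.05 V steps) that still
    meets a processing deadline, under a first-order DVFS model:
    frequency ~ (V / v_max) * f_max, dynamic energy per cycle ~ V^2.
    All constants are illustrative, not from any specific hardware."""
    v = v_min
    while v <= v_max + 1e-9:
        f = f_max_hz * v / v_max          # linear frequency model
        if cycles / f <= deadline_s:      # deadline met at this voltage?
            return round(v, 3), cycles * v ** 2   # relative energy units
        v += 0.05
    return v_max, cycles * v_max ** 2     # deadline infeasible: run flat out

# A loose deadline lets a sensor node run at the lowest voltage;
# halving the deadline forces full voltage and roughly 4x the energy.
v_slow, e_slow = min_energy_voltage(cycles=1e8, deadline_s=0.2)
v_fast, e_fast = min_energy_voltage(cycles=1e8, deadline_s=0.1)
```

This is the essence of the energy-latency tradeoff the book optimizes across layers: slack in the application deadline is converted directly into energy savings.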

  7. Regulation of health information processing in an outsourcing environment.

    Science.gov (United States)

    2004-06-01

    Policy makers must consider the work force, technology, cost, and legal implications of their legislative proposals. AHIMA, AAMT, CHIA, and MTIA urge lawmakers to craft regulatory solutions that enforce HIPAA and support advancements in modern health information processing practices that improve the quality and cost of healthcare. We also urge increased investment in health information work force development and implementation of new technologies to advance critical healthcare outcomes--timely, accurate, accessible, and secure information to support patient care. It is essential that state legislatures reinforce the importance of improving information processing solutions for healthcare and not take actions that will produce unintended and detrimental consequences.

  8. Splash, pop, sizzle: Information processing with phononic computing

    Directory of Open Access Journals (Sweden)

    Sophia R. Sklan

    2015-05-01

    Full Text Available Phonons, the quanta of mechanical vibration, are important to the transport of heat and sound in solid materials. Recent advances in the fundamental control of phonons (phononics have brought into prominence the potential role of phonons in information processing. In this review, the many directions of realizing phononic computing and information processing are examined. Given the relative similarity of vibrational transport at different length scales, the related fields of acoustic, phononic, and thermal information processing are all included, as are quantum and classical computer implementations. Connections are made between the fundamental questions in phonon transport and phononic control and the device level approach to diodes, transistors, memory, and logic.

  9. Process system of radiometric and magnetometric aerial information

    International Nuclear Information System (INIS)

    Bazua Rueda, L.F.

    1985-01-01

    The author worked first at the National Institute of Nuclear Energy (Mexico) and then at URAMEX (Uranio Mexicano) from 1975 to 1983, attached to radiometric and magnetometric aerial prospecting projects on the computerized processing of information. During this period the author participated in the development of computing systems and information-processing procedures, and in the definition of mathematical procedures for the geophysical reduction of the calibration equipment data. Drawing on this accumulated experience, this thesis presents aspects concerning the management and operation of computerized information-processing systems. Operation handbooks for most of the modules are presented. Program listings are not included. (Author)

  10. Harvesting Social Signals to Inform Peace Processes Implementation and Monitoring.

    Science.gov (United States)

    Nigam, Aastha; Dambanemuya, Henry K; Joshi, Madhav; Chawla, Nitesh V

    2017-12-01

    Peace processes are complex, protracted, and contentious involving significant bargaining and compromising among various societal and political stakeholders. In civil war terminations, it is pertinent to measure the pulse of the nation to ensure that the peace process is responsive to citizens' concerns. Social media yields tremendous power as a tool for dialogue, debate, organization, and mobilization, thereby adding more complexity to the peace process. Using Colombia's final peace agreement and national referendum as a case study, we investigate the influence of two important indicators: intergroup polarization and public sentiment toward the peace process. We present a detailed linguistic analysis to detect intergroup polarization and a predictive model that leverages Tweet structure, content, and user-based features to predict public sentiment toward the Colombian peace process. We demonstrate that had proaccord stakeholders leveraged public opinion from social media, the outcome of the Colombian referendum could have been different.

  11. Information properties of morphologically complex words modulate brain activity during word reading.

    Science.gov (United States)

    Hakala, Tero; Hultén, Annika; Lehtonen, Minna; Lagus, Krista; Salmelin, Riitta

    2018-06-01

    Neuroimaging studies of the reading process point to functionally distinct stages in word recognition. Yet, current understanding of the operations linked to those various stages is mainly descriptive in nature. Approaches developed in the field of computational linguistics may offer a more quantitative approach for understanding brain dynamics. Our aim was to evaluate whether a statistical model of morphology, with well-defined computational principles, can capture the neural dynamics of reading, using the concept of surprisal from information theory as the common measure. The Morfessor model, created for unsupervised discovery of morphemes, is based on the minimum description length principle and attempts to find optimal units of representation for complex words. In a word recognition task, we correlated brain responses to word surprisal values derived from Morfessor and from other psycholinguistic variables that have been linked with various levels of linguistic abstraction. The magnetoencephalography data analysis focused on spatially, temporally and functionally distinct components of cortical activation observed in reading tasks. The early occipital and occipito-temporal responses were correlated with parameters relating to visual complexity and orthographic properties, whereas the later bilateral superior temporal activation was correlated with whole-word based and morphological models. The results show that the word processing costs estimated by the statistical Morfessor model are relevant for brain dynamics of reading during late processing stages. © 2018 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
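The surprisal measure used as the common currency in this study is simply -log2 of a probability. Under a Morfessor-style unigram assumption, a word's surprisal is the sum of its morphs' surprisals; the lexicon and probabilities below are toy values for illustration:

```python
from math import log2

def word_surprisal(segmentation, morph_probs):
    """Surprisal (in bits) of a word given its segmentation into morphs,
    assuming independent morphs as in a Morfessor-style unigram model:
    -log2 P(word) = sum over morphs of -log2 P(morph).
    The lexicon and probabilities are hypothetical toy values."""
    return sum(-log2(morph_probs[m]) for m in segmentation)

# Toy morph lexicon (probabilities invented for illustration).
probs = {"un": 0.05, "break": 0.01, "able": 0.04, "cat": 0.02}
s_simple = word_surprisal(["cat"], probs)                   # monomorphemic
s_complex = word_surprisal(["un", "break", "able"], probs)  # complex word
```

Morphologically complex, low-frequency words accumulate higher surprisal, which is the quantity the study correlates with the late superior temporal responses.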

  12. BOOK REVIEW: Theory of Neural Information Processing Systems

    Science.gov (United States)

    Galla, Tobias

    2006-04-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the
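The Hopfield model mentioned among the statistical mechanics topics can be stated in a few lines: Hebbian weights store +-1 patterns, and iterated sign updates retrieve a stored pattern from a corrupted cue. A textbook-style sketch (not code from the book):

```python
import numpy as np

def hopfield_recall(patterns, probe, steps=5):
    """Minimal Hopfield network: Hebbian weights from +-1 patterns,
    synchronous sign updates to recall a stored pattern from a noisy probe.
    Textbook illustration of the model the book analyses via replicas."""
    P = np.array(patterns, dtype=float)        # one pattern per row
    W = P.T @ P / P.shape[1]                   # Hebbian weight matrix
    np.fill_diagonal(W, 0.0)                   # no self-coupling
    s = np.array(probe, dtype=float)
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1.0, -1.0)    # threshold update
    return s

pattern = [1, 1, -1, -1, 1, -1, 1, -1]
noisy = [1, 1, -1, -1, 1, -1, 1, 1]            # last bit flipped
recalled = hopfield_recall([pattern], noisy)
```

Starting from the corrupted cue, the dynamics fall into the attractor of the stored pattern, the associative-memory behaviour whose capacity the replica analysis quantifies.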

  13. Toward theoretical understanding of the fertility preservation decision-making process: Examining information processing among young women with cancer

    Science.gov (United States)

    Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2014-01-01

    Background: Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective: The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods: Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results: In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion: Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086

  14. Toward theoretical understanding of the fertility preservation decision-making process: examining information processing among young women with cancer.

    Science.gov (United States)

    Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2013-01-01

    Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.

  15. Information Technology in Small Medium Enterprise: Logistic and Production Processes

    Directory of Open Access Journals (Sweden)

    Maurizio Pighin

    2017-01-01

    Full Text Available This paper presents and discusses a survey describing how small-medium enterprises (SMEs) implement and use their information systems with respect to their logistic and production processes. The study first describes the rationale of the research; it then identifies the characteristics of the companies and their general attitude towards information technology (IT). In the following section the paper presents a set of detailed processes to verify the structure and workflow of companies and how IT supports their processes. In the last part we study the influence of certain company characteristics on the effective use of processes and on the different technological approaches that support the defined logistic and production processes. The novelty of the study, and its interest both in academic and institutional contexts and in the real world, resides in the opportunity to verify and understand the different attitudes of SMEs towards information technology in defining, organizing, planning and controlling their processes.

  16. Design Process Control for Improved Surface Finish of Metal Additive Manufactured Parts of Complex Build Geometry

    Directory of Open Access Journals (Sweden)

    Mikdam Jamal

    2017-12-01

    Full Text Available Metal additive manufacturing (AM) is increasingly used to create complex 3D components at near net shape. However, the surface finish (SF) of the metal AM part is uneven, with surface roughness being variable over the facets of the design. Standard post-processing methods such as grinding and linishing often meet with major challenges in finishing parts of complex shape. This paper reports on research that demonstrated that mass finishing (MF) processes are able to deliver high-quality surface finishes (Ra and Sa) on AM-generated parts of a relatively complex geometry (both internal features and external facets) under select conditions. Four processes were studied in this work: stream finishing, high-energy (HE) centrifuge, drag finishing and disc finishing. Optimisation of the drag finishing process was then studied using a structured design of experiments (DOE). The effects of a range of finishing parameters were evaluated and optimal parameters and conditions were determined. The study established that the proposed method can be successfully applied in drag finishing to optimise the surface roughness in an industrial application and that it is an economical way of obtaining the maximum amount of information in a short period of time with a small number of tests. The study has also provided an important step in helping understand the requirements of MF to deliver AM-generated parts to a target quality finish and cycle time.

  17. Finite-Time Approach to Microeconomic and Information Exchange Processes

    Directory of Open Access Journals (Sweden)

    Serghey A. Amelkin

    2009-07-01

    Full Text Available The finite-time approach allows one to optimize regimes of processes in macrosystems when the duration of the processes is restricted. The driving force of the processes is the difference of intensive variables: temperatures in thermodynamics, values in economics, etc. In microeconomic systems, two counterflow fluxes appear due to this single driving force: the fluxes of goods and of money. Another possible case is two fluxes with the same direction. The processes of information exchange can be described by this formalism.
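
The abstract's central idea, a single driving force producing two counterflow fluxes, can be put in numbers with a toy sketch. The linear flux law, the coefficient `k`, and the midpoint transaction price are assumptions made purely for illustration, not part of the formalism in the article:

```python
# Toy sketch of one driving force producing two counterflow fluxes
# (all dynamics invented for illustration): goods flow one way, money
# the other, both driven by the same difference in valuations.

def exchange_step(p_seller, p_buyer, k=0.5, dt=0.1):
    """Goods flux assumed proportional to the valuation difference; the
    money flux runs in the opposite direction at the transaction price."""
    force = p_buyer - p_seller            # the single driving force
    goods_flux = k * force * dt           # goods flow seller -> buyer
    price = 0.5 * (p_buyer + p_seller)    # assumed midpoint transaction price
    money_flux = -goods_flux * price      # money counterflows buyer -> seller
    return goods_flux, money_flux

g, m = exchange_step(p_seller=2.0, p_buyer=4.0)
print(f"goods flux: {g:+.3f}, money flux: {m:+.3f}")  # opposite signs
```

The two fluxes always carry opposite signs as long as the valuations differ, which is the counterflow structure the abstract describes.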

  18. Science-based information processing in the process control of power stations

    International Nuclear Information System (INIS)

    Weisang, C.

    1992-01-01

    Through the application of specialized systems, future-oriented information processing integrates the sciences of processes, control systems, process control strategies, user behaviour and ergonomics. Improvements in process control can be attained, inter alia, by the preparation of the information content (e.g. by suppressing the flood of signals and replacing it with signals based on substance) and also by an ergonomic representation of the process. (orig.) [de

  19. Prioritisation process for decommissioning of the Iraq former nuclear complex

    International Nuclear Information System (INIS)

    Jarjies, Adnan; Abbas, Mohammed; Fernandes, Horst M.; Coates, Roger

    2008-01-01

    There are a number of sites in Iraq which have been used for nuclear activities and which contain potentially significant amounts of radioactive waste. The principal nuclear site is Al-Tuwaitha, the former nuclear research centre. Many of these sites suffered substantial physical damage during the Gulf Wars and have been subjected to subsequent looting. All require decommissioning in order to ensure both radiological and non-radiological safety. However, it is not possible to undertake the decommissioning of all sites and facilities at the same time. Therefore, a prioritization methodology has been developed in order to aid the decision-making process. The methodology comprises three principal stages of assessment: 1) a quantitative surrogate risk assessment, 2) a range of sensitivity analyses and 3) the inclusion of qualitative modifying factors. A group of five Tuwaitha facilities presented the highest evaluated risk, followed by a middle ranking grouping of Tuwaitha facilities and some other sites, with a relatively large number of lower risk facilities and sites comprising a third group. This initial risk-based order of priority is changed when modifying factors are taken into account. It is necessary to take account of Iraq's isolation from the international nuclear community over the last two decades and the lack of experienced personnel. Therefore it is appropriate to initiate decommissioning operations on selected low risk facilities at Tuwaitha in order to build capacity/experience and prepare for work to be carried out in more complex and potentially high hazard facilities. In addition it is appropriate to initiate some prudent precautionary actions relating to some of the higher risk facilities. (author)

  20. Crosstalk between endophytes and a plant host within information-processing networks

    Directory of Open Access Journals (Sweden)

    Kozyrovska N. O.

    2013-05-01

    Full Text Available Plants are heavily populated by pro- and eukaryotic microorganisms and therefore represent tremendous complexity as a biological system. This system exists as an information-processing entity with rather complex processes of communication occurring throughout the individual plant. The plant cellular information-processing network constitutes the foundation for processes like growth, defense, and adaptation to the environment. To date, the molecular mechanisms underlying perception, transfer, analysis, and storage of endogenous and environmental information within the plant remain to be fully understood. The associated microorganisms and their investment in information conditioning are often ignored. Endophytes as plant partners are an indispensable, integrative part of the plant system. Diverse endophytic microorganisms comprise a «normal» microbiota that plays a role in plant immunity and helps the plant system to survive in the environment (providing assistance in defense, nutrition, detoxification, etc.). The role of the endophytic microbiota in the processing of information may be presumed, taking into account plant-microbial co-evolution and empirical data. Since literature is beginning to emerge on this topic, in this article I review key works in the field of plant-endophyte interactions in the context of information processing and present an opinion on their putative role in the plant information web under defense and adaptation to changed conditions.

  1. Influence of information on behavioral effects in decision processes

    Directory of Open Access Journals (Sweden)

    Angelarosa Longo

    2015-07-01

    Full Text Available Rational models of decision processes are marked by many anomalies caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single- or multi-agent decision process, each mental model is influenced by the presence, the absence, or the falsity of information about the problem or about other members of the decision-making group. The difficulty of modeling these effects increases because behavioral biases also influence the modeler. Behavioral Operational Research (BOR) studies these influences to create efficient models for defining choices in similar decision processes.

  2. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    Science.gov (United States)

    Lyu, Zhe; Whitman, William B

    2017-01-01

    Current evolutionary models suggest that Eukaryotes originated from within Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be of one kind and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that the evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies of the complex information processing system. Therefore, Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  3. Physics Colloquium: The optical route to quantum information processing

    CERN Multimedia

    Université de Genève

    2011-01-01

    Geneva University Physics Department 24, Quai Ernest Ansermet CH-1211 Geneva 4 Monday 11 April 2011 17h00 - Ecole de Physique, Auditoire Stückelberg The optical route to quantum information processing Prof. Terry Rudolph/Imperial College, London Photons are attractive as carriers of quantum information both because they travel, and can thus transmit information, but also because of their good coherence properties and ease in undergoing single-qubit manipulations. The main obstacle to their use in information processing is inducing an effective interaction between them in order to produce entanglement. The most promising approach in photon-based information processing architectures is so-called measurement-based quantum computing. This relies on creating upfront a multi-qubit highly entangled state (the cluster state) which has the remarkable property that, once prepared, it can be used to perform quantum computation by making only single qubit measurements. In this talk I will discuss generically the...
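
As a concrete anchor for the cluster-state idea in the talk abstract, the following hedged NumPy sketch (not tied to any particular quantum-computing framework) prepares the smallest cluster state, CZ|+>|+>, and checks its entanglement via the Schmidt spectrum:

```python
# Illustrative sketch: the two-qubit cluster state CZ |+>|+> mentioned in
# the abstract above, built from explicit state vectors and gates, with
# its entanglement verified through the Schmidt (singular-value) spectrum.

import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # |+> = H|0>
CZ = np.diag([1.0, 1.0, 1.0, -1.0])        # controlled-Z gate on two qubits

cluster = CZ @ np.kron(plus, plus)         # |C2> = CZ |+>|+>

# Schmidt coefficients across the 1|2 cut: singular values of the 2x2
# reshaped amplitude matrix. Two nonzero values mean the state is entangled.
schmidt = np.linalg.svd(cluster.reshape(2, 2), compute_uv=False)
print("amplitudes:", np.round(cluster, 3))
print("Schmidt coefficients:", np.round(schmidt, 3))
```

Both Schmidt coefficients come out equal to 1/sqrt(2), i.e. the two qubits are maximally entangled; it is this upfront entanglement that measurement-based computation consumes through single-qubit measurements.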

  4. Lecturing and Loving It: Applying the Information-Processing Model.

    Science.gov (United States)

    Parker, Jonathan K.

    1993-01-01

    Discusses the benefits of lecturing, when done properly, in high schools. Describes the positive attributes of effective lecturers. Provides a human information-processing model applicable to the task of lecturing to students. (HB)

  5. Monkeys preferentially process body information while viewing affective displays.

    Science.gov (United States)

    Bliss-Moreau, Eliza; Moadab, Gilda; Machado, Christopher J

    2017-08-01

    Despite evolutionary claims about the function of facial behaviors across phylogeny, rarely are those hypotheses tested in a comparative context, that is, by evaluating how nonhuman animals process such behaviors. Further, while increasing evidence indicates that humans make meaning of faces by integrating contextual information, including that from the body, the extent to which nonhuman animals process contextual information during affective displays is unknown. In the present study, we evaluated the extent to which rhesus macaques (Macaca mulatta) process dynamic affective displays of conspecifics that included both facial and body behaviors. Contrary to hypotheses that they would preferentially attend to faces during affective displays, monkeys looked for longest, most frequently, and first at conspecifics' bodies rather than their heads. These findings indicate that macaques, like humans, attend to available contextual information during the processing of affective displays, and that the body may also be providing unique information about affective states. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. INFORMATION SYSTEM OF AUTOMATION OF PREPARATION EDUCATIONAL PROCESS DOCUMENTS

    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko

    2016-01-01

    Full Text Available Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction in the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of activities of an institution. The purpose of this paper is the development of a system that automates the preparation of educational process accounting documents. The article describes the problem of preparing educational process documents. We decided to design and create the information system in the Microsoft Access environment. The result is four types of reports obtained using the developed system. The use of this system now allows the process to be automated and reduces the effort required to prepare accounting documents. All reports were implemented in the Microsoft Excel software product and can be used for further analysis and processing.

  7. Information Processing Theories and the Education of the Gifted.

    Science.gov (United States)

    Rawl, Ruth K.; O'Tuel, Frances S.

    1983-01-01

    The basic assumptions of information processing theories in cognitive psychology are reviewed, and the application of this approach to problem solving in gifted education is considered. Specific implications are cited on problem selection and instruction giving. (CL)

  8. Information processing theory in the early design stages

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2014-01-01

    One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare whether the new knowledge they have gained covers the previous knowledge gap. The new knowledge is then shared within the design team to reduce ambiguity with regard to its meaning and to build a shared understanding, reducing perceived uncertainty. In engineering design, uncertainty plays a key role, particularly in the early design stages. Thus, we propose that Information Processing Theory is suitable to describe designer activity in the early design stages, and offer suggestions for improvements and support.

  9. Bridging the Operational Divide: An Information-Processing Model of Internal Supply Chain Integration

    Science.gov (United States)

    Rosado Feger, Ana L.

    2009-01-01

    Supply Chain Management, the coordination of upstream and downstream flows of product, services, finances, and information from a source to a customer, has risen in prominence over the past fifteen years. The delivery of a product to the consumer is a complex process requiring action from several independent entities. An individual firm consists…

  10. Breakdown of local information processing may underlie isoflurane anesthesia effects.

    Science.gov (United States)

    Wollstadt, Patricia; Sellers, Kristin K; Rudelt, Lucas; Priesemann, Viola; Hutt, Axel; Fröhlich, Flavio; Wibral, Michael

    2017-06-01

    The disruption of coupling between brain areas has been suggested as the mechanism underlying loss of consciousness in anesthesia. This hypothesis has been tested previously by measuring the information transfer between brain areas, and by taking reduced information transfer as a proxy for decoupling. Yet, information transfer is a function of the amount of information available in the information source, such that transfer decreases even for unchanged coupling when less source information is available. Therefore, we reconsidered past interpretations of reduced information transfer as a sign of decoupling, and asked whether impaired local information processing leads to a loss of information transfer. An important prediction of this alternative hypothesis is that changes in locally available information (signal entropy) should be at least as pronounced as changes in information transfer. We tested this prediction by recording local field potentials in two ferrets after administration of isoflurane in concentrations of 0.0%, 0.5%, and 1.0%. We found strong decreases in the source entropy under isoflurane in area V1 and the prefrontal cortex (PFC), as predicted by our alternative hypothesis. The decrease in source entropy was stronger in PFC compared to V1. Information transfer between V1 and PFC was reduced bidirectionally, but with a stronger decrease from PFC to V1. This links the stronger decrease in information transfer to the stronger decrease in source entropy, suggesting that reduced source entropy reduces information transfer. This conclusion fits the observation that the synaptic targets of isoflurane are located in local cortical circuits rather than on the synapses formed by interareal axonal projections. Thus, changes in information transfer under isoflurane seem to be a consequence of changes in local processing more than of decoupling between brain areas. We suggest that source entropy changes must be considered whenever interpreting changes in information
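
The abstract's contrast between locally available information (signal entropy) and information transfer can be made concrete with a small sketch. The binary signals, the history length of one, and the plug-in estimators below are illustrative assumptions, not the study's actual analysis pipeline:

```python
# Hedged illustration: Shannon entropy of a "source" signal and a simple
# plug-in transfer entropy between two binarized signals, the two
# quantities contrasted in the abstract above.

import math
import random
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits) of a discrete symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def transfer_entropy(source, target):
    """TE(source -> target) with history length 1, in bits:
    sum_{y1,y0,x0} p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    pairs_yx = Counter(zip(target[:-1], source[:-1]))
    pairs_yy = Counter(zip(target[1:], target[:-1]))
    singles_y = Counter(target[:-1])
    n = len(target) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_self = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * math.log2(p_cond_full / p_cond_self)
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]  # y is simply x delayed by one step
print(f"H(x)     = {entropy(x):.2f} bits")           # near 1 bit: fair coin
print(f"TE(x->y) = {transfer_entropy(x, y):.2f} bits")  # near 1: x drives y
print(f"TE(y->x) = {transfer_entropy(y, x):.2f} bits")  # near 0: no back-influence
```

Because y is a delayed copy of x, essentially the full bit of source entropy is transferred in one direction and none in the other; lowering the entropy of x would cap TE(x->y) in exactly the way the abstract argues for isoflurane.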

  11. Information processing speed in obstructive sleep apnea syndrome: a review.

    Science.gov (United States)

    Kilpinen, R; Saunamäki, T; Jehkonen, M

    2014-04-01

    To provide a comprehensive review of studies on information processing speed in patients with obstructive sleep apnea syndrome (OSAS) as compared to healthy controls and normative data, and to determine whether continuous positive airway pressure (CPAP) treatment improves information processing speed. A systematic review was performed on studies drawn from Medline and PsycINFO (January 1990-December 2011) and identified from lists of references in these studies. After inclusion criteria, 159 articles were left for abstract review, and after exclusion criteria 44 articles were fully reviewed. The number of patients in the studies reviewed ranged from 10 to 157 and the study samples consisted mainly of men. Half of the studies reported that patients with OSAS showed reduced information processing speed when compared to healthy controls. Reduced information processing speed was seen more often (75%) when compared to norm-referenced data. Psychomotor speed seemed to be particularly liable to change. CPAP treatment improved processing speed, but the improvement was marginal when compared to placebo or conservative treatment. Patients with OSAS are affected by reduced information processing speed, which may persist despite CPAP treatment. Information processing is usually assessed as part of other cognitive functioning, not as a cognitive domain per se. However, it is important to take account of information processing speed when assessing other aspects of cognitive functioning. This will make it possible to determine whether cognitive decline in patients with OSAS is based on lower-level or higher-level cognitive processes or both. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Information processing in the outer retina of fish

    NARCIS (Netherlands)

    Endeman, D.

    2017-01-01

    The retina translates light into neuronal activity. Thus, it renders visual information of the external environment. The retina can only send a limited amount of information to the brain within a given period. To use this amount optimally, light stimuli are strongly processed in the retina. This

  13. Reconciling Sex Differences in Information-Processing and Career Outcomes.

    Science.gov (United States)

    Wolleat, Patricia L.

    1990-01-01

    Information processing theory could be made more sensitive to differences in career outcomes for males and females by (1) examining the nature of the career decision; (2) expanding the notion of information; (3) relating the vocational schema to the gender schema; and (4) noting whether variables are general, sex related, or sex specific. (SK)

  14. Marketing for Special Libraries and Information Centers: The Positioning Process.

    Science.gov (United States)

    Sterngold, Arthur

    1982-01-01

    The positioning process of marketing used by special libraries and information centers involves two key decisions from which other decisions are derived: to which user groups marketing programs and services will be directed; and which information needs will be served. Two cases are discussed and a bibliography is provided. (EJS)

  15. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    Science.gov (United States)

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  16. Quantum information processing beyond ten ion-qubits

    International Nuclear Information System (INIS)

    Monz, T.

    2011-01-01

    Successful processing of quantum information is, to a large degree, based on two aspects: a) the implementation of high-fidelity quantum gates, as well as b) avoiding or suppressing decoherence processes that destroy quantum information. The presented work shows our progress in the field of experimental quantum information processing over the last years: the implementation and characterisation of several quantum operations, amongst others the first realisation of the quantum Toffoli gate in an ion-trap based quantum computer. The creation of entangled states with up to 14 qubits serves as basis for investigations of decoherence processes. Based on the realised quantum operations as well as the knowledge about dominant noise processes in the employed apparatus, entanglement swapping as well as quantum operations within a decoherence-free subspace are demonstrated. (author) [de

  17. Process Control Security in the Cybercrime Information Exchange NICC

    OpenAIRE

    Luiijf, H.A.M.

    2009-01-01

    Detecting, investigating and prosecuting cybercrime? Extremely important, but not really the solution for the problem. Prevention is better! The sectors that have joined the Cybercrime Information Exchange have accepted the challenge of ensuring the effectiveness of the (information) security of process control systems (PCS), including SCADA. This publication makes it clear why it is vital that organizations establish and maintain control over the security of the information and communication...

  18. Information Processing Bias in Post-traumatic Stress Disorder

    OpenAIRE

    Weber, Darren L

    2008-01-01

    This review considers theory and evidence for abnormal information processing in post-traumatic stress disorder (PTSD). Cognitive studies have indicated sensitivity in PTSD for traumatic information, more so than general emotional information. These findings were supported by neuroimaging studies that identify increased brain activity during traumatic cognition, especially in affective networks (including the amygdala, orbitofrontal and anterior cingulate cortex). In theory, it is proposed th...

  19. Morphological evidence for parallel processing of information in rat macula

    Science.gov (United States)

    Ross, M. D.

    1988-01-01

    Study of montages, tracings and reconstructions prepared from a series of 570 consecutive ultrathin sections shows that rat maculas are morphologically organized for parallel processing of linear acceleratory information. Type II cells of one terminal field distribute information to neighboring terminals as well. The findings are examined in light of physiological data which indicate that macular receptor fields have a preferred directional vector, and are interpreted by analogy to a computer technology known as an information network.

  20. The capitalization of the accounting information in the process of stocks analyse

    OpenAIRE

    Ciumag, Anca

    2009-01-01

    The information justifies its importance by the fact that the detailed aspects it contains can lead to significant economies and to the fusion of stocking operations and procedures when it is used in the elaboration of decisions. The most complex structure, as a data basis offered to economic analysts, is represented by accounting, whose ability to cover economic phenomena and processes, as well as the patrimony existence, in analytical and synthetic information, ...

  1. Theory and research in audiology education: understanding and representing complexity through informed methodological decisions.

    Science.gov (United States)

    Ng, Stella L

    2013-05-01

    The discipline of audiology has the opportunity to embark on research in education from an informed perspective, learning from professions that began this journey decades ago. The goal of this article is to position our discipline as a new member in the academic field of health professional education (HPE), with much to learn and contribute. In this article, I discuss the need for theory in informing HPE research. I also stress the importance of balancing our research goals by selecting appropriate methodologies for relevant research questions, to ensure that we respect the complexity of social processes inherent in HPE. Examples of relevant research questions are used to illustrate the need to consider alternative methodologies and to rethink the traditional hierarchy of evidence. I also provide an example of the thought processes and decisions that informed the design of an educational research study using a constructivist grounded theory methodology. As audiology enters the scholarly field of HPE, we need to arm ourselves with some of the knowledge and perspective that informs the field. Thus, we need to broaden our conceptions of what we consider to be appropriate styles of academic writing, relevant research questions, and valid evidence. Also, if we are to embark on qualitative inquiry into audiology education (or other audiology topics), we need to ensure that we conduct this research with an adequate understanding of the theories and methodologies informing such approaches. We must strive to conduct high quality, rigorous qualitative research more often than uninformed, generic qualitative research. These goals are imperative to the advancement of the theoretical landscape of audiology education and evolving the place of audiology in the field of HPE. American Academy of Audiology.

  2. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate a system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters, and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One problem with this approach is that it treats the model as a "black box" and explains model behavior solely through the input-output relationship. Because these models have a high degree of non-linearity, understanding how an input affects an output can be extremely difficult. Operationally, applying this technique to complex process-based models can also be challenging, because such models are generally characterized by a large parameter space. To overcome some of these difficulties, we propose a method of sensitivity analysis applicable to complex process-based models. The method conducts sensitivity analysis at the process level, aiming to determine how sensitive the model output is to variations in the processes. Once the processes that exert the major influence in…
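    The process-level perturbation idea described in this abstract can be sketched as follows; the two processes, their equations, and the perturbation scheme below are hypothetical stand-ins for illustration, not the authors' model:

```python
# Hypothetical two-process ecosystem model illustrating process-level
# sensitivity analysis: each named process is perturbed as a whole (via a
# multiplicative factor on its output), and the normalized change in model
# output is recorded per process rather than per individual parameter.

def photosynthesis(light, factor=1.0):
    return factor * 0.8 * light          # illustrative growth process

def respiration(biomass, factor=1.0):
    return factor * 0.1 * biomass        # illustrative loss process

def model(light, biomass, perturb=None):
    perturb = perturb or {}
    growth = photosynthesis(light, perturb.get("photosynthesis", 1.0))
    loss = respiration(biomass, perturb.get("respiration", 1.0))
    return growth - loss                 # net productivity

def process_sensitivity(light, biomass, delta=0.05):
    """Normalized output change per unit relative perturbation of each process."""
    base = model(light, biomass)
    return {
        proc: (model(light, biomass, {proc: 1.0 + delta}) - base) / (delta * base)
        for proc in ("photosynthesis", "respiration")
    }
```

    Ranking the resulting per-process sensitivities identifies the processes that dominate model behavior without enumerating the full parameter space.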

  3. Process evaluation of complex interventions: Medical Research Council guidance

    OpenAIRE

    Moore, G.F.; Audrey, S.; Barker, M.; Bond, L.; Bonell, C.; Hardeman, W.; Moore, L.; O'Cathain, A.; Tinati, T.; Wight, D.; Baird, J.

    2015-01-01

    Attempts to tackle problems such as smoking and obesity increasingly use complex interventions. These are commonly defined as interventions that comprise multiple interacting components, although additional dimensions of complexity include the difficulty of their implementation and the number of organisational levels they target.1 Randomised controlled trials are regarded as the gold standard for establishing the effectiveness of interventions, when randomisation is feasible. However, effect ...

  4. ORGANIZATION OF INFORMATION INTERACTION OF AIRPORT PRODUCTION PROCESSES

    Directory of Open Access Journals (Sweden)

    Yakov Mikhajlovich Dalinger

    2017-01-01

    Full Text Available The organization of the service production that constitutes airport activity is analyzed. The importance and topicality of solving the problem of information interaction between production processes, as a problem of organizing modern production, are shown. Possibilities and features of constructing the information interaction system as a multi-level hierarchical structure are demonstrated. The airport is considered as an enterprise aimed at service production, where much information must be analyzed in a limited time frame. The production schedule often changes under the influence of many factors, which increases the role of computerization and informatization of production processes and predetermines the automation of production, the creation of an information environment, and the organization of the information interaction needed to realize production processes. An integrated organization form is proposed because it is oriented to the integration of different processes into a unified production system and allows the local goals of particular processes to be coordinated with the global goal of improving the effectiveness of airport activity. The main conditions needed for organizing information interaction between production processes and technological operations are considered, and a list of related problems is determined. Attention is paid to the necessary compatibility of the structure and organization of the interaction system with the conditions of the airline, and to its reflection in the information space of the airline. The usefulness of the integrated organization form of information interaction, based on information exchange between processes and service customers according to the network structure, is explained. The multi-level character of this structure confirms its advantage over other forms; however, it also has a series of features presented…

  5. Supramolecular chemistry: from molecular information towards self-organization and complex matter

    International Nuclear Information System (INIS)

    Lehn, Jean-Marie

    2004-01-01

    Molecular chemistry has developed a wide range of very powerful procedures for constructing ever more sophisticated molecules from atoms linked by covalent bonds. Beyond molecular chemistry lies supramolecular chemistry, which aims at developing highly complex chemical systems from components interacting via non-covalent intermolecular forces. By the appropriate manipulation of these interactions, supramolecular chemistry became progressively the chemistry of molecular information, involving the storage of information at the molecular level, in the structural features, and its retrieval, transfer, and processing at the supramolecular level, through molecular recognition processes operating via specific interactional algorithms. This has paved the way towards apprehending chemistry also as an information science. Numerous receptors capable of recognizing, i.e. selectively binding, specific substrates have been developed, based on the molecular information stored in the interacting species. Suitably functionalized receptors may perform supramolecular catalysis and selective transport processes. In combination with polymolecular organization, recognition opens ways towards the design of molecular and supramolecular devices based on functional (photoactive, electroactive, ionoactive, etc) components. A step beyond preorganization consists in the design of systems undergoing self-organization, i.e. systems capable of spontaneously generating well-defined supramolecular architectures by self-assembly from their components. Self-organization processes, directed by the molecular information stored in the components and read out at the supramolecular level through specific interactions, represent the operation of programmed chemical systems. They have been implemented for the generation of a variety of discrete functional architectures of either organic or inorganic nature. Self-organization processes also give access to advanced supramolecular materials, such as

  6. ADMINISTRATION OF THE INFORMATION AND THE PROCESS OF BANK NEGOTIATION

    Directory of Open Access Journals (Sweden)

    Almir Lindemann

    2009-07-01

    Full Text Available This paper analyzes the quality of information administration, identifying deficiencies in the information systems used in the negotiation process for granting bank credit to small and mid-sized companies, from the business managers' perspective. The results make the deficiencies evident and confirm the need to change the systems of information administration, in order to allow both an improvement in the bank credit negotiation process and greater economic efficiency in the use of available resources.

  7. IMPROVING THE QUALITY OF MAINTENANCE PROCESSES USING INFORMATION TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-06-01

    Full Text Available In essence, the process of maintaining equipment is a support process, because it indirectly contributes to the operational ability of the production process necessary for the supply chain of new value. Taking into account increased levels of automatization and quality, this process becomes more and more significant and, for some branches of industry, even crucial. Because the quality of the entire production process depends more and more on the maintenance process, these processes must be carefully designed and effectively implemented. Various techniques and approaches are at our disposal, such as technical and logistical methods and the intensive application of information-communication technologies. This last approach is presented in this work. It begins with organizational goals, especially quality objectives. Then, maintenance processes and integrated information system structures are defined. Maintenance process quality and improvement processes are defined using a set of performances, with a special emphasis placed on effectiveness and quality economics. At the end of the work, an information system for improving maintenance economics is structured. Besides theoretical analysis, the work also presents results the authors obtained analyzing the food industry, the metal processing industry, and the building materials industry.
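    Effectiveness and quality-economics performances of the kind mentioned above are typically derived from failure and cost records. A minimal sketch using the standard steady-state availability formula; the cost-ratio indicator and its inputs are illustrative assumptions, not the authors' metric set:

```python
def availability(mtbf_hours, mttr_hours):
    # Steady-state availability: the fraction of time equipment is
    # operational, computed from mean time between failures (MTBF)
    # and mean time to repair (MTTR).
    return mtbf_hours / (mtbf_hours + mttr_hours)

def maintenance_cost_ratio(maintenance_cost, production_value):
    # Illustrative quality-economics indicator: maintenance spend
    # per unit of production value delivered.
    return maintenance_cost / production_value
```

    Tracking such indicators over time is one way an information system can expose whether maintenance process improvements actually pay off economically.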

  8. Knowledge acquisition process as an issue in information sciences

    Directory of Open Access Journals (Sweden)

    Boris Bosančić

    2016-07-01

    Full Text Available The paper presents an overview of some problems of information science that are explicitly portrayed in the literature: information explosion, information flood and data deluge, information retrieval and the relevance of information, and finally the problem of scientific communication. The purpose of this paper is to explain why knowledge acquisition can be considered an issue in the information sciences. The existing theoretical foundation within the information sciences, i.e. the DIKW hierarchy and its key concepts - data, information, knowledge and wisdom - is recognized as a symbolic representation as well as the theoretical foundation of the knowledge acquisition process. Moreover, the relationship between the DIKW hierarchy and the knowledge acquisition process seems essential for a stronger foundation of the information sciences in the 'body' of overall human knowledge. In addition, the history of both human and machine knowledge acquisition is considered, along with a proposal that the DIKW hierarchy serve as a symbol of the general knowledge acquisition process, applying equally to human and machine knowledge acquisition. To achieve this goal, it is necessary to modify the existing concept of the DIKW hierarchy; an appropriate modification (one of which is presented in this paper) could result in a much more solid theoretical foundation for the knowledge acquisition process and for the information sciences as a whole. The theoretical assumptions on which the knowledge acquisition process may be established as a problem of information science are presented at the end of the paper. The knowledge acquisition process does not necessarily have to be the subject of epistemology. It may establish a stronger link between the concepts of data and knowledge; furthermore, it can be used in the context of scientific research, but on a more primitive level than conducting…

  9. Pathways from Toddler Information Processing to Adolescent Lexical Proficiency

    Science.gov (United States)

    Rose, Susan A.; Feldman, Judith F.; Jankowski, Jeffery J.

    2015-01-01

    This study examined the relation of 3-year core information-processing abilities to lexical growth and development. The core abilities covered four domains--memory, representational competence (cross-modal transfer), processing speed, and attention. Lexical proficiency was assessed at 3 and 13 years with the Peabody Picture Vocabulary Test (PPVT)…

  10. Temporal Expectation and Information Processing: A Model-Based Analysis

    Science.gov (United States)

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  11. Information Processing and Dynamics in Minimally Cognitive Agents

    Science.gov (United States)

    Beer, Randall D.; Williams, Paul L.

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we…

  12. A Social Information Processing Model of Media Use in Organizations.

    Science.gov (United States)

    Fulk, Janet; And Others

    1987-01-01

    Presents a model to examine how social influence processes affect individuals' attitudes toward communication media and media use behavior, integrating two research areas: media use patterns as the outcome of objectively rational choices and social information processing theory. Asserts (in a synthesis) that media characteristics and attitudes are…

  13. A system of automated processing of deep water hydrological information

    Science.gov (United States)

    Romantsov, V. A.; Dyubkin, I. A.; Klyukbin, L. N.

    1974-01-01

    An automated system for primary and scientific analysis of deep water hydrological information is presented. Primary processing of the data in this system is carried out on a drifting station, which also calculates the parameters of vertical stability of the sea layers, as well as their depths and altitudes. Methods of processing the raw data are described.

  14. Information Architecture without Internal Theory: An Inductive Design Process.

    Science.gov (United States)

    Haverty, Marsha

    2002-01-01

    Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…

  15. Development of technical information processing system(VI)

    International Nuclear Information System (INIS)

    Lee, Jee Hoh; Kim, Tae Hwan; Choi, Kwang; Chung, Hyun Sook; Keum, Jong Yong

    1994-12-01

    This project aims to establish a high-quality information circulation system by developing a serials-control system that improves serials management, from ordering to distribution and availability for R and D, and to advance the quality of the information services needed in R and D through fast retrieval and provision of research information with CD-Net. The results of the project are as follows. 1. The serials management process, from ordering to distribution, achieved higher efficiency through the development of a subscription information system. 2. Systematic control of each issue of serials is achieved by the development of a serials checking system. 3. It is possible to promptly provide researchers with the volume and number information of currently received issues through the improvement of the serials holding information system. 4. Retrieval of research information contained in various CD-ROM databases throughout KAERI-NET is possible through research on construction methods of CD-Net. 2 figs, 25 refs. (Author)

  16. Development of technical information processing system(VI)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jee Hoh; Kim, Tae Hwan; Choi, Kwang; Chung, Hyun Sook; Keum, Jong Yong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-12-01

    This project aims to establish a high-quality information circulation system by developing a serials-control system that improves serials management, from ordering to distribution and availability for R and D, and to advance the quality of the information services needed in R and D through fast retrieval and provision of research information with CD-Net. The results of the project are as follows. 1. The serials management process, from ordering to distribution, achieved higher efficiency through the development of a subscription information system. 2. Systematic control of each issue of serials is achieved by the development of a serials checking system. 3. It is possible to promptly provide researchers with the volume and number information of currently received issues through the improvement of the serials holding information system. 4. Retrieval of research information contained in various CD-ROM databases throughout KAERI-NET is possible through research on construction methods of CD-Net. 2 figs, 25 refs. (Author).

  17. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and to engines that match only events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification, and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams into event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundaries that exist between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.
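    The core idea of evaluating one pattern uniformly over archived and live events can be sketched as a sliding-window match over a merged, time-ordered event sequence. This is a toy illustration under assumed event fields (ts, type) and a made-up Smart Grid pattern, not the SCEPter engine itself:

```python
from collections import deque

# Toy uniform pattern detection over past and future events: a match fires
# when a "voltage_sag" event is followed by an "outage" event within a time
# window, regardless of whether either event came from the archive or the
# live stream. Event field names (ts, type) are illustrative assumptions.

def detect(events, window_seconds=60):
    """Yield (sag_time, outage_time) pairs matching the sag->outage pattern."""
    recent_sags = deque()
    for ev in events:  # events sorted by timestamp: archived first, then live
        t, kind = ev["ts"], ev["type"]
        # Drop sags that have fallen out of the time window.
        while recent_sags and t - recent_sags[0] > window_seconds:
            recent_sags.popleft()
        if kind == "voltage_sag":
            recent_sags.append(t)
        elif kind == "outage":
            for sag_t in recent_sags:
                yield (sag_t, t)

# Archived (historical) and live events feed the same query.
archive = [{"ts": 0, "type": "voltage_sag"}, {"ts": 30, "type": "outage"}]
live = [{"ts": 100, "type": "voltage_sag"}, {"ts": 200, "type": "outage"}]
matches = list(detect(archive + live))  # only the archived pair falls in-window
```

    A real engine must additionally handle the temporal boundary between repository and stream (the abstract's query-rewriting concern) so events landing near the handoff are neither matched twice nor missed.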

  18. Real-time information and processing system for radiation protection

    International Nuclear Information System (INIS)

    Oprea, I.; Oprea, M.; Stoica, M.; Badea, E.; Guta, V.

    1999-01-01

    The real-time information and processing system has as its main task to record, collect, process, and transmit radiation level and weather data; it is proposed for radiation protection, environmental monitoring around nuclear facilities, and civil defence. Such a system can offer mapping, databases, modelling, and communication in order to assess the consequences of nuclear accidents. The system incorporates a number of stationary or mobile radiation monitoring instruments, a weather parameter measuring station, a GIS-based information processing center, and the communication network, all running on a real-time operating system. It provides automatic on-line and off-line data collection, remote diagnostics, and advanced presentation techniques, including a graphically oriented executive support with the ability to respond to an emergency by geographical representation of the hazard zones on the map. The system can be integrated into national or international environmental monitoring systems, being based on local intelligent measuring and transmission units, with simultaneous processing and data presentation using a real-time operating system for PC and a geographical information system (GIS). Such an integrated system is composed of independent applications operating on the same computer, and it can improve the protection of the population and support decision makers' efforts by updating the remote GIS database. All information can be managed directly from the map through multilevel data retrieval and presentation, using on-line dynamic evolution of events, environment information, evacuation optimization, and image and voice processing.

  19. The Evolution Process on Information Technology Outsourcing Relationship

    Directory of Open Access Journals (Sweden)

    Duan Weihua

    2017-01-01

    Full Text Available The information technology outsourcing relationship is one of the key factors in IT outsourcing success. To explore how to manage and promote IT outsourcing relationships, it is necessary to understand their evolution process. First, the types of IT outsourcing are analyzed on the basis of relationship quality and IT outsourcing project level. Second, two evolution process models of the IT outsourcing relationship, based on relationship quality and project level, are proposed. Finally, an IT outsourcing relationship evolution process model is developed, and the development of the IT outsourcing relationship from low to high, under internal and external forces, is explained.

  20. Application of information and communication technology in process reengineering

    Directory of Open Access Journals (Sweden)

    Đurović Aleksandar M.

    2014-01-01

    Full Text Available This paper examines the role of information and communication technologies in process reengineering. A general analysis of processes shows that information and communication technologies improve their efficiency. A reengineering model based on the BPMN 2.0 standard is applied to the process of seeking an internship or job by students of the Faculty of Transport and Traffic Engineering. After defining the technical characteristics and required functionalities, a web/mobile application is proposed, enabling better visibility of traffic engineers to companies seeking that education profile.