WorldWideScience

Sample records for complex information processing

  1. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Оксана Николаевна Ромашкова

    2014-12-01

    This work concerns the information model of an educational complex that includes several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is considered and a matrix management structure is suggested. The basic management information processes of the educational complex are conceptualized.

  2. Quantum-information processing in disordered and complex quantum systems

    International Nuclear Information System (INIS)

    Sen, Aditi; Sen, Ujjwal; Ahufinger, Veronica; Briegel, Hans J.; Sanpera, Anna; Lewenstein, Maciej

    2006-01-01

    We study quantum information processing in complex disordered many-body systems that can be implemented by using lattices of ultracold atomic gases and trapped ions. We demonstrate, first in the short-range case, the generation of entanglement and the local realization of quantum gates in a disordered magnetic model describing a quantum spin glass. We show that in this case it is possible to achieve fidelities of quantum gates higher than in the classical case. Complex systems with long-range interactions, such as ion chains or dipolar atomic gases, can be used to model neural network Hamiltonians. For such systems, where both long-range interactions and disorder appear, it is possible to generate long-range bipartite entanglement. We provide an efficient analytical method to calculate the time evolution of a given initial state, which in turn allows us to calculate its quantum correlations.

  3. Communication complexity and information complexity

    Science.gov (United States)

    Pankratov, Denis

    Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection between information complexity and the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute the exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result not only strengthens the lower bound on the communication complexity of disjointness by making it more exact, but also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information
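
    For reference, the standard definition of internal information complexity from the literature (not restated in this abstract): for a protocol π with transcript Π run on inputs (X, Y) drawn from a distribution μ,

      IC_\mu(\pi) = I(\Pi ; X \mid Y) + I(\Pi ; Y \mid X), \qquad IC_\mu(f) = \inf_{\pi \text{ computing } f} IC_\mu(\pi),

    i.e., the amount each party learns about the other's input from the transcript, with the infimum taken over all protocols that correctly compute f.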

  4. Automated complex for information retrieval and processing in the gamma-resonance spectrometry

    International Nuclear Information System (INIS)

    Belogurov, V.N.; Bylinkin, V.A.

    1977-01-01

    A complex for information retrieval and processing in Moessbauer effect spectrometry is described. The complex consists of a set of 4 precision spectrometers and a program system for the computation of Moessbauer effect spectra. A high velocity accuracy of 0.004 mm/s over 6 months of operation is achieved by introducing an additional negative feedback, whose signal is obtained by comparing the passage time of the electromagnetic vibrator driving rod at the middle of the cycle with the half-period of the operation of the multichannel analyser address register. Information from the 4 spectrometers is analyzed by one analyser via a commutation unit and an equalizer unit. Descriptions and schemes of the spectrometers, as well as the procedure and scheme for calibrating and checking their operation, are given. The system connecting the spectrometers with the BESM-4 computer is described, together with the program complex, which includes programs for information input, check-up, correction and storage in the computer, for the calibration of the spectrometer velocity scale, and for computing gamma-resonance spectra. The operational principles of these programs and their block diagrams are given.

  5. MAIA - Method for Architecture of Information Applied: methodological construct of information processing in complex contexts

    Directory of Open Access Journals (Sweden)

    Ismael de Moura Costa

    2017-04-01

    Introduction: This paper presents the evolution of the MAIA (Method for Architecture of Information Applied), its structure, the results obtained, and three practical applications. Objective: To propose a methodological construct for the treatment of complex information, distinguishing information spaces and revealing the configurations inherent to those spaces. Methodology: The argument is elaborated from theoretical research of an analytical character, using distinction as a way to express concepts. Phenomenology is used as the philosophical position, which considers the correlation between Subject↔Object. The research also considers the notion of interpretation as an integrating element for the definition of concepts. With these postulates, the steps to transform the information spaces are formulated. Results: The article explores how the method is structured to process information in its contexts, starting from a succession of evolutive cycles, divided into moments, which, in their turn, evolve into transformation acts. Conclusions: Besides showing how the method is structured, the article presents its possible applications not only as a scientific method, but also as a configuration tool for information spaces, as well as a generator of ontologies. Last but not least, it presents a brief summary of the analysis made by researchers who have already evaluated the method considering the three aspects mentioned.

  6. Automated information and control complex of hydro-gas endogenous mine processes

    Science.gov (United States)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    The automated information and control complex designed to prevent accidents related to the aerological situation in underground workings, to register individual devices received and handed over, to transmit and display measurement data, and to form preemptive solutions is considered. Examples of the automated workplace of an air-gas control operator using individual devices are given. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained. The conducted studies of these statistical characteristics confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas control points. An adaptive (multivariant) algorithm for processing measurement information from continuous multidimensional quantities and influencing factors has been developed.

  7. Mathematical Analysis of Evolution, Information, and Complexity

    CERN Document Server

    Arendt, Wolfgang

    2009-01-01

    Mathematical Analysis of Evolution, Information, and Complexity deals with the analysis of evolution, information and complexity. The time evolution of systems or processes is a central question in science, and this text covers a broad range of problems including diffusion processes, neuronal networks, quantum theory and cosmology. Bringing together a wide collection of research in mathematics, information theory, physics and other scientific and technical areas, this new title offers elementary and thus easily accessible introductions to the various fields of research addressed in the book.

  8. Processing of spatial and non-spatial information in rats with lesions of the medial and lateral entorhinal cortex: Environmental complexity matters.

    Science.gov (United States)

    Rodo, Christophe; Sargolini, Francesca; Save, Etienne

    2017-03-01

    The entorhinal-hippocampal circuitry has been suggested to play an important role in episodic memory, but the contribution of the entorhinal cortex remains elusive. Predominant theories propose that the medial entorhinal cortex (MEC) processes spatial information whereas the lateral entorhinal cortex (LEC) processes non-spatial information. A recent study using an object exploration task has suggested that the involvement of the MEC and LEC in spatial and non-spatial information processing could be modulated by the amount of information to be processed, i.e. environmental complexity. To address this hypothesis we used an object exploration task in which rats with excitotoxic lesions of the MEC and LEC had to detect spatial and non-spatial novelty among a set of objects, and we varied environmental complexity by decreasing the number of objects or the amount of object diversity. Reducing diversity restored the ability to process spatial and non-spatial information in the MEC and LEC groups, respectively. Reducing the number of objects restored the ability to process non-spatial information in the LEC group but not the ability to process spatial information in the MEC group. The findings indicate that the MEC and LEC are not strictly necessary for spatial and non-spatial processing but that their involvement depends on the complexity of the information to be processed.

  9. Information-processing genes

    International Nuclear Information System (INIS)

    Tahir Shah, K.

    1995-01-01

    There are an estimated 100,000 genes in the human genome, of which 97% is non-coding. Bacteria, on the other hand, have little or no non-coding DNA. The non-coding region includes introns, ALU sequences, satellite DNA, and other segments not expressed as proteins. Why does it exist? Why has nature kept non-coding DNA during the long evolutionary period if it has no role in the development of complex life forms? Is the complexity of a species somehow correlated with the existence of apparently useless sequences? What kind of capability is encoded within such nucleotide sequences that is a necessary, but not a sufficient, condition for the evolution of complex life forms, keeping in mind the C-value paradox and the omnipresence of non-coding segments in higher eukaryotes and also in many archaea and prokaryotes? The physico-chemical description of biological processes is hardware oriented and does not highlight the algorithmic or information-processing aspect. However, an algorithm without its hardware implementation is as useless as hardware without the capability to run an algorithm. The nature and type of computation an information-processing hardware can perform depend only on its algorithm and the architecture that reflects the algorithm. Given that enormously difficult tasks such as high-fidelity replication, transcription, editing and regulation are all achieved within a long linear sequence, it is natural to think that some parts of a genome are involved in these tasks. If some complex algorithms are encoded within these parts, then it is natural to think that non-coding regions contain information-processing algorithms. A comparison between well-known automatic sequences and sequences constructed out of motifs found in all species proves the point: noncoding regions are a sort of ''hardwired'' program, i.e., they are linear representations of information-processing machines. Thus in our model, a noncoding region, e.g., an intron contains a program (or equivalently, it is

  10. Effects of emotional tone and visual complexity on processing health information in prescription drug advertising.

    Science.gov (United States)

    Norris, Rebecca L; Bailey, Rachel L; Bolls, Paul D; Wise, Kevin R

    2012-01-01

    This experiment explored how the emotional tone and visual complexity of direct-to-consumer (DTC) drug advertisements affect the encoding and storage of specific risk and benefit statements about each of the drugs in question. Results are interpreted under the limited capacity model of motivated mediated message processing framework. Findings suggest that DTC drug ads should be pleasantly toned and high in visual complexity in order to maximize encoding and storage of risk and benefit information.

  11. Minimized state complexity of quantum-encoded cryptic processes

    Science.gov (United States)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.
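
    As a rough guide to the quantities compared above (using standard ε-machine/q-machine notation; the paper gives the precise construction), the classical and quantum state complexities are the entropies of the respective stationary state ensembles:

      C_\mu = -\sum_{\sigma} \pi_\sigma \log_2 \pi_\sigma, \qquad C_q(L) = S\big(\rho(L)\big) = -\operatorname{Tr}\!\big[\rho(L)\log_2\rho(L)\big], \quad \rho(L) = \sum_{\sigma} \pi_\sigma\,|\eta_\sigma(L)\rangle\langle\eta_\sigma(L)|,

    where the π_σ are the stationary probabilities of the causal states and the |η_σ(L)⟩ are the length-L quantum codewords; their non-orthogonality is what makes C_q(L) ≤ C_μ.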

  12. Exploiting global information in complex network repair processes

    Institute of Scientific and Technical Information of China (English)

    Tianyu WANG; Jun ZHANG; Sebastian WANDELT

    2017-01-01

    Robustness of complex networks has been studied for decades, with a particular focus on network attack. Research on network repair, on the other hand, has been conducted only very lately, given the even higher complexity and the absence of an effective evaluation metric. A recently proposed network repair strategy is self-healing, which aims to repair networks for larger components at a low cost using only local information. In this paper, we discuss the effectiveness and efficiency of self-healing, which limits network repair to a multi-objective optimization problem and makes it difficult to measure its optimality. This leads us to a new network repair evaluation metric. Since the time complexity of the computation is very high, we devise a greedy ranking strategy. Evaluations on both real-world and random networks show the effectiveness of our new metric and repair strategy. Our study contributes to optimal network repair algorithms and provides a gold standard for future studies on network repair.

  13. Efficient physical embedding of topologically complex information processing networks in brains and computer circuits.

    Directory of Open Access Journals (Sweden)

    Danielle S Bassett

    2010-04-01

    Nervous systems are information processing networks that evolved by natural selection, whereas very large scale integrated (VLSI) computer circuits have evolved by commercially driven technology development. Here we follow the historic intuition that all physical information processing systems will share key organizational properties, such as modularity, that generally confer adaptivity of function. It has long been observed that modular VLSI circuits demonstrate an isometric scaling relationship between the number of processing elements and the number of connections, known as Rent's rule, which is related to the dimensionality of the circuit's interconnect topology and its logical capacity. We show that human brain structural networks, and the nervous system of the nematode C. elegans, also obey Rent's rule, and exhibit some degree of hierarchical modularity. We further show that the estimated Rent exponent of human brain networks, derived from MRI data, can explain the allometric scaling relations between gray and white matter volumes across a wide range of mammalian species, again suggesting that these principles of nervous system design are highly conserved. For each of these fractal modular networks, the dimensionality of the interconnect topology was greater than the 2 or 3 Euclidean dimensions of the space in which it was embedded. This relatively high complexity entailed extra cost in physical wiring: although all networks were economically or cost-efficiently wired, they did not strictly minimize wiring costs. Artificial and biological information processing systems both may evolve to optimize a trade-off between physical cost and topological complexity, resulting in the emergence of homologous principles of economical, fractal and modular design across many different kinds of nervous and computational networks.
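
    In its classical form (standard in the VLSI literature, not restated in the abstract), Rent's rule states that a module containing n processing elements has

      T = t\,n^{p}

    external connections, where t is the average number of terminals per element and 0 ≤ p ≤ 1 is the Rent exponent; a larger p corresponds to a higher-dimensional, more richly connected interconnect topology.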

  14. Further Understanding of Complex Information Processing in Verbal Adolescents and Adults with Autism Spectrum Disorders

    Science.gov (United States)

    Williams, Diane L.; Minshew, Nancy J.; Goldstein, Gerald

    2015-01-01

    More than 20 years ago, Minshew and colleagues proposed the Complex Information Processing model of autism, in which the impairment is characterized as a generalized deficit involving multiple modalities and cognitive domains that depend on distributed cortical systems responsible for higher order abilities. Subsequent behavioral work revealed a…

  15. Information Interaction Criteria Among Students in the Process of Task-Based Information Searching (Role of Objective Complexity and Type of Product)

    Directory of Open Access Journals (Sweden)

    Marziyeh Saeedizadeh

    2016-08-01

    Purpose: Human-information interactions must be considered in order to design Information Retrieval Systems (IRS) interactively. In this regard, the study of users' interactions must be based on their socio-cultural context (specifically, work tasks). Accordingly, this paper aims to explore the information-interaction criteria used by students in the information searching process across different kinds of work tasks. Methodology: This research applies a qualitative method using an exploratory study. The research population consisted of MSc students of Ferdowsi University of Mashhad enrolled in the 2012-13 academic year. In 3 stages of sampling (random stratified, quota, and voluntary sampling), 30 cases were selected. Each of these cases searched 6 different types of simulated work tasks. Interaction criteria were extracted through content analysis of think-aloud reports. The validity of the tools was verified by faculty of KIS at Ferdowsi University of Mashhad. A Krippendorff's alpha of 0.78, based on inter-coder agreement, indicates the dependability of the content analysis. Findings: The findings show that in addition to the 'topic' criterion, other interaction criteria affect the information interaction of users, such as 'search results ranking', 'domain knowledge of the user', 'layout', 'type of information resource', etc. Information-interaction criteria change based on the level of objective complexity and the product of work tasks. Conclusion: Users pay attention to different information-interaction criteria in the information searching process, given the variety of work tasks (level of objective complexity and product). It is therefore necessary to pay attention to work task characteristics in order to design interactive and personalized IR systems.

  16. An information transfer based novel framework for fault root cause tracing of complex electromechanical systems in the processing industry

    Science.gov (United States)

    Wang, Rongxi; Gao, Xu; Gao, Jianmin; Gao, Zhiyong; Kang, Jiani

    2018-02-01

    As one of the most important approaches for analyzing the mechanism of fault pervasion, fault root cause tracing is a powerful and useful tool for detecting the fundamental causes of faults so as to prevent any further propagation and amplification. Focused on the problems arising from the lack of systematic and comprehensive integration, a novel, information transfer-based, data-driven framework for fault root cause tracing of complex electromechanical systems in the processing industry was proposed, taking into consideration the experience and qualitative analysis of conventional fault root cause tracing methods. Firstly, an improved symbolic transfer entropy method was presented to construct a directed-weighted information model for a specific complex electromechanical system based on the information flow. Secondly, considering the feedback mechanisms in complex electromechanical systems, a method for determining the threshold values of weights was developed to explore the disciplines of fault propagation. Lastly, an iterative method was introduced to identify the fault development process. The fault root cause was traced by analyzing the changes in information transfer between the nodes along the fault propagation pathway. An actual fault root cause tracing application on a complex electromechanical system is used to verify the effectiveness of the proposed framework. A unique fault root cause is obtained regardless of the choice of the initial variable. Thus, the proposed framework can be flexibly and effectively used in fault root cause tracing for complex electromechanical systems in the processing industry, and it can form the foundation of system vulnerability analysis and condition prediction, as well as other engineering applications.
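
    As a concrete illustration of the quantity at the core of this framework, the following is a minimal sketch of plain (unimproved) symbolic transfer entropy between two sensor channels; the function names, embedding dimension, and toy data are illustrative, not taken from the paper.

      # Minimal symbolic transfer entropy: symbolize each series by ordinal
      # patterns, then estimate TE_{X->Y} on the symbol sequences (in bits).
      import numpy as np
      from collections import Counter

      def symbolize(x, m=3):
          # each symbol is the permutation ordering m consecutive samples
          return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

      def symbolic_transfer_entropy(x, y, m=3):
          sx, sy = symbolize(x, m), symbolize(y, m)
          n = min(len(sx), len(sy)) - 1
          triples = Counter((sy[t + 1], sy[t], sx[t]) for t in range(n))
          pairs_yy = Counter((sy[t + 1], sy[t]) for t in range(n))
          pairs_yx = Counter((sy[t], sx[t]) for t in range(n))
          singles_y = Counter(sy[t] for t in range(n))
          te = 0.0
          for (y1, y0, x0), c in triples.items():
              p_full = c / pairs_yx[(y0, x0)]               # p(y1 | y0, x0)
              p_self = pairs_yy[(y1, y0)] / singles_y[y0]   # p(y1 | y0)
              te += (c / n) * np.log2(p_full / p_self)
          return te

      # toy usage: x drives y with a one-step lag, so TE(x->y) > TE(y->x);
      # edges of a directed-weighted information model would be weighted so
      rng = np.random.default_rng(0)
      x = rng.normal(size=2000)
      y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)
      print(symbolic_transfer_entropy(x, y), symbolic_transfer_entropy(y, x))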

  17. Clinical Information Systems as the Backbone of a Complex Information Logistics Process: Findings from the Clinical Information Systems Perspective for 2016.

    Science.gov (United States)

    Hackl, W O; Ganslandt, T

    2017-08-01

    Objective: To summarize recent research and to propose a selection of best papers published in 2016 in the field of Clinical Information Systems (CIS). Method: The query used to retrieve the articles for the CIS section of the 2016 edition of the IMIA Yearbook of Medical Informatics was reused. It again aimed at identifying relevant publications in the field of CIS from PubMed and Web of Science and comprised search terms from the Medical Subject Headings (MeSH) catalog as well as additional free text search terms. The retrieved articles were categorized in a multi-pass review carried out by the two section editors. The final selection of candidate papers was then peer-reviewed by Yearbook editors and external reviewers. Based on the review results, the best papers were then chosen at the selection meeting with the IMIA Yearbook editorial board. Text mining, term co-occurrence mapping, and topic modelling techniques were used to get an overview on the content of the retrieved articles. Results: The query was carried out in mid-January 2017, yielding a consolidated result set of 2,190 articles published in 921 different journals. Out of them, 14 papers were nominated as candidate best papers and three of them were finally selected as the best papers of the CIS field. The content analysis of the articles revealed the broad spectrum of topics covered by CIS research. Conclusions: The CIS field is multi-dimensional and complex. It is hard to draw a well-defined outline between CIS and other domains or other sections of the IMIA Yearbook. The trends observed in the previous years are progressing. Clinical information systems are more than just sociotechnical systems for data collection, processing, exchange, presentation, and archiving. They are the backbone of a complex, trans-institutional information logistics process.

  18. Unifying Complexity and Information

    Science.gov (United States)

    Ke, Da-Guan

    2013-04-01

    Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a role as fundamental as the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of the adaptability are not entirely clear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random processes, clearly redefining the boundary between the different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.

  19. Effective Complexity of Stationary Process Realizations

    Directory of Open Access Journals (Sweden)

    Arleta Szkoła

    2011-06-01

    The concept of the effective complexity of an object as the minimal description length of its regularities was initiated by Gell-Mann and Lloyd. The regularities are modeled by means of ensembles, which are probability distributions on finite binary strings. In our previous paper [1] we proposed a definition of effective complexity in precise terms of algorithmic information theory. Here we investigate the effective complexity of binary strings generated by stationary, in general not computable, processes. We show that under not too strong conditions long typical process realizations are effectively simple. Our results become most transparent in the context of coarse effective complexity, which is a modification of the original notion of effective complexity that needs fewer parameters in its definition. A similar modification of the related concept of sophistication has been suggested by Antunes and Fortnow.
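
    Schematically (the exact typicality and total-information conditions are as formalized in [1]), the effective complexity of a string x is

      \mathcal{E}_\Delta(x) = \min\{\, K(E) : x \text{ is typical for the ensemble } E,\; K(E) + H(E) \le K(x) + \Delta \,\},

    where K is Kolmogorov complexity, H is the Shannon entropy of the ensemble, and Δ is a tolerance parameter; the result above says this minimum stays small for long typical realizations of the stationary processes considered.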

  20. The visual illustration of complex process information during abnormal incidents

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Kautto, A.; Norros, L.; Ranta, J.

    1985-01-01

    One of the proposed solutions to the man-process interface problem in nuclear power plants is the integration of a system in the control room that can provide the operator with a display of a minimum set of critical plant parameters defining the safety status of the plant. Such a system was experimentally validated using the Loviisa training simulator during the fall of 1982. The project was a joint effort between Combustion Engineering Inc., the Halden Reactor Project, Imatran Voima Oy and VTT. Alarm systems are used in nuclear power plants to tell the control room operators that an unexpected change in the plant operation state has occurred. One difficulty in using the alarms for checking the actions of the operator is that the conventional way of realizing alarm systems implies that several alarms are active also during normal operation. The coding and representation of alarm information are discussed in the paper. An important trend in control room design is the move away from direct, concrete indication of process parameters towards the use of more abstract/logical representation of information as a basis for plant supervision. Recent advances in computer graphics provide the possibility that, in the future, visual information will be utilized to make the essential dynamics of the process more intelligible. A set of criteria for the use of visual information will be necessary. The paper discusses practical aspects of the realisation of such criteria in the context of a nuclear power plant. The criteria for the decomposition of the process information concerning the sub-goals of safety and availability, and also the tentative results of the conceptualization of a PWR process, are discussed in the paper.

  1. Informational analysis involving application of complex information system

    Science.gov (United States)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied to internal audit involving the integration of the accounting field with the information systems field. Technological advancements can improve the work performed by internal audit. Thus we aim to find, in complex information systems, priorities for the internal audit work of a highly important private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative variables with the matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities in its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures to work toward the attainment of the objectives of the organization.

  2. Complex sound processing during human REM sleep by recovering information from long-term memory as revealed by the mismatch negativity (MMN).

    Science.gov (United States)

    Atienza, M; Cantero, J L

    2001-05-18

    Perceptual learning is thought to be the result of neural changes that take place over a period of several hours or days, allowing information to be transferred to long-term memory. Evidence suggests that contents of long-term memory may improve attentive and pre-attentive sensory processing. Therefore, it is plausible to hypothesize that learning-induced neural changes that develop during wakefulness could improve automatic information processing during human REM sleep. The MMN, an objective measure of the automatic change detection in auditory cortex, was used to evaluate long-term learning effects on pre-attentive processing during wakefulness and REM sleep. When subjects learned to discriminate two complex auditory patterns in wakefulness, an increase in the MMN was obtained in both wake and REM states. The automatic detection of the infrequent complex auditory pattern may therefore be improved in both brain states by reactivating information from long-term memory. These findings suggest that long-term learning-related neural changes are accessible during REM sleep as well.

  3. Neuropsychological study of FASD in a sample of American Indian children: processing simple versus complex information.

    Science.gov (United States)

    Aragón, Alfredo S; Kalberg, Wendy O; Buckley, David; Barela-Scott, Lindsey M; Tabachnick, Barbara G; May, Philip A

    2008-12-01

    Although a large body of literature exists on cognitive functioning in alcohol-exposed children, it is unclear if there is a signature neuropsychological profile in children with Fetal Alcohol Spectrum Disorders (FASD). This study assesses cognitive functioning in children with FASD from several American Indian reservations in the Northern Plains States, and it applies a hierarchical model of simple versus complex information processing to further examine cognitive function. We hypothesized that complex tests would discriminate between children with FASD and culturally similar controls, while children with FASD would perform similar to controls on relatively simple tests. Our sample includes 32 control children and 24 children with a form of FASD [fetal alcohol syndrome (FAS) = 10, partial fetal alcohol syndrome (PFAS) = 14]. The test battery measures general cognitive ability, verbal fluency, executive functioning, memory, and fine-motor skills. Many of the neuropsychological tests produced results consistent with a hierarchical model of simple versus complex processing. The complexity of the tests was determined "a priori" based on the number of cognitive processes involved in them. Multidimensional scaling was used to statistically analyze the accuracy of classifying the neurocognitive tests into a simple versus complex dichotomy. Hierarchical logistic regression models were then used to define the contribution made by complex versus simple tests in predicting the significant differences between children with FASD and controls. Complex test items discriminated better than simple test items. The tests that conformed well to the model were the Verbal Fluency, Progressive Planning Test (PPT), the Lhermitte memory tasks, and the Grooved Pegboard Test (GPT). The FASD-grouped children, when compared with controls, demonstrated impaired performance on letter fluency, while their performance was similar on category fluency. On the more complex PPT trials (problems 5 to

  4. The Logic Process Formalism of the Informational Domain

    Directory of Open Access Journals (Sweden)

    2007-01-01

    The performance of present-day information technologies rests on two main properties: the universality of the structures used and the flexibility of the final user's interfaces. The first determines the potential coverage of the informational domain. The second determines the diversity and efficiency of the processing methods for the procedures being automated. These aspects are of great importance in agriculture and ecology, because both involve complex processes and considerable volumes of information. For example, meteorological processes are part of the ecological ones, as habitats' existential conditions, and are known as a complex prognostic problem. The latter needs considerable computational resources to solve the appropriate equations. Likewise, agriculture, as a controlled activity under strong impact from natural conditions, places the same high requirements on the diversity of structures and the flexibility of information processing.

  5. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). The generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational Basis Splines (NURBS) with multiple levels of detail (Mixed and ReverseLoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of the modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  6. Information processing in complex networks

    OpenAIRE

    Quax, R.

    2013-01-01

    First results of Rick Quax's research suggest that a combination of information theory, network theory and statistical mechanics can lead to a promising theory for predicting the behavior of complex networks. As yet, little theory exists on the behavior of dynamical units connected in a network, such as neurons in a brain network or genes in a gene-regulation network. Quax combines information theory, network theory, and statistical studies and mec...

  7. Towards the understanding of network information processing in biology

    Science.gov (United States)

    Singh, Vijay

    Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near-optimally and quite reliably, even though the sources of signals are highly variable and complex. The work of the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can be identified using coarse-graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse-graining approaches identify features that are essential for certain processes performed by the underlying biological networks. We find that long-range connections in the brain allow for global-scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with different temporal scales. Our observations indicate that the rules in multivariate signal processing are quite different from traditional single-unit signal processing.

  8. Real time information management for improving productivity in metallurgical complexes

    International Nuclear Information System (INIS)

    Bascur, O.A.; Kennedy, J.P.

    1999-01-01

    Applying the latest information technologies in industrial plants has become a serious challenge for management and technical teams. The availability of real-time and historical operations information to identify the most critical parts of the processing system in terms of mechanical integrity is a must for global plant optimization. Expanded use of plant information on the desktop is a standard tool for revenue improvement, cost reduction, and adherence to production constraints. The industrial component desktop supports access to information for process troubleshooting, continuous improvement and innovation by plant and staff personnel. Collaboration between groups enables the implementation of an overall process effectiveness index based on losses due to equipment availability, production and product quality. The key to designing technology is to use the Internet-based technologies created by Microsoft for its marketplace: office automation and the Web. Time-derived variables are used for process analysis, troubleshooting and performance assessment. Connectivity between metallurgical complexes, research centers and their business systems has become a reality. Two case studies of large integrated mining/metallurgical complexes are highlighted. (author)

  9. System model the processing of heterogeneous sensory information in robotized complex

    Science.gov (United States)

    Nikolaev, V.; Titov, V.; Syryamkin, V.

    2018-05-01

    The scope and types of robotic systems consisting of subsystems of the form "heterogeneous sensor data processing subsystem" are analyzed. On the basis of queuing theory, a model is developed that takes into account the unevenness of the intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is obtained to assess the relationship between subsystem performance and uneven flows. The obtained solution is investigated in the range of parameter values of practical interest.

  10. Communication Analysis of Information Complexes.

    Science.gov (United States)

    Malik, M. F.

    Communication analysis is a tool for perceptual assessment of existing or projected information complexes, i.e., an established reality perceived by one or many humans. An information complex could be of a physical nature, such as a building, landscape, city street; or of a pure informational nature, such as a film, television program,…

  11. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    Science.gov (United States)

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhance curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes where CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU yielded a complex with the expected chemical and physical fingerprints for a CUR:CD inclusion complex that maintained the chemical integrity and stability of CUR and provided the highest solubility of CUR in water. Physical and chemical characterizations of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex to an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternative and efficient method for drug:CD complexation.

  12. Theoretical aspects of cellular decision-making and information-processing.

    Science.gov (United States)

    Kobayashi, Tetsuya J; Kamimura, Atsushi

    2012-01-01

    Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial, as they enable us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since a number of cellular behaviors can be regarded as processes to make specific actions in response to external stimuli, decision-making can cover, and has been used to explain, a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing, because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. The efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept that is crucial for the understanding of biological processes other than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing by focusing on the relation between these two concepts.

  13. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
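
    A minimal sketch of the first quantifier, the square root of the Jensen-Shannon divergence, assuming two already-normalized distributions (for instance degree histograms of two snapshots of an evolving network); the data here are illustrative, not from the paper.

      # sqrt(JSD) is a metric between probability distributions; JSD in bits.
      import numpy as np

      def shannon_entropy(p):
          p = p[p > 0]                      # 0 * log 0 is taken as 0
          return -np.sum(p * np.log2(p))

      def sqrt_jensen_shannon(p, q):
          m = 0.5 * (p + q)                 # mixture distribution
          jsd = shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
          return np.sqrt(jsd)

      # toy usage: degree distributions of two states of a network
      p = np.array([0.1, 0.4, 0.3, 0.2])
      q = np.array([0.6, 0.2, 0.1, 0.1])
      print(sqrt_jensen_shannon(p, q))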

  14. Information and Self-Organization A Macroscopic Approach to Complex Systems

    CERN Document Server

    Haken, Hermann

    2006-01-01

    This book presents the concepts needed to deal with self-organizing complex systems from a unifying point of view that uses macroscopic data. The various meanings of the concept "information" are discussed and a general formulation of the maximum information (entropy) principle is used. With the aid of results from synergetics, adequate objective constraints for a large class of self-organizing systems are formulated and examples are given from physics, life and computer science. The relationship to chaos theory is examined and it is further shown that, based on possibly scarce and noisy data, unbiased guesses about processes of complex systems can be made and the underlying deterministic and random forces determined. This allows for probabilistic predictions of processes, with applications to numerous fields in science, technology, medicine and economics. The extensions of the third edition are essentially devoted to an introduction to the meaning of information in the quantum context. Indeed, quantum inform...
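
    The maximum information (entropy) principle invoked above has the standard textbook form: among all distributions consistent with the known macroscopic constraints, choose the one that maximizes the entropy,

      \max_p \; S = -\sum_i p_i \ln p_i \quad \text{s.t.} \quad \sum_i p_i = 1, \;\; \sum_i p_i f_k(i) = F_k,

    whose solution is p_i = Z^{-1}\exp\big(-\sum_k \lambda_k f_k(i)\big) with Z = \sum_i \exp\big(-\sum_k \lambda_k f_k(i)\big); the Lagrange multipliers λ_k are fixed by the macroscopic data F_k.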

  15. Integrating complex business processes for knowledge-driven clinical decision support systems.

    Science.gov (United States)

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  16. Visual Navigation of Complex Information Spaces

    Directory of Open Access Journals (Sweden)

    Sarah North

    1995-11-01

    Full Text Available The authors lay the foundation for the introduction of visual navigation aid to assist computer users in direct manipulation of the complex information spaces. By exploring present research on scientific data visualisation and creating a case for improved information visualisation tools, they introduce the design of an improved information visualisation interface utilizing dynamic slider, called Visual-X, incorporating icons with bindable attributes (glyphs. Exploring the improvement that these data visualisations, make to a computing environment, the authors conduct an experiment to compare the performance of subjects who use traditional interfaces and Visual-X. Methodology is presented and conclusions reveal that the use of Visual-X appears to be a promising approach in providing users with a navigation tool that does not overload their cognitive processes.

  17. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Nino/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  18. Teaching Information Systems Development via Process Variants

    Science.gov (United States)

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  19. Information processing for aerospace structural health monitoring

    Science.gov (United States)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle cost of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information which indicates both the diagnostics of the current structural integrity as well as the prognostics necessary for planning and managing the future health of the structure in a cost effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
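
    As a minimal sketch of one technique the paper lists (Fourier-based spectral feature extraction from a vibration channel); the sampling rate, band edges, and signal below are illustrative, not taken from the paper.

      # Normalized spectral energy per frequency band: a compact feature
      # vector whose drift between inspections can flag structural change.
      import numpy as np

      def band_energies(signal, fs, bands):
          spectrum = np.abs(np.fft.rfft(signal)) ** 2
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
          total = spectrum.sum()
          return [spectrum[(freqs >= lo) & (freqs < hi)].sum() / total
                  for lo, hi in bands]

      # toy usage: a 100 Hz structural mode buried in noise, sampled at 1 kHz
      fs = 1000.0
      t = np.arange(0, 1.0, 1.0 / fs)
      accel = np.sin(2 * np.pi * 100 * t) + 0.3 * np.random.randn(len(t))
      print(band_energies(accel, fs, bands=[(0, 50), (50, 150), (150, 500)]))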

  20. The informed consent process in randomised controlled trials: a nurse-led process.

    Science.gov (United States)

    Cresswell, Pip; Gilmour, Jean

    2014-03-01

    Clinical trials are carried out with human participants to answer questions about the best way to diagnose, treat and prevent illness. Participants must give informed consent to take part in clinical trials, which requires an understanding of how clinical trials work and of their purpose. Randomised controlled trials provide strong evidence, but their complex design is difficult for both clinicians and participants to understand. Increasingly, ensuring informed consent in randomised controlled trials has become part of the clinical research nurse role. The aim of this study was to explore in depth the clinical research nurse role in the informed consent process using a qualitative descriptive approach. Three clinical research nurses were interviewed and the data analysed using a thematic analysis approach. Three themes were identified to describe the process of ensuring informed consent. The first theme, Preparatory partnerships, canvassed the relationships required prior to initiation of the informed consent process. The second theme, Partnering the participant, emphasises the need for ensuring voluntariness and understanding, along with patient advocacy. The third theme, Partnership with the project, highlights the clinical research nurse contribution to the capacity of the trial to answer the research question through appropriate recruiting and follow-up of participants. Gaining informed consent in randomised controlled trials was complex and required multiple partnerships. A wide variety of skills was used to protect the safety of trial participants and promote quality research. The information from this study contributes to a greater understanding of the clinical research nurse role, and suggests the informed consent process in trials can be a nurse-led one. In order to gain collegial, employer and industry recognition it is important that this aspect of the nursing role is acknowledged.

  1. Consciousness: a unique way of processing information.

    Science.gov (United States)

    Marchetti, Giorgio

    2018-02-08

    In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three basic cognitive processes contribute to realizing this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the "single voice" of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever attention focuses on appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experience, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention.

  2. Bridging domains: a comparison between information processing in Archaea and Eukarya

    NARCIS (Netherlands)

    Koning, de B.

    2015-01-01

    Bridging Domains

    A Comparison between Information Processing in Archaea and Eukarya

    Studying Information Processing

    Living cells evolved complex systems to handle the flow of information both

  3. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations so as to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development, with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as a basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
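
    To make the place/transition vocabulary concrete, here is a minimal, hypothetical Petri-net sketch in Python (the place names are invented; XML nets as described in the chapter additionally carry structured XML tokens, which this toy model omits):

        # Minimal place/transition net: a transition fires when every input
        # place holds a token; firing consumes and produces tokens.
        class PetriNet:
            def __init__(self, marking):
                self.marking = dict(marking)          # place -> token count

            def enabled(self, inputs):
                return all(self.marking.get(p, 0) > 0 for p in inputs)

            def fire(self, inputs, outputs):
                if not self.enabled(inputs):
                    raise RuntimeError("transition not enabled")
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1

        net = PetriNet({"order_received": 1, "clerk_free": 1})
        net.fire(inputs=["order_received", "clerk_free"],
                 outputs=["order_in_processing"])
        print(net.marking)   # tokens moved into 'order_in_processing'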

  4. Mathematics Education as a Proving-Ground for Information-Processing Theories.

    Science.gov (United States)

    Greer, Brian, Ed.; Verschaffel, Lieven, Ed.

    1990-01-01

    Five papers discuss the current and potential contributions of information-processing theory to our understanding of mathematical thinking as those contributions affect the practice of mathematics education. It is concluded that information-processing theories need to be supplemented in various ways to more adequately reflect the complexity of…

  5. Capturing connectivity and causality in complex industrial processes

    CERN Document Server

    Yang, Fan; Shah, Sirish L; Chen, Tongwen

    2014-01-01

    This brief reviews concepts of inter-relationship in modern industrial processes, biological and social systems. Specifically, ideas of connectivity and causality within and between elements of a complex system are treated; these ideas are of great importance in analysing and influencing mechanisms, structural properties and their dynamic behaviour, especially for fault diagnosis and hazard analysis. Fault detection and isolation for industrial processes being concerned with root causes and fault propagation, the brief shows that process connectivity and causality information can be captured in two ways: (i) from process knowledge, where structural models based on first principles can be merged with adjacency/reachability matrices or topology models obtained from process flow-sheets described in standard formats; and (ii) from process data, via cross-correlation analysis, Granger causality and its extensions, frequency-domain methods, information-theoretical methods, and Bayesian networks.
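
    As a hedged illustration of the data-driven route, the sketch below tests whether lagged values of one signal improve prediction of another, which is the core idea of a Granger-style causality test; the variance ratio used here is a simplification of the usual F-test.

        import numpy as np

        def residual_var(target, regressors):
            """Variance of least-squares residuals of target on regressors."""
            X = np.column_stack([np.ones(len(target))] + regressors)
            beta, *_ = np.linalg.lstsq(X, target, rcond=None)
            return np.var(target - X @ beta)

        def granger_ratio(x, y, lags=2):
            """< 1 when the past of x improves prediction of y beyond y's own past."""
            t = np.arange(lags, len(y))
            own = [y[t - k] for k in range(1, lags + 1)]
            both = own + [x[t - k] for k in range(1, lags + 1)]
            return residual_var(y[t], both) / residual_var(y[t], own)

        # Hypothetical fault propagation: y follows x with a one-step delay.
        rng = np.random.default_rng(0)
        x = rng.standard_normal(500)
        y = np.roll(x, 1) + 0.1 * rng.standard_normal(500)
        print(granger_ratio(x, y))   # well below 1: x "Granger-causes" y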

  6. STAR-GENERIS - a software package for information processing

    International Nuclear Information System (INIS)

    Felkel, L.

    1985-01-01

    Man-machine communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants are, e.g., alarm reduction, disturbance analysis and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering efforts. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates the specification and implementation of the engineering know-how necessary for sophisticated operator aids.

  7. A new information dimension of complex networks

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Daijun [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); School of Science, Hubei University for Nationalities, Enshi 445000 (China); Wei, Bo [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Hu, Yong [Institute of Business Intelligence and Knowledge Discovery, Guangdong University of Foreign Studies, Guangzhou 510006 (China); Zhang, Haixin [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Deng, Yong, E-mail: ydeng@swu.edu.cn [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); School of Engineering, Vanderbilt University, TN 37235 (United States)

    2014-03-01

    Highlights: • The proposed measure is more practical than the classical information dimension. • The difference in information between boxes in the box-covering algorithm is considered. • Results indicate the measure can capture the fractal property of complex networks. -- Abstract: Fractal and self-similarity properties are revealed in many complex networks. The classical information dimension is an important method for studying the fractal and self-similarity properties of planar networks. However, it is not practical for real complex networks. In this Letter, a new information dimension of complex networks is proposed. The number of nodes in each box is considered by using the box-covering algorithm of complex networks. The proposed method is applied to calculate the fractal dimensions of some real networks. Our results show that the proposed method is efficient when dealing with the fractal dimension problem of complex networks.
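
    A rough sketch of the quantity involved (not the authors' exact algorithm): cover the network with boxes of growing radius, take the Shannon entropy of the node distribution over boxes, and read an information dimension off the slope of that entropy against the logarithm of the inverse box size. The greedy covering below is a common simplification.

        import math
        import networkx as nx

        def box_entropy(G, radius):
            """Greedy ball covering; Shannon entropy of node counts per box."""
            uncovered = set(G)
            sizes = []
            while uncovered:
                center = next(iter(uncovered))
                dists = nx.single_source_shortest_path_length(G, center, cutoff=radius)
                ball = {n for n in dists if n in uncovered}
                sizes.append(len(ball))
                uncovered -= ball
            N = G.number_of_nodes()
            return -sum(s / N * math.log(s / N) for s in sizes)

        # The information dimension is (roughly) the slope of this entropy
        # against log(1/box size); here we just print the raw curve.
        G = nx.barabasi_albert_graph(500, 3, seed=1)
        for r in (1, 2, 3):
            print(r, box_entropy(G, r))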

  8. A new information dimension of complex networks

    International Nuclear Information System (INIS)

    Wei, Daijun; Wei, Bo; Hu, Yong; Zhang, Haixin; Deng, Yong

    2014-01-01

    Highlights: • The proposed measure is more practical than the classical information dimension. • The difference in information between boxes in the box-covering algorithm is considered. • Results indicate the measure can capture the fractal property of complex networks. -- Abstract: Fractal and self-similarity properties are revealed in many complex networks. The classical information dimension is an important method for studying the fractal and self-similarity properties of planar networks. However, it is not practical for real complex networks. In this Letter, a new information dimension of complex networks is proposed. The number of nodes in each box is considered by using the box-covering algorithm of complex networks. The proposed method is applied to calculate the fractal dimensions of some real networks. Our results show that the proposed method is efficient when dealing with the fractal dimension problem of complex networks.

  9. The Process of Handling an Excess of Complex and Interdisciplinary Information in a Decision Support Research Situation

    Directory of Open Access Journals (Sweden)

    Fredrik Moltu Johnsen

    2017-06-01

    Researchers are sometimes expected to investigate a complex and interdisciplinary subject-matter in order to provide scientific support for large-scale decisions. This may prove challenging: typically, a lack of cohesion between the pieces of information investigated in the starting phase may cause confusion. This article suggests one possible road out of this problem, which may lead to holistic understanding and then to the communication and implementation of this understanding. The process is presented as a diagram, and selected aspects of it are analysed. The process involves moving to a higher level of generalisation in order to gain a better overview and potentially invent new concepts, and then moving back to a more detailed level in order to communicate and implement these insights. Potential challenges and roadblocks are identified. The possible conflict between normal science and decision support is briefly investigated; it is pointed out that “post-normal science” may be a more appropriate description of such processes than simply “science”.

  10. Advanced monitoring with complex stream processing

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution which is used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP), which analyses series of events in order to derive conclusions from them.
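
    Esper rules are written in its own Event Processing Language rather than in a general-purpose language; purely to illustrate the idea of a time-window CEP rule, a toy Python equivalent (hypothetical thresholds and event format, not Esper's API) might look like this:

        from collections import deque

        class ErrorRateRule:
            """Fire when more than `limit` ERROR events arrive within `window`
            seconds -- a toy version of a CEP time-window pattern."""
            def __init__(self, window=60.0, limit=2):
                self.window, self.limit = window, limit
                self.events = deque()

            def on_event(self, timestamp, level):
                if level == "ERROR":
                    self.events.append(timestamp)
                while self.events and timestamp - self.events[0] > self.window:
                    self.events.popleft()
                return len(self.events) > self.limit

        rule = ErrorRateRule()
        for t in (0, 10, 20, 25):
            if rule.on_event(t, "ERROR"):
                print("alert at", t)   # fires at t=20 and t=25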

  11. Information processing of earth resources data

    Science.gov (United States)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  12. Is it health information technology? : Task complexity and work substitution

    NARCIS (Netherlands)

    Medina Palomino, Hector; Rutkowski, Anne; Verhulst, Matthijs

    2015-01-01

    New technology is making it possible to replace professions built on complex knowledge, e.g. medicine. In our exploratory research we examined how Information Technologies might be replacing some of the tasks formerly processed by physician anesthesiologists (MDAs). Data (N=1178) were collected at a

  13. Advances in intelligent process-aware information systems concepts, methods, and technologies

    CERN Document Server

    Oberhauser, Roy; Reichert, Manfred

    2017-01-01

    This book provides a state-of-the-art perspective on intelligent process-aware information systems and presents chapters on specific facets and approaches applicable to such systems. Further, it highlights novel advances and developments in various aspects of intelligent process-aware information systems and business process management systems. Intelligence capabilities are increasingly being integrated into or created in many of today’s software products and services. Process-aware information systems provide critical computing infrastructure to support the various processes involved in the creation and delivery of business products and services. Yet the integration of intelligence capabilities into process-aware information systems is a non-trivial yet necessary evolution of these complex systems. The book’s individual chapters address adaptive process management, case management processes, autonomically-capable processes, process-oriented information logistics, process recommendations, reasoning over ...

  14. Gathering Information from Transport Systems for Processing in Supply Chains

    Science.gov (United States)

    Kodym, Oldřich; Unucka, Jakub

    2016-12-01

    The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and the use of the information in the information systems of logistics firms to support selected processes at the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed. Compliance with existing standards is mentioned. Security of information over the full life cycle is an integral part of the presented system. The design of a fully equipped system based on synthesized functional nodes is presented.

  15. Towards an Information Theory of Complex Networks

    CERN Document Server

    Dehmer, Matthias; Mehler, Alexander

    2011-01-01

    For over a decade, complex networks have steadily grown as an important tool across a broad array of academic disciplines, with applications ranging from physics to social media. A tightly organized collection of carefully-selected papers on the subject, Towards an Information Theory of Complex Networks: Statistical Methods and Applications presents theoretical and practical results about information-theoretic and statistical models of complex networks in the natural sciences and humanities. The book's major goal is to advocate and promote a combination of graph-theoretic, information-theoreti

  16. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

    There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicates the interpretation and understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both of the techniques mentioned. However, the ANOVA model was difficult to interpret, as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug release.

  17. Shifts in information processing level: the speed theory of intelligence revisited.

    Science.gov (United States)

    Sircar, S S

    2000-06-01

    A hypothesis is proposed here to reconcile the inconsistencies observed in the IQ-P3 latency relation. The hypothesis stems from the observation that task-induced increase in P3 latency correlates positively with IQ scores. It is hypothesised that: (a) there are several parallel information processing pathways of varying complexity which are associated with the generation of P3 waves of varying latencies; (b) with increasing workload, there is a shift in the 'information processing level' through progressive recruitment of more complex polysynaptic pathways with greater processing power and inhibition of the oligosynaptic pathways; (c) high-IQ subjects have a greater reserve of higher level processing pathways; (d) a given 'task-load' imposes a greater 'mental workload' in subjects with lower IQ than in those with higher IQ. According to this hypothesis, a meaningful comparison of the P3 correlates of IQ is possible only when the information processing level is pushed to its limits.

  18. Information driven self-organization of complex robotic behaviors.

    Directory of Open Access Journals (Sweden)

    Georg Martius

    Information theory is a powerful tool for expressing principles that drive autonomous systems, because it is domain invariant and allows for an intuitive interpretation. This paper studies the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process as a driving force to generate behavior. We study nonlinear and nonstationary systems and introduce the time-local predictive information (TiPI), which allows us to derive exact results together with explicit update rules for the parameters of the controller in the dynamical systems framework. In this way the information principle, formulated at the level of behavior, is translated to the dynamics of the synapses. We underpin our results with a number of case studies with high-dimensional robotic systems. We show spontaneous cooperativity in a complex physical system with decentralized control. Moreover, a jointly controlled humanoid robot develops a high behavioral variety depending on its physics and the environment it is dynamically embedded into. The behavior can be decomposed into a succession of low-dimensional modes that increasingly explore the behavior space. This is a promising way to avoid the curse of dimensionality, which hinders learning systems from scaling well.
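
    As a hedged sketch of the central quantity, the predictive information of a process is the mutual information between its past and future. For a discretized sensor stream it can be estimated with a plug-in estimator like the one below (window length and binning are arbitrary illustrative choices, not the paper's TiPI construction):

        from collections import Counter
        from math import log2

        def predictive_information(symbols, k=1):
            """Plug-in estimate of I(past_k; future_k) for a discrete sequence."""
            pasts, futures, joints = Counter(), Counter(), Counter()
            n = 0
            for i in range(k, len(symbols) - k + 1):
                past, future = tuple(symbols[i - k:i]), tuple(symbols[i:i + k])
                pasts[past] += 1
                futures[future] += 1
                joints[(past, future)] += 1
                n += 1
            return sum(c / n * log2((c / n) / ((pasts[p] / n) * (futures[f] / n)))
                       for (p, f), c in joints.items())

        # A period-2 stream carries exactly one bit of predictive information.
        print(predictive_information([0, 1] * 100, k=1))   # ~1.0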

  19. Recording information on protein complexes in an information management system.

    Science.gov (United States)

    Savitsky, Marc; Diprose, Jonathan M; Morris, Chris; Griffiths, Susanne L; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S; Blake, Richard; Stuart, David I; Esnouf, Robert M

    2011-08-01

    The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein-protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described.

  20. Information geometric methods for complexity

    Science.gov (United States)

    Felice, Domenico; Cafaro, Carlo; Mancini, Stefano

    2018-03-01

    Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.
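
    For orientation (standard definitions, not specific to this review), the IG machinery mentioned here rests on the Fisher metric of a parameter manifold and its second-order relation to the Kullback-Leibler divergence:

        g_{ij}(\theta) = \mathbb{E}_{p_\theta}\left[ \partial_i \log p_\theta(x)\, \partial_j \log p_\theta(x) \right],
        \qquad
        D_{\mathrm{KL}}\left(p_\theta \,\|\, p_{\theta+d\theta}\right) \approx \tfrac{1}{2}\, g_{ij}(\theta)\, d\theta^i\, d\theta^j .

    Scalar curvature and geodesics computed from this metric are the objects used to diagnose phase transitions and to define the complexity measures surveyed above.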

  1. Crosstalk between endophytes and a plant host within information-processing networks

    Directory of Open Access Journals (Sweden)

    Kozyrovska N. O.

    2013-05-01

    Plants are heavily populated by pro- and eukaryotic microorganisms and therefore represent tremendous complexity as a biological system. This system exists as an information-processing entity with rather complex processes of communication occurring throughout the individual plant. The plant cellular information-processing network constitutes the foundation for processes like growth, defense, and adaptation to the environment. To date, the molecular mechanisms underlying perception, transfer, analysis, and storage of endogenous and environmental information within the plant remain to be fully understood. The associated microorganisms and their investment in information conditioning are often ignored. Endophytes as plant partners are an indispensable, integrative part of the plant system. Diverse endophytic microorganisms comprise the «normal» microbiota that plays a role in plant immunity and helps the plant system to survive in the environment (providing assistance in defense, nutrition, detoxification, etc.). The role of the endophytic microbiota in the processing of information may be presumed, taking into account plant-microbial co-evolution and empirical data. Since literature is beginning to emerge on this topic, in this article I review key works in the field of plant-endophyte interactions in the context of information processing and present an opinion on their putative role in the plant information web under defense and adaptation to changed conditions.

  2. An analytical approach to customer requirement information processing

    Science.gov (United States)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although the two main CRs analysis methods, quality function deployment (QFD) and the Kano model, have been applied in many fields by many enterprises over the past several decades, limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or the Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  3. Maximizing information exchange between complex networks

    International Nuclear Information System (INIS)

    West, Bruce J.; Geneston, Elvis L.; Grigolini, Paolo

    2008-01-01

    research overarching all of the traditional scientific disciplines. The transportation networks of planes, highways and railroads; the economic networks of global finance and stock markets; the social networks of terrorism, governments, businesses and churches; the physical networks of telephones, the Internet, earthquakes and global warming and the biological networks of gene regulation, the human body, clusters of neurons and food webs, share a number of apparently universal properties as the networks become increasingly complex. Ubiquitous aspects of such complex networks are the appearance of non-stationary and non-ergodic statistical processes and inverse power-law statistical distributions. Herein we review the traditional dynamical and phase-space methods for modeling such networks as their complexity increases and focus on the limitations of these procedures in explaining complex networks. Of course we will not be able to review the entire nascent field of network science, so we limit ourselves to a review of how certain complexity barriers have been surmounted using newly applied theoretical concepts such as aging, renewal, non-ergodic statistics and the fractional calculus. One emphasis of this review is information transport between complex networks, which requires a fundamental change in perception that we express as a transition from the familiar stochastic resonance to the new concept of complexity matching

  4. Maximizing information exchange between complex networks

    Science.gov (United States)

    West, Bruce J.; Geneston, Elvis L.; Grigolini, Paolo

    2008-10-01

    modern research overarching all of the traditional scientific disciplines. The transportation networks of planes, highways and railroads; the economic networks of global finance and stock markets; the social networks of terrorism, governments, businesses and churches; the physical networks of telephones, the Internet, earthquakes and global warming and the biological networks of gene regulation, the human body, clusters of neurons and food webs, share a number of apparently universal properties as the networks become increasingly complex. Ubiquitous aspects of such complex networks are the appearance of non-stationary and non-ergodic statistical processes and inverse power-law statistical distributions. Herein we review the traditional dynamical and phase-space methods for modeling such networks as their complexity increases and focus on the limitations of these procedures in explaining complex networks. Of course we will not be able to review the entire nascent field of network science, so we limit ourselves to a review of how certain complexity barriers have been surmounted using newly applied theoretical concepts such as aging, renewal, non-ergodic statistics and the fractional calculus. One emphasis of this review is information transport between complex networks, which requires a fundamental change in perception that we express as a transition from the familiar stochastic resonance to the new concept of complexity matching.

  5. Maximizing information exchange between complex networks

    Energy Technology Data Exchange (ETDEWEB)

    West, Bruce J. [Mathematical and Information Science, Army Research Office, Research Triangle Park, NC 27708 (United States); Physics Department, Duke University, Durham, NC 27709 (United States)], E-mail: bwest@nc.rr.com; Geneston, Elvis L. [Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, TX 76203-1427 (United States); Physics Department, La Sierra University, 4500 Riverwalk Parkway, Riverside, CA 92515 (United States); Grigolini, Paolo [Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, TX 76203-1427 (United States); Istituto di Processi Chimico Fisici del CNR, Area della Ricerca di Pisa, Via G. Moruzzi, 56124, Pisa (Italy); Dipartimento di Fisica ' E. Fermi' Universita' di Pisa, Largo Pontecorvo 3, 56127 Pisa (Italy)

    2008-10-15

    modern research overarching all of the traditional scientific disciplines. The transportation networks of planes, highways and railroads; the economic networks of global finance and stock markets; the social networks of terrorism, governments, businesses and churches; the physical networks of telephones, the Internet, earthquakes and global warming and the biological networks of gene regulation, the human body, clusters of neurons and food webs, share a number of apparently universal properties as the networks become increasingly complex. Ubiquitous aspects of such complex networks are the appearance of non-stationary and non-ergodic statistical processes and inverse power-law statistical distributions. Herein we review the traditional dynamical and phase-space methods for modeling such networks as their complexity increases and focus on the limitations of these procedures in explaining complex networks. Of course we will not be able to review the entire nascent field of network science, so we limit ourselves to a review of how certain complexity barriers have been surmounted using newly applied theoretical concepts such as aging, renewal, non-ergodic statistics and the fractional calculus. One emphasis of this review is information transport between complex networks, which requires a fundamental change in perception that we express as a transition from the familiar stochastic resonance to the new concept of complexity matching.

  6. The value of mechanistic biophysical information for systems-level understanding of complex biological processes such as cytokinesis.

    Science.gov (United States)

    Pollard, Thomas D

    2014-12-02

    This review illustrates the value of quantitative information including concentrations, kinetic constants and equilibrium constants in modeling and simulating complex biological processes. Although much has been learned about some biological systems without these parameter values, they greatly strengthen mechanistic accounts of dynamical systems. The analysis of muscle contraction is a classic example of the value of combining an inventory of the molecules, atomic structures of the molecules, kinetic constants for the reactions, reconstitutions with purified proteins and theoretical modeling to account for the contraction of whole muscles. A similar strategy is now being used to understand the mechanism of cytokinesis using fission yeast as a favorable model system.

  7. The architecture of the management system of complex steganographic information

    Science.gov (United States)

    Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.

    2017-01-01

    The aim of the study is to create a wide-area information system that allows one to control the processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the algorithmic support of the system, classic methods of steganography are used to embed information, while methods of mathematical statistics and computational intelligence are used to identify embedded information. The main result of the paper is the development of the architecture of the management system of complex steganographic information. The suggested architecture utilizes cloud technology in order to provide service via a web-service over the Internet. It is meant to process streams of multimedia data coming from many sources of different types. The information system, built in accordance with the proposed architecture, will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; and prevention of information leakage caused by insiders.
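
    As an illustration of the 'classic methods' family referred to above (least-significant-bit embedding; a textbook technique, not necessarily the system's actual algorithm):

        def embed_lsb(cover, bits):
            """Hide a bit string in the least significant bits of a byte sequence."""
            out = bytearray(cover)
            for i, bit in enumerate(bits):
                out[i] = (out[i] & 0xFE) | bit
            return bytes(out)

        def extract_lsb(stego, n):
            """Recover the first n hidden bits."""
            return [stego[i] & 1 for i in range(n)]

        cover = bytes(range(8))            # stand-in for raw image bytes
        message = [1, 0, 1, 1]
        stego = embed_lsb(cover, message)
        assert extract_lsb(stego, len(message)) == message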

  8. Photonic Quantum Information Processing

    International Nuclear Information System (INIS)

    Walther, P.

    2012-01-01

    The advantage of the photon's mobility makes optical quantum systems ideally suited for delegated quantum computation. I will present results on the realization of a measurement-based quantum network in a client-server environment, where quantum information is securely communicated and computed. Related to measurement-based quantum computing, I will discuss a recent experiment showing that quantum discord can be used as a resource for remote state preparation, which might shine new light on the requirements for quantum-enhanced information processing. Finally, I will briefly review recent photonic quantum simulation experiments on four spins with frustrated Heisenberg interactions and present an outlook on feasible simulation experiments with more complex interactions or random-walk structures. As an outlook, I will discuss the current status of new quantum technology for improving the scalability of photonic quantum systems by using superconducting single-photon detectors and tailored light-matter interactions.

  9. The effects of mild and severe traumatic brain injury on speed of information processing as measured by the computerized tests of information processing (CTIP).

    Science.gov (United States)

    Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra

    2007-01-01

    Although reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized tests of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] were designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information to be processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect) and as the severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, the findings suggest that the CTIP is an easy-to-administer and sensitive measure of information processing speed.

  10. Efficiency of cellular information processing

    International Nuclear Information System (INIS)

    Barato, Andre C; Hartich, David; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial.
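
    In symbols (using generic notation for this family of results, not necessarily the paper's), the bound says that the learning rate l of the internal process is paid for by thermodynamic entropy production \sigma, which motivates an informational efficiency \eta:

        l \le \sigma, \qquad \eta = \frac{l}{\sigma} \le 1 .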

  11. An information theory based approach for quantitative evaluation of man-machine interface complexity

    International Nuclear Information System (INIS)

    Kang, Hyun Gook

    1999-02-01

    In complex and high-risk work conditions, especially such as in nuclear power plants, human understanding of the plant is highly cognitive and thus largely dependent on the effectiveness of the man-machine interface system. In order to provide more effective and reliable operating conditions for future nuclear power plants, developing more credible and easy to use evaluation methods will afford great help in designing interface systems in a more efficient manner. In this study, in order to analyze the human-machine interactions, I propose the Human-processor Communication (HPC) model which is based on the information flow concept. It identifies the information flow around a human-processor. Information flow has two aspects: appearance and content. Based on the HPC model, I propose two kinds of measures for evaluating a user interface from the viewpoint of these two aspects of information flow. They measure the communicative complexity of each aspect. In this study, for the evaluation of the aspect of appearance, I propose three complexity measures: Operation Complexity, Transition Complexity, and Screen Complexity. Each one of these measures has its own physical meaning. Two experiments carried out in this work support the utility of these measures. The result of the quiz game experiment shows that as the complexity of task context increases, the usage of the interface system becomes more complex. The experimental results of the three example systems (digital view, LDP style view and hierarchy view) show the utility of the proposed complexity measures. In this study, for the evaluation of the aspect of content, I propose the degree of informational coincidence, R (K, P), as a measure for the usefulness of an alarm-processing system. It is designed to perform user-oriented evaluation based on the informational entropy concept. It will be especially useful in the early design phase because designers can estimate the usefulness of an alarm system by short calculations instead

  12. An information theory based approach for quantitative evaluation of man-machine interface complexity

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hyun Gook

    1999-02-15

    In complex and high-risk work conditions, especially such as in nuclear power plants, human understanding of the plant is highly cognitive and thus largely dependent on the effectiveness of the man-machine interface system. In order to provide more effective and reliable operating conditions for future nuclear power plants, developing more credible and easy to use evaluation methods will afford great help in designing interface systems in a more efficient manner. In this study, in order to analyze the human-machine interactions, I propose the Human-processor Communication (HPC) model which is based on the information flow concept. It identifies the information flow around a human-processor. Information flow has two aspects: appearance and content. Based on the HPC model, I propose two kinds of measures for evaluating a user interface from the viewpoint of these two aspects of information flow. They measure the communicative complexity of each aspect. In this study, for the evaluation of the aspect of appearance, I propose three complexity measures: Operation Complexity, Transition Complexity, and Screen Complexity. Each one of these measures has its own physical meaning. Two experiments carried out in this work support the utility of these measures. The result of the quiz game experiment shows that as the complexity of task context increases, the usage of the interface system becomes more complex. The experimental results of the three example systems (digital view, LDP style view and hierarchy view) show the utility of the proposed complexity measures. In this study, for the evaluation of the aspect of content, I propose the degree of informational coincidence, R (K, P), as a measure for the usefulness of an alarm-processing system. It is designed to perform user-oriented evaluation based on the informational entropy concept. It will be especially useful in the early design phase because designers can estimate the usefulness of an alarm system by short calculations instead

  13. Information Technology in Complex Health Services

    Science.gov (United States)

    Southon, Frank Charles Gray; Sauer, Chris; Dampney, Christopher Noel Grant (Kit)

    1997-01-01

    Objective: To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. Design: A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Measurements: Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Results: Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. Conclusion: The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case.

  14. Motor dysfunction of complex regional pain syndrome is related to impaired central processing of proprioceptive information.

    Science.gov (United States)

    Bank, Paulina J M; Peper, C Lieke E; Marinus, Johan; Beek, Peter J; van Hilten, Jacobus J

    2013-11-01

    Our understanding of proprioceptive deficits in complex regional pain syndrome (CRPS) and its potential contribution to impaired motor function is still limited. To gain more insight into these issues, we evaluated accuracy and precision of joint position sense over a range of flexion-extension angles of the wrist of the affected and unaffected sides in 25 chronic CRPS patients and in 50 healthy controls. The results revealed proprioceptive impairment at both the patients' affected and unaffected sides, characterized predominantly by overestimation of wrist extension angles. Precision of the position estimates was more prominently reduced at the affected side. Importantly, group differences in proprioceptive performance were observed not only for tests at identical percentages of each individual's range of wrist motion but also when controls were tested at wrist angles that corresponded to those of the patient's affected side. More severe motor impairment of the affected side was associated with poorer proprioceptive performance. Based on additional sensory tests, variations in proprioceptive performance over the range of wrist angles, and comparisons between active and passive displacements, the disturbances of proprioceptive performance most likely resulted from altered processing of afferent (and not efferent) information and its subsequent interpretation in the context of a distorted "body schema." The present results point at a significant role for impaired central processing of proprioceptive information in the motor dysfunction of CRPS and suggest that therapeutic strategies aimed at identification of proprioceptive impairments and their restoration may promote the recovery of motor function in CRPS patients.

  15. The eyes have it: Using eye tracking to inform information processing strategies in multi-attributes choices.

    Science.gov (United States)

    Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke

    2018-04-01

    Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors (whether attributes are defined as "best" or "worst", choice task complexity, and attribute ordering) also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed.

  16. Robust and parliamentary or informal and participative? The pitfalls of decision-making processes in complex procedures; Robust-parlamentarisch oder informell-partizipativ? Die Tuecken der Entscheidungsfindung in komplexen Verfahren

    Energy Technology Data Exchange (ETDEWEB)

    Hocke, Peter [Karlsruher Institut fuer Technologie (KIT), Karlsruhe (Germany). Inst. fuer Technikfolgenabschaetzung und Systemanalyse (ITAS); Smeddinck, Ulrich [Technische Univ. Braunschweig (Germany). Inst. fuer Rechtswissenschaften

    2017-09-01

    The authors discuss the question of whether the site selection decision for a final nuclear waste repository should be a representative parliamentary process or an informal, pragmatic process based on public participation. Within the framework of the German site selection law, possibilities for innovative participation procedures were developed. The pitfalls of decision-making processes in complex procedures are discussed.

  17. Simple, complex and hyper-complex understanding - enhanced sensitivity in observation of information

    DEFF Research Database (Denmark)

    Bering Keiding, Tina

    for construction and analysis of empirical information. A quick overview of empirical research drawing on Luhmann reveals a diverse complex of analytical strategies and empirical methods. Despite differences between strategies and methods, they have in common that understanding of uttered information is crucial in their production of empirically founded knowledge. However, research generally seems to pay more attention to the production of uttered information than to the selection of understanding. The aim of this contribution is to sketch out a suggestion for how selection of understanding can be systematized in order to produce enhanced transparency in the selection of understanding as well as enhanced sensitivity and definition in depth. The contribution suggests that we distinguish between three types of understanding: simple, complex and hyper-complex understanding. Simple understanding is the simultaneous selection of understanding

  18. Choice Complexity, Benchmarks and Costly Information

    NARCIS (Netherlands)

    Harms, Job; Rosenkranz, S.; Sanders, M.W.J.L.

    In this study we investigate how two types of information interventions, providing a benchmark and providing costly information on option ranking, can improve decision-making in complex choices. In our experiment subjects made a series of incentivized choices between four hypothetical financial

  19. Predicting protein complexes using a supervised learning method combined with local structural information.

    Science.gov (United States)

    Dong, Yadong; Sun, Yongqi; Qin, Chao

    2018-01-01

    The existing protein complex detection methods can be broadly divided into two categories: unsupervised and supervised learning methods. Most of the unsupervised learning methods assume that protein complexes are in dense regions of protein-protein interaction (PPI) networks even though many true complexes are not dense subgraphs. Supervised learning methods utilize the informative properties of known complexes; they often extract features from existing complexes and then use the features to train a classification model. The trained model is used to guide the search process for new complexes. However, insufficient extracted features, noise in the PPI data and the incompleteness of complex data make the classification model imprecise. Consequently, the classification model is not sufficient for guiding the detection of complexes. Therefore, we propose a new robust score function that combines the classification model with local structural information. Based on the score function, we provide a search method that works both forwards and backwards. The results from experiments on six benchmark PPI datasets and three protein complex datasets show that our approach can achieve better performance compared with the state-of-the-art supervised, semi-supervised and unsupervised methods for protein complex detection, occasionally significantly outperforming such methods.
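
    The idea of blending a trained model with local structure, and of searching forwards and backwards, can be sketched as follows (hypothetical weighting and a stand-in classifier; not the authors' actual score function):

        import networkx as nx

        def density(G, nodes):
            """Edge density of the subgraph induced by `nodes`."""
            k = len(nodes)
            return 0.0 if k < 2 else 2 * G.subgraph(nodes).number_of_edges() / (k * (k - 1))

        def score(G, nodes, clf_prob, w=0.5):
            """Blend a trained model's probability with local structure."""
            return w * clf_prob(nodes) + (1 - w) * density(G, nodes)

        def greedy_search(G, seed, clf_prob):
            """Grow (forwards) then prune (backwards) a candidate complex."""
            cand, improved = set(seed), True
            while improved:
                improved = False
                for v in {n for u in cand for n in G[u]} - cand:        # forwards
                    if score(G, cand | {v}, clf_prob) > score(G, cand, clf_prob):
                        cand.add(v)
                        improved = True
                for v in list(cand):                                    # backwards
                    if len(cand) > 2 and score(G, cand - {v}, clf_prob) > score(G, cand, clf_prob):
                        cand.discard(v)
                        improved = True
            return cand

        toy_prob = lambda nodes: min(1.0, len(nodes) / 5)   # stand-in classifier
        print(sorted(greedy_search(nx.karate_club_graph(), {0, 1}, toy_prob)))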

  20. Informational and Causal Architecture of Discrete-Time Renewal Processes

    Directory of Open Access Journals (Sweden)

    Sarah E. Marzen

    2015-07-01

    Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use the resulting formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model, but with an infinite-state ϵ-machine presentation. All in all, the results lay the groundwork for analyzing more complex processes with infinite statistical complexity and infinite excess entropy.
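
    For orientation, the quantities named above have compact standard definitions in computational mechanics (stated here generically, not quoted from the article): the statistical complexity is the Shannon entropy of the causal-state distribution, and the excess entropy is the past-future mutual information,

        C_\mu = H[\mathcal{S}] = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma) \log_2 \Pr(\sigma),
        \qquad
        \mathbf{E} = I[\overleftarrow{X}; \overrightarrow{X}] .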

  1. The Concept of Information Sharing Behaviors in Complex Organizations: Research in Latvian Enterprises

    Directory of Open Access Journals (Sweden)

    Andrejs Cekuls

    2016-12-01

    The purpose of this paper is to explore the factors influencing information sharing behaviors in complex organizations. Evaluation of previous studies on the information turnover process and the role of organizational culture in competitive intelligence in the Latvian business environment indicated that employees of Latvian enterprises lack incentives to share information. The tasks of the study were to review scientific sources and to study the aspects influencing information-sharing habits in complex organizations. For this particular study, the focus group was selected as the most appropriate data collection method for high-quality research. To find out individuals' opinions and attitudes, two focus group discussions were carried out. Members from various industries and with different employment periods were included in the discussion groups. In aggregate, the opinions of employees from 41 different companies were summarized regarding the aspects affecting the process of information sharing in organizations. Results show that the factors that influence the sharing of information are closely related to values: interpersonal trust, organizational trust, organizational identification, support, fairness, etc. The results of the discussions showed that it is important for a manager to be aware of the factors affecting the performance of the organization. To identify the need for changes, a manager should follow events in the environment and analyze the extent to which they affect the performance of the organization. Complexity science suggests that readiness for change emerges when the system is far from equilibrium, because the resulting tension drives the acceptance of change.

  2. Quantum information processing in nanostructures

    International Nuclear Information System (INIS)

    Reina Estupinan, John-Henry

    2002-01-01

    Since information has been regarded as a physical entity, the field of quantum information theory has blossomed. This brings novel applications, such as quantum computation. The field has attracted the attention of numerous researchers with backgrounds ranging from computer science, mathematics and engineering to the physical sciences. Thus, we now have an interdisciplinary field where great efforts are being made to build devices that should allow for the processing of information at a quantum level, and also to understand the complex structure of some physical processes at a more basic level. This thesis is devoted to the theoretical study of structures at the nanometer scale, 'nanostructures', through physical processes that mainly involve the solid state and quantum optics, in order to propose reliable schemes for the processing of quantum information. Initially, the main results of quantum information theory and quantum computation are briefly reviewed. Next, the state of the art of quantum dot technology is described. In so doing, the theoretical background and the practicalities required for this thesis are introduced. A discussion of the current quantum hardware used for quantum information processing is given, with particular emphasis on the solid-state proposals to date. A detailed prescription is given, using an optically-driven coupled quantum dot system, to reliably prepare and manipulate exciton maximally entangled Bell and Greenberger-Horne-Zeilinger (GHZ) states. Manipulation of the strength and duration of the selective light-pulses needed for producing these highly entangled states provides us with crucial elements for the processing of solid-state based quantum information. The all-optical generation of states of the so-called Bell basis for a system of two quantum dots (QDs) is exploited for performing the quantum teleportation of the excitonic state of a dot in an array of three coupled QDs. Theoretical predictions suggest

  3. THEORETICAL FRAMEWORK FOR INFORMATION AND EDUCATIONAL COMPLEX DEVELOPMENT OF AN ACADEMIC DISCIPLINE AT A HIGHER INSTITUTION

    Directory of Open Access Journals (Sweden)

    Evgeniia Nikolaevna Kikot

    2015-05-01

    The question of how to organize the contemporary educational process is becoming more important under the conditions of ICT (information and communication technologies) and e-education usage. This defines one of the most important methodological and research directions at a university: the creation of an informational-educational course unit complex as the foundation of the e-University resource. The foundation of informational-educational course unit complex creation is the concepts of openness, accessibility, clearness and personalisation, which allow building the system of requirements for the complex and its substantive content. The main functions of the informational-educational complex are identified: informational, educational, controlling and communicative. It is determined that the scientific justification of new structural elements of the informational-educational course unit complex must include the creation of e-workbooks and e-workshops in order to organize theoretical and practical e-conferences. The development of ICT in education that provides for e-education application assumes the establishment of distance learning technologies for educational programme implementation.

  4. Dynamics of analyst forecasts and emergence of complexity: Role of information disparity.

    Directory of Open Access Journals (Sweden)

    Chansoo Kim

    Full Text Available We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important for better understanding the financial market. Carrying out big-data analysis of the analyst forecast data from I/B/E/S for nearly thirty years, we find skew distributions as evidence for the emergence of complexity, and show how information asymmetry or disparity affects how financial analysts form their forecasts. Here regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority of analysts issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts, incorporating interactions between analysts. The model explains the empirical data on analyst forecasts well, providing an appealing instance of understanding social phenomena from the perspective of complex systems.
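
    As a concrete illustration of the copying dynamics invoked above, the following minimal Python sketch (names and parameter values are hypothetical, not taken from the paper) simulates a Simon/Yule-type process in which an analyst either issues an independent forecast or herds to an existing forecast cluster; cluster sizes develop the heavy, power-law-like tail described in the abstract.

        import random
        from collections import Counter

        def simon_yule(n_events, alpha=0.1, seed=0):
            """With probability alpha an analyst issues an independent forecast
            (a new cluster); otherwise she repeats the cluster of a uniformly
            chosen past forecast, i.e. preferential attachment by cluster size.
            Simon's classic result: cluster sizes develop a power-law tail
            with exponent ~ 1 + 1/(1 - alpha)."""
            rng = random.Random(seed)
            events = [0]        # each entry is the cluster id of one forecast
            n_clusters = 1
            for _ in range(n_events - 1):
                if rng.random() < alpha:
                    events.append(n_clusters)          # bold, independent forecast
                    n_clusters += 1
                else:
                    events.append(rng.choice(events))  # herding to a popular cluster
            return Counter(events)

        sizes = sorted(simon_yule(50_000).values(), reverse=True)
        print(sizes[:10])   # a few dominant clusters, many small ones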

  5. How Students Learn: Information Processing, Intellectual Development and Confrontation

    Science.gov (United States)

    Entwistle, Noel

    1975-01-01

    A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning that are not covered by the model are discussed in relation to it: students' intellectual development and the effects of individual…

  6. Triangle network motifs predict complexes by complementing high-error interactomes with structural information.

    Science.gov (United States)

    Andreopoulos, Bill; Winter, Christof; Labudde, Dirk; Schroeder, Michael

    2009-06-27

    Many high-throughput studies produce protein-protein interaction networks (PPINs) with many errors and missing information. Even for genome-wide approaches, there is often a low overlap between PPINs produced by different studies. Second-level neighbors separated by two protein-protein interactions (PPIs) were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second-level neighbors in PPINs, and complement these with structural domain-domain interactions (SDDIs) representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from the Munich Information Center for Protein Sequences (MIPS). PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than using second-level neighbors in PPINs without SDDIs. The biological interpretation for triangles is that a SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components, and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbors in PPINs. We estimate that relatively little structural information would be sufficient
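
    A minimal sketch of the triangle idea in Python with networkx (toy data and edge labels are hypothetical): two proteins joined by a path of two PPIs through a common partner form a predicted complex when a structural domain-domain interaction closes the triangle.

        from itertools import combinations
        import networkx as nx

        G = nx.Graph()
        G.add_edge("A", "B", kind="ppi")    # high-throughput interaction
        G.add_edge("A", "C", kind="ppi")
        G.add_edge("B", "C", kind="sddi")   # structural evidence closes the triangle
        G.add_edge("C", "D", kind="ppi")

        def ppi_sddi_ppi_triangles(g):
            """Yield (u, v, w): u and w are second-level neighbours linked by
            two PPIs through v, and the closing u-w edge is a structural DDI."""
            for v in g:
                for u, w in combinations(g.neighbors(v), 2):
                    if (g[v][u]["kind"] == "ppi" and g[v][w]["kind"] == "ppi"
                            and g.has_edge(u, w) and g[u][w]["kind"] == "sddi"):
                        yield u, v, w

        print(list(ppi_sddi_ppi_triangles(G)))   # [('B', 'A', 'C')]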

  7. Temporal Information Partitioning Networks (TIPNets): Characterizing emergent behavior in complex ecohydrologic systems

    Science.gov (United States)

    Goodwell, Allison; Kumar, Praveen

    2017-04-01

    Within an ecosystem, components of the atmosphere, vegetation, and the root-soil system participate in forcing and feedback reactions at varying time scales and intensities. These interactions constitute a complex network that exhibits behavioral shifts due to perturbations ranging from weather events to long-term drought or land use change. However, it is challenging to characterize this shifting network due to multiple drivers, non-linear interactions, and synchronization due to feedback. To overcome these issues, we implement a process network approach where eco-hydrologic time-series variables are nodes and information measures are links. We introduce a Temporal Information Partition Network (TIPNet) framework in which multivariate lagged mutual information between source and target nodes is decomposed into synergistic, redundant, and unique components, each of which reveals different aspects of interactions within the network. We use methods to compute information measures given as few as 200 data points to construct TIPNets based on 1-minute weather station data (radiation Rg, air temperature Ta, wind speed WS, relative humidity RH, precipitation PPT, and leaf wetness LWet) from Central Illinois during the growing season of 2015. We assess temporal shifts in network behavior for various weather conditions and over the growing season. We find that wet time periods are associated with complex and synergistic network structures compared to dry conditions, and that seasonal network patterns reveal responses to vegetation growth and rainfall trends. This framework is applicable to study a broad range of complex systems composed of multiple interacting components, and may aid process understanding, model improvement, and resilience and vulnerability assessments.
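
    A simplified sketch of the pairwise building block behind such networks: lagged mutual information between a source and a target series, estimated from a joint histogram (this is only the pairwise link weight; the full TIPNet additionally partitions multivariate information into unique, redundant, and synergistic parts). Variable names mimic the station data; all values here are synthetic.

        import numpy as np

        def lagged_mutual_information(x, y, lag=1, bins=8):
            """Histogram estimate of I(X_{t-lag}; Y_t) in bits."""
            xs, ys = x[:-lag], y[lag:]
            pxy, _, _ = np.histogram2d(xs, ys, bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)   # marginal of the source
            py = pxy.sum(axis=0, keepdims=True)   # marginal of the target
            nz = pxy > 0
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        rng = np.random.default_rng(0)
        rg = rng.normal(size=2000)                               # e.g. radiation Rg
        ta = 0.8 * np.roll(rg, 1) + 0.2 * rng.normal(size=2000)  # lagged response Ta
        print(lagged_mutual_information(rg, ta, lag=1))          # clearly above zero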

  8. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundaries that exist between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.
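
    A toy Python sketch of the integrated past-plus-future detection idea (event schema and names are hypothetical; SCEPter itself layers semantic query rewriting over a CEP engine): one pattern predicate is scanned uniformly over archived and live events, deduplicating at the archive/stream boundary to avoid the duplicate results mentioned above.

        from itertools import chain

        def unified_pattern_scan(archive, stream, predicate):
            """Run one predicate over archived (past) and live (present/future)
            events, emitting each event at most once across the boundary."""
            seen = set()
            for ev in chain(archive, stream):
                if ev["id"] in seen:
                    continue          # boundary overlap: already reported
                seen.add(ev["id"])
                if predicate(ev):
                    yield ev

        archive = [{"id": 1, "kw": 65}, {"id": 2, "kw": 120}]
        stream = iter([{"id": 2, "kw": 120}, {"id": 3, "kw": 140}])
        for hit in unified_pattern_scan(archive, stream, lambda e: e["kw"] > 100):
            print(hit)   # ids 2 and 3 each match exactly once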

  9. Processing reafferent and exafferent visual information for action and perception.

    Science.gov (United States)

    Reichenbach, Alexandra; Diedrichsen, Jörn

    2015-01-01

    A recent study suggests that reafferent hand-related visual information utilizes a privileged, attention-independent processing channel for motor control. This process was termed visuomotor binding to reflect its proposed function: linking visual reafferences to the corresponding motor control centers. Here, we ask whether the advantage of processing reafferent over exafferent visual information is a specific feature of the motor processing stream or whether the improved processing also benefits the perceptual processing stream. Human participants performed a bimanual reaching task in a cluttered visual display, and one of the visual hand cursors could be displaced laterally during the movement. We measured the rapid feedback responses of the motor system as well as matched perceptual judgments of which cursor was displaced. Perceptual judgments were either made by watching the visual scene without moving or made simultaneously to the reaching tasks, such that the perceptual processing stream could also profit from the specialized processing of reafferent information in the latter case. Our results demonstrate that perceptual judgments in the heavily cluttered visual environment were improved when performed based on reafferent information. Even in this case, however, the filtering capability of the perceptual processing stream suffered more from the increasing complexity of the visual scene than the motor processing stream. These findings suggest partly shared and partly segregated processing of reafferent information for vision for motor control versus vision for perception.

  10. Unveiling the mystery of visual information processing in human brain.

    Science.gov (United States)

    Diamant, Emanuel

    2008-08-15

    It is generally accepted that human vision is an extremely powerful information processing system that facilitates our interaction with the surrounding world. However, despite extended and extensive research efforts encompassing many exploration fields, the underlying fundamentals and operational principles of visual information processing in the human brain remain unknown. We are still unable to figure out where and how, along the path from the eyes to the cortex, the sensory input perceived by the retina is converted into a meaningful object representation that can be consciously manipulated by the brain. Studying the vast literature on the various aspects of brain information processing, I was surprised to learn that the scholarly discussion is totally indifferent to the basic keynote question: "What is information?" in general, or "What is visual information?" in particular. In the old days, it was assumed that any scientific research approach had first to define its basic departure points. Why this was overlooked in brain information processing research remains a conundrum. In this paper, I try to find a remedy for this bizarre situation. I propose an uncommon definition of "information", which can be derived from Kolmogorov's Complexity Theory and Chaitin's notion of Algorithmic Information. Embracing this new definition leads to an inevitable revision of traditional dogmas that shape the state of the art of brain information processing research. I hope this revision will better serve the challenging goal of modeling human visual information processing.
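
    For orientation, the algorithmic-information notion invoked here is conventionally defined as the length of the shortest program that outputs the object (standard Kolmogorov/Chaitin notation, not necessarily the author's exact formulation):

        K_U(x) \;=\; \min\{\, |p| \;:\; U(p) = x \,\}

    where U is a universal machine and |p| the length of program p. By the invariance theorem, K_U(x) \le K_V(x) + c_{U,V} for any other universal machine V, so the measure is machine-independent up to an additive constant.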

  11. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    Full Text Available The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not attempt to simplify it at all. This article presents the author's own methodology for managing complexity, which can complement any other methodology and helps limit the growth of complexity. It introduces its own definition and metric of complexity: the sum of the entities of the individual UML models of the given system, selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology for managing information system complexity and to verify it in practice on a real-life SAP implementation project.
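
    The proposed metric reduces to a single scalar; a toy Python sketch (model names and entity counts are purely illustrative, not MMDIS output):

        # Complexity = sum of the entities of the system's UML models.
        uml_models = {
            "use_case": {"actors": 12, "use_cases": 48},
            "class":    {"classes": 210, "associations": 350},
            "activity": {"activities": 95, "transitions": 140},
        }
        complexity = sum(sum(counts.values()) for counts in uml_models.values())
        print(complexity)   # 855 -- one comparable number per system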

  12. Improved motion contrast and processing efficiency in OCT angiography using complex-correlation algorithm

    International Nuclear Information System (INIS)

    Guo, Li; Li, Pei; Pan, Cong; Cheng, Yuxuan; Ding, Zhihua; Li, Peng; Liao, Rujia; Hu, Weiwei; Chen, Zhong

    2016-01-01

    The complex-based OCT angiography (Angio-OCT) offers high motion contrast by combining both the intensity and phase information. However, due to involuntary bulk tissue motions, complex-valued OCT raw data are processed sequentially with different algorithms for correcting bulk image shifts (BISs), compensating global phase fluctuations (GPFs) and extracting flow signals. Such a complicated procedure results in a massive computational load. To mitigate this problem, we present an inter-frame complex-correlation (CC) algorithm. The CC algorithm is suitable for parallel processing of both flow signal extraction and BIS correction, and it does not need GPF compensation. This method provides high processing efficiency and shows superiority in motion contrast. The feasibility and performance of the proposed CC algorithm are demonstrated using both flow phantom and live animal experiments. (paper)
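
    A textbook-style numpy sketch of inter-frame complex correlation between two B-scans (kernel size and normalization are my assumptions, not the paper's exact implementation): static tissue stays near correlation 1, while flow decorrelates amplitude and phase jointly.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def complex_decorrelation_map(f1, f2, kernel=3):
            """f1, f2: complex-valued OCT frames (arrays of equal shape).
            Returns the local decorrelation: ~0 for static tissue, high for flow."""
            cross = f1 * np.conj(f2)
            # Average real/imag parts separately over a small spatial kernel.
            num = np.abs(uniform_filter(cross.real, kernel)
                         + 1j * uniform_filter(cross.imag, kernel))
            den = np.sqrt(uniform_filter(np.abs(f1) ** 2, kernel)
                          * uniform_filter(np.abs(f2) ** 2, kernel))
            cc = num / np.maximum(den, 1e-12)   # complex correlation in [0, 1]
            return 1.0 - cc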

  13. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

    Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking against other process models and methods (notably RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information and automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  14. Epidemic processes in complex networks

    OpenAIRE

    Pastor Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-01-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The t...

  15. RATING MODELS AND INFORMATION TECHNOLOGIES APPLICATION FOR MANAGEMENT OF ADMINISTRATIVE-TERRITORIAL COMPLEXES

    Directory of Open Access Journals (Sweden)

    O. M. Pshinko

    2016-12-01

    Full Text Available Purpose. The paper aims to develop rating models and related information technologies for the strategic planning of the development of administrative and territorial units, as well as for the multi-criteria control of the operation of inhomogeneous multiparameter objects. Methodology. To solve the problems of strategic planning of administrative and territorial development and of managing heterogeneous classes of controlled objects, a set of coordinated methods is used: analysis of the multi-criteria properties of the objects of planning and management, diagnostics of state parameters, and forecasting and management of complex systems of different classes, whose states are estimated by sets of heterogeneous quality indicators and represented by individual models of the operation process. A new information technology is proposed and created to implement the strategic planning and management tasks; it uses procedures for solving typical tasks that are implemented in MS SQL Server. Findings. A new approach to developing models for the analysis and management of classes of complex systems based on ratings has been proposed. Rating models for the analysis of multi-criteria and multiparameter systems have been obtained; these systems are managed on the basis of current and predicted state parameters through a non-uniform distribution of resources. A procedure for analyzing the sensitivity of the rating model to changes in the parameters of the inhomogeneous distribution of resources has been developed. An information technology for the strategic planning and management of heterogeneous classes of objects based on the rating model has been created. Originality. This article proposes a new approach that uses rating indicators as a general model for strategic planning of the development and management of heterogeneous objects that can be characterized by sets of parameters measured on different scales

  16. Quantum Information Processing

    CERN Document Server

    Leuchs, Gerd

    2005-01-01

    Quantum processing and communication is emerging as a challenging technique at the beginning of the new millennium. This is an up-to-date insight into the current research of quantum superposition, entanglement, and the quantum measurement process - the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions - written by leading experts - bring together the latest results in quantum information as well as addressing all the relevant questions

  17. CISAPS: Complex Informational Spectrum for the Analysis of Protein Sequences

    Directory of Open Access Journals (Sweden)

    Charalambos Chrysostomou

    2015-01-01

    Full Text Available Complex informational spectrum analysis for protein sequences (CISAPS) and its web-based server are developed and presented. As recent studies show, using only the absolute spectrum in informational spectrum analysis of protein sequences is insufficient; CISAPS therefore considers and reports results in three forms: the absolute, real, and imaginary spectrum. Biologically relevant features, illustrated here through a case study of influenza A subtypes, can also appear individually in either the real or the imaginary spectrum. As the results show, protein classes can present similarities or differences according to the features extracted by the CISAPS web server, and these associations are likely related to the property that the specific amino acid index represents. In addition, various technical issues that may affect the analysis, such as zero-padding and windowing, are also addressed. CISAPS uses an expanded list of 611 unique amino acid indices, each representing a different property, to perform the analysis. This web-based server enables researchers with little knowledge of signal processing methods to apply complex informational spectrum analysis in their work.
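
    The underlying computation is a discrete Fourier transform of the numerically encoded sequence; a short numpy sketch with a hypothetical amino-acid index (values for illustration only; CISAPS draws on 611 published indices):

        import numpy as np

        # Illustrative index: amino acid -> numerical property value.
        TOY_INDEX = {"A": 0.037, "R": 0.096, "N": 0.004, "D": 0.126, "C": 0.083,
                     "G": 0.005, "H": 0.024, "L": 0.000, "K": 0.037, "S": 0.082}

        def informational_spectrum(seq, index):
            """Encode the sequence via an amino-acid index, remove the mean,
            apply the DFT, and return the absolute, real, and imaginary
            spectra -- the three forms the analysis reports. Zero-padding
            and windowing options are omitted for brevity."""
            x = np.array([index[aa] for aa in seq], dtype=float)
            X = np.fft.rfft(x - x.mean())
            return np.abs(X), X.real, X.imag

        abs_s, re_s, im_s = informational_spectrum("ARNDCLKSGH", TOY_INDEX)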

  18. Survey of Applications of Complex Event Processing (CEP in Health Domain

    Directory of Open Access Journals (Sweden)

    Nadeem Mahmood

    2017-12-01

    Full Text Available It is always difficult to manage the production of huge amounts of data coming from multiple sources and to extract meaningful information for making appropriate decisions. When data come from various input resources, obtaining the required streams of events from this complex input network calls for Complex Event Processing (CEP), one of the strong functionalities of Business Intelligence (BI), as the appropriate solution to the above-mentioned problems. Real-time processing, pattern matching, stream processing, big data management, sensor data processing and many more are application areas of CEP. The health domain is itself multi-dimensional: hospital supply chain, OPD management, disease diagnosis, in-patient and out-patient management, emergency care, etc. In this paper, the main focus is to discuss the application areas of Complex Event Processing (CEP) in the health domain using sensor devices: how CEP manipulates health data set events coming from sensors such as blood pressure, heart rate, fall detection, sugar level, temperature or any other vital signs, and how such systems respond to these events as quickly as possible. Different existing models and applications using CEP are discussed and summarized according to different characteristics.

  19. Information theory and signal transduction systems: from molecular information processing to network inference.

    Science.gov (United States)

    Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H

    2014-11-01

    Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design. Copyright © 2014 Elsevier Ltd. All rights reserved.
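
    The central quantities such a primer builds on are the mutual information between an input X (e.g. a ligand concentration) and an output Y (e.g. a gene-expression level), and the channel capacity as its maximum over input distributions (standard definitions, stated here for reference):

        I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log_2\frac{p(x,y)}{p(x)\,p(y)}, \qquad
        C \;=\; \max_{p(x)} I(X;Y)

    Capacity quantifies, in bits, how reliably a signalling pathway can distinguish input states, which is what makes it useful for characterising the efficiency of cell-fate decision making.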

  20. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  1. Eye Movement Analysis of Information Processing under Different Testing Conditions.

    Science.gov (United States)

    Dillon, Ronna F.

    1985-01-01

    Undergraduates were given complex figural analogies items, and eye movements were observed under three types of feedback: (1) elaborate feedback; (2) subjects verbalized their thinking and application of rules; and (3) no feedback. Both feedback conditions enhanced the rule-governed information processing during inductive reasoning. (Author/GDC)

  2. Relay-based information broadcast in complex networks

    Science.gov (United States)

    Fan, Zhongyan; Han, Zeyu; Tang, Wallace K. S.; Lin, Dong

    2018-04-01

    Information broadcast (IB) is a critical process in complex networks, usually accomplished by a flooding mechanism. Although flooding is simple and requires no prior topological information, it consumes a lot of transmission overhead. The other extreme is tree-based broadcast (TB), in which information is disseminated via a spanning tree; it achieves the minimal transmission overhead, but maintaining a spanning tree for every node is an obvious obstacle to implementation. Motivated by the success of scale-free network models for real-world networks, in this paper we investigate the issues in IB by considering an alternative solution in between these two extremes. A novel relay-based broadcast (RB) mechanism is proposed, employing a subset of nodes as relays. Information is first forwarded to one of these relays and then re-disseminated to the others through the spanning tree rooted at that relay. This mechanism provides a trade-off between flooding and TB: on one hand, it saves a lot of transmission overhead compared to flooding; on the other hand, it costs much less maintenance resource than TB, as only a few spanning trees are needed. Based on two major criteria, namely transmission overhead and convergence time, the effectiveness of RB is confirmed. The impacts of relay assignment and network structure on performance are also studied in this work.
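
    A back-of-the-envelope Python sketch of the overhead trade-off (relay selection and protocol details are simplified away; graph, relays and source are hypothetical): flooding costs one transmission per edge endpoint, while a relay-based scheme pays a short path to the nearest relay plus the n - 1 edges of that relay's spanning tree.

        import networkx as nx

        def flooding_messages(g):
            """Every node forwards once to all neighbours: 2m messages total."""
            return 2 * g.number_of_edges()

        def relay_messages(g, relays, source):
            """Relay-based broadcast (sketch): forward to the nearest relay,
            then re-disseminate along its precomputed spanning tree."""
            to_relay = min(nx.shortest_path_length(g, source, r) for r in relays)
            return to_relay + (g.number_of_nodes() - 1)

        g = nx.barabasi_albert_graph(1000, 3, seed=1)   # scale-free test network
        print(flooding_messages(g))                      # ~5982 messages
        print(relay_messages(g, relays=[0, 1, 2], source=500))  # ~1000 messages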

  3. Triangle network motifs predict complexes by complementing high-error interactomes with structural information

    Directory of Open Access Journals (Sweden)

    Labudde, Dirk

    2009-06-01

    Full Text Available Abstract Background Many high-throughput studies produce protein-protein interaction networks (PPINs) with many errors and missing information. Even for genome-wide approaches, there is often a low overlap between PPINs produced by different studies. Second-level neighbors separated by two protein-protein interactions (PPIs) were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second-level neighbors in PPINs, and complement these with structural domain-domain interactions (SDDIs) representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. Results We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from the Munich Information Center for Protein Sequences (MIPS). PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than using second-level neighbors in PPINs without SDDIs. The biological interpretation for triangles is that a SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components, and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Conclusion Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbors in PPINs. We estimate that

  4. Efficacy of Cognitive Processes in Young People with High-Functioning Autism Spectrum Disorder Using a Novel Visual Information-Processing Task

    Science.gov (United States)

    Speirs, Samantha J.; Rinehart, Nicole J.; Robinson, Stephen R.; Tonge, Bruce J.; Yelland, Gregory W.

    2014-01-01

    Autism spectrum disorders (ASD) are characterised by a unique pattern of preserved abilities and deficits within and across cognitive domains. The Complex Information Processing Theory proposes this pattern reflects an altered capacity to respond to cognitive demands. This study compared how complexity induced by time constraints on processing…

  5. Phonological Processes in Complex and Compound Words

    Directory of Open Access Journals (Sweden)

    Alieh Kord Zaferanlu Kambuziya

    2016-02-01

    Full Text Available Abstract This research aims at making a comparison between phonological processes in complex and compound Persian words. Data are gathered from a 40,000-word Persian dictionary; 4,034 complex words and 1,464 compound ones are chosen, and "excel" software is used to count the data. Some results of the research are: 1- "Insertion" is the usual phonological process in complex words. More than half of the different insertions belong to the consonant /g/; /y/ and // are in the second and third order, and the consonant /v/ has the lowest percentage of all. The highest percentage of vowel insertion belongs to /e/; the vowels /a/ and /o/ are in the second and third order. Deletion in complex words can only be seen in the consonant /t/ and the vowel /e/. 2- The most frequent phonological process in compounds is consonant deletion, involving seven different consonants: /t/, //, /m/, /r/, /ǰ/, /d/ and /c/. The only deleted vowel is /e/. In both groups, complex and compound, /t/ deletion can be observed. A sequence of three consonants paves the way for the deletion of one of the consonants; if one of the sequence is a sonorant like /n/, the deletion process rarely happens. 3- In complex words, consonant deletion yields a lighter syllable weight, whereas vowel deletion yields a heavier one, so both processes lead to bi-moraic weight. 4- The production of a bi-moraic syllable in Persian takes precedence over the Syllable Contact Law, so specific rules have precedence over universals. 5- Vowel insertion can be seen in both complex and compound words. In complex words, /e/ insertion plays the most fundamental part; the vowels /a/ and /o/ are in the second and third place. Whenever there are two sequences of ultra-heavy syllables, vowel insertion breaks the first syllable into two light syllables. The compounds that are influenced by vowel insertion can be and are pronounced without any insertion

  6. THE ELABORATION OF THE OPTIMAL SYNTHESIS ALGORITHM FOR COMPLEX PROCESSING INFORMATION OF THE SPATIAL POSITION OF THE UPPER-AIR RADIOSONDE

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available The article considers the problem of synthesizing an optimal algorithm for the complex processing of the navigation signals of the satellite GLONASS/GPS systems relayed from the board of an upper-air radiosonde, together with the output data of the upper-air radar, to determine the spatial coordinates of the radiosonde. Upper-air sounding is performed with the technical means of the atmospheric radio sounding system, including the upper-air radiosonde in free flight and the ground support equipment, which comprises devices for processing radiosonde signals and preparing operational upper-air messages. A peculiarity of the domestic atmospheric radio sounding system is the radar measurement of the slant range to the radiosonde and of the viewing angles of the radar antenna to determine the azimuth and elevation of the radiosonde. The disadvantage of the radar method of radiosonde tracking is the relatively low accuracy of coordinate determination and the possible disruption of automatic tracking in angular coordinates. A satellite navigation system based on microwave sensors has clear advantages in terms of efficiency, size, mobility, and use on mobile objects, but also significant drawbacks, associated primarily with the geometric factor and the propagation error of the navigation signal. The article presents a mathematical model of the useful incoherent GLONASS/GPS signals relayed by the upper-air radiosonde and of the interference at the receiver input of the ground point for complex information processing, as well as mathematical models of the upper-air radar output data.

  7. Toward theoretical understanding of the fertility preservation decision-making process: examining information processing among young women with cancer.

    Science.gov (United States)

    Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2013-01-01

    Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.

  8. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    Science.gov (United States)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximations, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.

  9. Electrospray ionization mass spectrometry for the hydrolysis complexes of cisplatin: implications for the hydrolysis process of platinum complexes.

    Science.gov (United States)

    Xie, Feifan; Colin, Pieter; Van Bocxlaer, Jan

    2017-07-01

    Non-enzyme-dependent hydrolysis of the drug cisplatin is important for its mode of action and toxicity. However, up until today, the hydrolysis process of cisplatin is still not completely understood. In the present study, the hydrolysis of cisplatin in aqueous solution was systematically investigated using electrospray ionization mass spectrometry coupled to liquid chromatography. A variety of previously unreported hydrolysis complexes corresponding to monomeric, dimeric and trimeric species were detected and identified. The characteristics of the Pt-containing complexes were investigated using collision-induced dissociation (CID). The hydrolysis complexes demonstrate distinctive and correlative CID characteristics, which provides tools for an informative identification. The most frequently observed dissociation mechanism was the sequential loss of NH3, H2O and HCl. Loss of the Pt atom was observed as the final step of the CID process. The formation mechanisms of the observed complexes were explored and experimentally examined. The strongly bound dimeric species, which existed in solution, are assumed to be formed from the clustering of the parent compound and its monohydrated or dihydrated complexes. The role of the electrospray process in the formation of some of the observed ions was also evaluated, and the electrospray ionization-related cold clusters were identified. The previously reported hydrolysis equilibria were tested and subsequently refined via a hydrolysis study, resulting in a renewed mechanistic equilibrium system of cisplatin as proposed from our results. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Wave-processing of long-scale information by neuronal chains.

    Directory of Open Access Journals (Sweden)

    José Antonio Villacorta-Atienza

    Full Text Available Investigation of the mechanisms of information handling in neural assemblies involved in computational and cognitive tasks is a challenging problem. Synergetic cooperation of neurons in the time domain, through synchronization of the firing of multiple spatially distant neurons, has become widespread as the main paradigm. Complementarily, the brain may also employ information coding and processing in the spatial dimension; the result of computation then also depends on the spatial distribution of long-scale information. This bi-dimensional alternative is notably less explored in the literature. Here, we propose and theoretically illustrate a concept of spatiotemporal representation and processing of long-scale information in laminar neural structures. We argue that relevant information may be hidden in self-sustained traveling waves of neuronal activity, whose nonlinear interaction then yields efficient wave-processing of spatiotemporal information. Using a chain of FitzHugh-Nagumo neurons as a testbed, we show that wave-processing can be achieved by incorporating an additional voltage-gated membrane current into the single-neuron dynamics. This local mechanism provides a chain of such neurons with new emergent network properties. In particular, nonlinear waves as carriers of long-scale information exhibit a variety of functionally different regimes of interaction: from complete or asymmetric annihilation to transparent crossing. Thus neuronal chains can work as computational units performing different operations over spatiotemporal information. Exploiting complexity resonance, these composite units can discard stimuli of too high or too low frequencies, while selectively compressing those in the natural frequency range. We also show how neuronal chains can contextually interpret raw wave information. The same stimulus can be processed differently or identically according to the context set by a periodic wave train injected at the opposite end of the
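
    A minimal Euler-integration sketch of a diffusively coupled FitzHugh-Nagumo chain in Python (generic textbook parameters; the paper augments the single-neuron dynamics with an extra voltage-gated current to obtain the wave-interaction regimes described above). A brief stimulus at one end launches a travelling pulse along the chain.

        import numpy as np

        def fhn_chain(n=100, steps=20000, dt=0.05, d=0.1, a=0.7, b=0.8, eps=0.08):
            """v' = v - v^3/3 - w + I + d*(discrete Laplacian),
               w' = eps*(v + a - b*w); open boundary conditions."""
            v = -1.2 * np.ones(n)   # start near the resting state
            w = -0.6 * np.ones(n)
            for t in range(steps):
                I = np.zeros(n)
                if t < 400:
                    I[0] = 0.8       # transient stimulus at the left end
                lap = np.roll(v, 1) + np.roll(v, -1) - 2 * v
                lap[0] = v[1] - v[0]       # open boundaries, no wrap-around
                lap[-1] = v[-2] - v[-1]
                v = v + dt * (v - v**3 / 3 - w + I + d * lap)
                w = w + dt * eps * (v + a - b * w)
            return v, w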

  11. ALGORITHM OF CARDIO COMPLEX DETECTION AND SORTING FOR PROCESSING THE DATA OF CONTINUOUS CARDIO SIGNAL MONITORING.

    Science.gov (United States)

    Krasichkov, A S; Grigoriev, E B; Nifontov, E M; Shapovalov, V V

    The paper presents an algorithm of cardio complex classification as part of processing the data of continuous cardiac monitoring. R-wave detection concurrent with cardio complex sorting is discussed. The core of this approach is the use of prior information about cardio complex forms, segmental structure, and degree of kindness. Results of testing the sorting algorithm are provided.

  12. Adaptive Beamforming Based on Complex Quaternion Processes

    Directory of Open Access Journals (Sweden)

    Jian-wu Tao

    2014-01-01

    Full Text Available Motivated by the benefits of array signal processing in the quaternion domain, we investigate the problem of adaptive beamforming based on complex quaternion processes in this paper. First, a complex quaternion least-mean squares (CQLMS) algorithm is proposed and its performance is analyzed. The CQLMS algorithm is suitable for adaptive beamforming of a vector-sensor array. The weight vector update of the CQLMS algorithm is derived based on the complex gradient, leading to lower computational complexity. Because the complex quaternion can exhibit the orthogonal structure of an electromagnetic vector-sensor in a natural way, a complex quaternion model in the time domain is provided for a 3-component vector-sensor array, and the normalized adaptive beamformer using CQLMS is presented. Finally, simulation results are given to validate the performance of the proposed adaptive beamformer.
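
    For orientation, a sketch of the complex-domain ancestor of the proposed update: the CQLMS of the paper is the quaternion analogue of this standard complex LMS (array data, reference signal, and step size here are hypothetical).

        import numpy as np

        def clms_beamformer(X, d, mu=0.01):
            """Standard complex LMS for a narrowband array.
            X: snapshot matrix of shape (N, M) -- N snapshots, M sensors.
            d: desired reference signal of shape (N,)."""
            w = np.zeros(X.shape[1], dtype=complex)
            for x, dk in zip(X, d):
                y = np.vdot(w, x)              # beamformer output w^H x
                e = dk - y                     # estimation error
                w = w + mu * np.conj(e) * x    # stochastic-gradient update
            return w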

  13. Research on image complexity evaluation method based on color information

    Science.gov (United States)

    Wang, Hao; Duan, Jin; Han, Xue-hui; Xiao, Bo

    2017-11-01

    In order to evaluate the complexity of a color image more effectively and to find the connection between image complexity and image information, this paper presents a method to compute the complexity of an image based on color information. The theoretical analysis first divides complexity at the subjective level into three grades: low, medium and high complexity. Image features are then extracted, and finally a function between the complexity value and the color characteristic model is established. The experimental results show that this evaluation method can objectively reconstruct the complexity of the image from the image features, and that the results agree well with human visual perception of complexity, so color image complexity has a certain reference value.

  14. Continuous-variable quantum information processing

    DEFF Research Database (Denmark)

    Andersen, Ulrik Lund; Leuchs, G.; Silberhorn, C.

    2010-01-01

    When exploiting the continuous degree of freedom of a quantum system for encoding, processing or detecting information, one enters the field of continuous-variable (CV) quantum information processing. In this paper we review the basic principles of CV quantum information processing with main focus on recent developments in the field. We will be addressing the three main stages of a quantum information system: the preparation stage, where quantum information is encoded into CVs of coherent states and single-photon states; the processing stage, where CV information is manipulated to carry out a specified protocol; and a detection stage, where CV information is measured using homodyne detection or photon counting.

  15. Quantifying Complexity in Quantum Phase Transitions via Mutual Information Complex Networks.

    Science.gov (United States)

    Valdez, Marc Andrew; Jaschke, Daniel; Vargas, David L; Carr, Lincoln D

    2017-12-01

    We quantify the emergent complexity of quantum states near quantum critical points on regular 1D lattices, via complex network measures based on quantum mutual information as the adjacency matrix, in direct analogy to quantifying the complexity of electroencephalogram or functional magnetic resonance imaging measurements of the brain. Using matrix product state methods, we show that network density, clustering, disparity, and Pearson's correlation obtain the critical point for both quantum Ising and Bose-Hubbard models to a high degree of accuracy in finite-size scaling for three classes of quantum phase transitions, Z_{2}, mean field superfluid to Mott insulator, and a Berezinskii-Kosterlitz-Thouless crossover.

  16. Quantifying Complexity in Quantum Phase Transitions via Mutual Information Complex Networks

    Science.gov (United States)

    Valdez, Marc Andrew; Jaschke, Daniel; Vargas, David L.; Carr, Lincoln D.

    2017-12-01

    We quantify the emergent complexity of quantum states near quantum critical points on regular 1D lattices, via complex network measures based on quantum mutual information as the adjacency matrix, in direct analogy to quantifying the complexity of electroencephalogram or functional magnetic resonance imaging measurements of the brain. Using matrix product state methods, we show that network density, clustering, disparity, and Pearson's correlation obtain the critical point for both quantum Ising and Bose-Hubbard models to a high degree of accuracy in finite-size scaling for three classes of quantum phase transitions, Z2, mean field superfluid to Mott insulator, and a Berezinskii-Kosterlitz-Thouless crossover.

  17. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations using methods of processing symbolic information and information contained in training samples (ranking of objectives by promise, a classification algorithm for tones and noise). The book will be of interest to specialists in the automation of textual information processing, programming, and pattern recognition.

  18. Natural Information Processing Systems

    OpenAIRE

    John Sweller; Susan Sweller

    2006-01-01

    Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...

  19. Epidemic processes in complex networks

    Science.gov (United States)

    Pastor-Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-07-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The theoretical analysis of epidemic spreading in heterogeneous networks requires the development of novel analytical frameworks, and it has produced results of conceptual and practical relevance. A coherent and comprehensive review of the vast research activity concerning epidemic processes is presented, detailing the successful theoretical approaches as well as making their limits and assumptions clear. Physicists, mathematicians, epidemiologists, computer, and social scientists share a common interest in studying epidemic spreading and rely on similar models for the description of the diffusion of pathogens, knowledge, and innovation. For this reason, while focusing on the main results and the paradigmatic models in infectious disease modeling, the major results concerning generalized social contagion processes are also presented. Finally, the research activity at the forefront in the study of epidemic spreading in coevolving, coupled, and time-varying networks is reported.

  20. Process-aware information system development for the healthcare domain : consistency, reliability and effectiveness

    NARCIS (Netherlands)

    Mans, R.S.; Aalst, van der W.M.P.; Russell, N.C.; Bakker, P.J.M.; Moleman, A.J.; Rinderle-Ma, S.; Sadiq, S.; Leymann, F.

    2010-01-01

    Optimal support for complex healthcare processes cannot be provided by a single out-of-the-box Process-Aware Information System and necessitates the construction of customized applications based on these systems. In order to allow for the seamless integration of the new technology into the existing

  1. Toward theoretical understanding of the fertility preservation decision-making process: Examining information processing among young women with cancer

    Science.gov (United States)

    Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2014-01-01

    Background Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086

  2. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    Science.gov (United States)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate: H(p) = -\sum_i p_i \log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.

  3. Additional application areas of the 3D process information display; Weiterfuehrende Einsatzgebiete des 3-D-Prozessinformationsdisplays

    Energy Technology Data Exchange (ETDEWEB)

    Meissner, K. [Institut fuer Automatisierung und Informatik GmbH, Zentrum fuer industrielle Forschung und Entwicklung, Wernigerode (Germany); Hensel, H. [Hochschule Harz, Fachbereich Automatisierung und Informatik, Wernigerode (Germany)

    2007-07-01

    The current technological progress in the process industry results in a significant increase in the complexity of control systems. The amount of supervised information grows constantly for each operator because of a higher level of automation and optimized information acquisition by the control systems. This development results in cognitive overload of the operator, which causes incorrect behaviour and responses in alert situations. In the technical literature, several approaches are discussed to counteract this problem. This paper presents the newly developed 3-D Process Information Display (3D-PID) and describes which additional application areas are conceivable for it as a primary representation method for the supervision of complex process conditions. The 3D-PID is based on a cognitive scenic representation of the process values within a 3-D process room. In particular, the overview-and-detail presentation problem known from the literature is discussed. (orig.)

  4. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    Science.gov (United States)

    Lyu, Zhe; Whitman, William B

    2017-01-01

    Current evolutionary models suggest that Eukaryotes originated from within Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be of a kind with the archaeal systems and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies on the complex information processing system. Therefore, Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  5. Proteomic amino-termini profiling reveals targeting information for protein import into complex plastids.

    Directory of Open Access Journals (Sweden)

    Pitter F Huesgen

    Full Text Available In organisms with complex plastids acquired by secondary endosymbiosis from a photosynthetic eukaryote, the majority of plastid proteins are nuclear-encoded, translated on cytoplasmic ribosomes, and guided across four membranes by a bipartite targeting sequence. In-depth understanding of this vital import process has been impeded by a lack of information about the transit peptide part of this sequence, which mediates transport across the inner three membranes. We determined the mature N-termini of hundreds of proteins from the model diatom Thalassiosira pseudonana, revealing extensive N-terminal modification by acetylation and proteolytic processing in both cytosol and plastid. We identified 63 mature N-termini of nucleus-encoded plastid proteins, deduced their complete transit peptide sequences, determined a consensus motif for their cleavage by the stromal processing peptidase, and found evidence for subsequent processing by a plastid methionine aminopeptidase. The cleavage motif differs from that of higher plants, but is shared with other eukaryotes with complex plastids.

  6. Statistical methods for anomaly detection in the complex process; Methodes statistiques de detection d'anomalies de fonctionnement dans les processus complexes

    Energy Technology Data Exchange (ETDEWEB)

    Al Mouhamed, Mayez

    1977-09-15

    In a number of complex physical systems the accessible signals are often characterized by random fluctuations about a mean value. The fluctuations (signature) often transmit information about the state of the system that the mean value cannot predict. This study is undertaken to elaborate statistical methods of anomaly detection on the basis of signature analysis of the noise inherent in the process. The algorithm presented first learns the characteristics of normal operation of a complex process. Then it detects small deviations from the normal behavior. The algorithm can be implemented in a medium-sized computer for on line application. (author) [French] Dans de nombreux systemes physiques complexes les grandeurs accessibles a l'homme sont souvent caracterisees par des fluctuations aleatoires autour d'une valeur moyenne. Les fluctuations (signatures) transmettent souvent des informations sur l'etat du systeme que la valeur moyenne ne peut predire. Cette etude est entreprise pour elaborer des methodes statistiques de detection d'anomalies de fonctionnement sur la base de l'analyse des signatures contenues dans les signaux de bruit provenant du processus. L'algorithme presente est capable de: 1/ Apprendre les caracteristiques des operations normales dans un processus complexe. 2/ Detecter des petites deviations par rapport a la conduite normale du processus. L'algorithme peut etre implante sur un calculateur de taille moyenne pour les applications en ligne. (auteur)
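
    As a concrete, much simplified illustration of signature-based anomaly detection (learn the noise statistics under normal operation, then score new records by their deviation), here is a hedged sketch; the band-power features and the Mahalanobis score are illustrative choices, not necessarily the author's algorithm:

```python
import numpy as np

def band_power(signal, n_bands=8):
    """Average power in a few frequency bands: a compact noise 'signature'."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    return np.array([b.mean() for b in np.array_split(spec, n_bands)])

def learn_signature(normal_records, n_bands=8):
    """Learn mean and covariance of the signature from normal-operation data."""
    feats = np.array([band_power(s, n_bands) for s in normal_records])
    return feats.mean(axis=0), np.cov(feats, rowvar=False)

def anomaly_score(record, mean, cov, n_bands=8):
    """Mahalanobis distance from the learned signature; large = anomalous."""
    d = band_power(record, n_bands) - mean
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))

rng = np.random.default_rng(0)
normal = [rng.standard_normal(1024) for _ in range(200)]
mean, cov = learn_signature(normal)
print(anomaly_score(rng.standard_normal(1024), mean, cov))           # small
print(anomaly_score(rng.standard_normal(1024).cumsum(), mean, cov))  # large
```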

  7. The creation of the analytical information system to serve the process of complex decommissioning of nuclear submarines (NSM) and surface ships (SS) with nuclear power installations (NPI)

    International Nuclear Information System (INIS)

    Terentiev, V.G.; Yakovlev, N.E.; Tyurin, A.V.

    2002-01-01

    Management of the decommissioning of nuclear vessels includes information collection, accumulation, systematisation and analysis on the complex utilization of nuclear submarines and surface ships with nuclear power installations and on treatment of spent nuclear fuel and radioactive wastes. The relevant data on radiation and ecology, science and technology, law and economy, administration and management should be properly processed. The general objective of the analytical information system (AIS) development, described in the present paper, is the efficiency upgrading for nuclear submarine utilization management and decision making. The report considers information provision and functioning principles as well as software/hardware solutions associated with the AIS creation. (author)

  8. Supramolecular chemistry: from molecular information towards self-organization and complex matter

    International Nuclear Information System (INIS)

    Lehn, Jean-Marie

    2004-01-01

    Molecular chemistry has developed a wide range of very powerful procedures for constructing ever more sophisticated molecules from atoms linked by covalent bonds. Beyond molecular chemistry lies supramolecular chemistry, which aims at developing highly complex chemical systems from components interacting via non-covalent intermolecular forces. By the appropriate manipulation of these interactions, supramolecular chemistry became progressively the chemistry of molecular information, involving the storage of information at the molecular level, in the structural features, and its retrieval, transfer, and processing at the supramolecular level, through molecular recognition processes operating via specific interactional algorithms. This has paved the way towards apprehending chemistry also as an information science. Numerous receptors capable of recognizing, i.e. selectively binding, specific substrates have been developed, based on the molecular information stored in the interacting species. Suitably functionalized receptors may perform supramolecular catalysis and selective transport processes. In combination with polymolecular organization, recognition opens ways towards the design of molecular and supramolecular devices based on functional (photoactive, electroactive, ionoactive, etc) components. A step beyond preorganization consists in the design of systems undergoing self-organization, i.e. systems capable of spontaneously generating well-defined supramolecular architectures by self-assembly from their components. Self-organization processes, directed by the molecular information stored in the components and read out at the supramolecular level through specific interactions, represent the operation of programmed chemical systems. They have been implemented for the generation of a variety of discrete functional architectures of either organic or inorganic nature. Self-organization processes also give access to advanced supramolecular materials, such as

  9. Information communication on complex networks

    International Nuclear Information System (INIS)

    Igarashi, Akito; Kawamoto, Hiroki; Maruyama, Takahiro; Morioka, Atsushi; Naganuma, Yuki

    2013-01-01

    Since communication networks such as the Internet, which is regarded as a complex network, have recently grown to a huge scale and a lot of data pass through them, the improvement of packet routing strategies for transport is one of the most significant themes in the study of computer networks. It is especially important to find routing strategies that can carry as much traffic as possible without congestion in complex networks. First, using neural networks, we introduce a strategy for packet routing on complex networks, where path lengths and queue lengths in nodes are taken into account within a framework of statistical physics. Secondly, instead of using shortest paths, we propose efficient paths that avoid hubs, i.e. nodes with a great many degrees, on scale-free networks with a weight on each node. We improve the heuristic algorithm proposed by Danila et al., which step by step optimizes the routing properties under congestion by using information about betweenness, the probability that a path passes through a node among all optimal paths defined according to a rule, and thereby mitigates congestion. We confirm that the new heuristic algorithm balances traffic on networks by minimizing the maximum betweenness in a much smaller number of iteration steps. Finally, we model virus spreading and data transfer on peer-to-peer (P2P) networks. Using a mean-field approximation, we obtain an analytical formulation, emulate virus spreading on the network, and compare the results with those of simulation. Moreover, we investigate the mitigation of information traffic congestion in the P2P networks.
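
    The hub-avoidance idea can be sketched in a few lines: repeatedly find the node with the highest betweenness and raise the cost of routing through it, so that shortest paths drift away from emerging hot spots. This is a simplified, illustrative rendering of a Danila-style heuristic, not the authors' exact algorithm:

```python
import networkx as nx

def balance_routing(g, n_iter=50, penalty=1.5):
    """Repeatedly penalize the most central node so paths reroute around it."""
    cost = {v: 1.0 for v in g}
    for _ in range(n_iter):
        # model node costs as edge costs for shortest-path betweenness
        for u, v in g.edges:
            g[u][v]["w"] = (cost[u] + cost[v]) / 2
        bc = nx.betweenness_centrality(g, weight="w")
        cost[max(bc, key=bc.get)] *= penalty  # push traffic off the hot spot
    return cost

g = nx.barabasi_albert_graph(200, 2, seed=1)  # scale-free test network
costs = balance_routing(g)
```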

  10. Information processing, computation, and cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  11. Information Geometric Complexity of a Trivariate Gaussian Statistical Model

    Directory of Open Access Journals (Sweden)

    Domenico Felice

    2014-05-01

    Full Text Available We evaluate the information geometric complexity of entropic motion on low-dimensional Gaussian statistical manifolds in order to quantify how difficult it is to make macroscopic predictions about systems in the presence of limited information. Specifically, we observe that the complexity of such entropic inferences not only depends on the amount of available pieces of information but also on the manner in which such pieces are correlated. Finally, we uncover that, for certain correlational structures, the impossibility of reaching the most favorable configuration from an entropic inference viewpoint seems to lead to an information geometric analog of the well-known frustration effect that occurs in statistical physics.

  12. Enabling Controlling Complex Networks with Local Topological Information.

    Science.gov (United States)

    Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene

    2018-03-15

    Complex networks characterize the nature of internal/external interactions in real-world systems including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time with a suitable choice of inputs; and optimal control, which is a typical control approach to minimize the cost of driving the network to a predefined state with a given number of control inputs. For large complex networks without global information on the network topology, both problems remain essentially open. Here we combine graph theory and control theory to tackle the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, where every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
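
    For orientation, the centralized baseline that the distributed matching approximates can be stated compactly: by the classic result of Liu et al., the minimum driver-node set of a directed network is given by the nodes left unmatched by a maximum matching. A hedged sketch of that standard global algorithm (not the paper's local Bayesian game):

```python
import networkx as nx

def driver_nodes(dg):
    """Minimum driver-node set of a directed network (Liu et al.): nodes whose
    'in' copy is unmatched in a maximum matching of the bipartite out->in
    representation. Centralized baseline for the paper's distributed method."""
    b = nx.Graph()
    outs = [("out", u) for u in dg]
    b.add_nodes_from(outs, bipartite=0)
    b.add_nodes_from((("in", v) for v in dg), bipartite=1)
    b.add_edges_from((("out", u), ("in", v)) for u, v in dg.edges)
    matching = nx.bipartite.maximum_matching(b, top_nodes=outs)
    return {v for v in dg if ("in", v) not in matching}

dg = nx.gnp_random_graph(50, 0.05, directed=True, seed=3)
print(sorted(driver_nodes(dg)))  # attaching inputs here controls the network
```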

  13. Impact of familiarity on information complexity in human-computer interfaces

    Directory of Open Access Journals (Sweden)

    Bakaev Maxim

    2016-01-01

    Full Text Available A quantitative measure of information complexity remains very much desirable in the HCI field, since it may aid in the optimization of user interfaces, especially in human-computer systems for controlling complex objects. Our paper is dedicated to the exploration of the subjective (subject-dependent) aspect of complexity, conceptualized as information familiarity. Although familiarity in human cognition and behaviour has been researched in several fields, the accepted models in HCI, such as the Human Processor model or the Hick-Hyman law, do not generally consider this issue. In our experimental study the subjects performed search and selection of digits and letters, whose familiarity was conceptualized as frequency of occurrence in numbers and texts. The analysis showed a significant effect of information familiarity on selection time and throughput in regression models, although the R2 values were somewhat low. Still, we hope that our results might aid in the quantification of information complexity and its further application to optimizing interaction in human-machine systems.
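
    The Hick-Hyman law referenced here predicts selection time from the information content of the choice set, T = a + b·log2(n + 1). A small sketch of fitting it with an added familiarity regressor, as the study suggests (all data values are hypothetical):

```python
import numpy as np

n_alternatives = np.array([2, 4, 8, 16, 32])
familiarity = np.array([0.9, 0.7, 0.5, 0.4, 0.2])   # hypothetical frequency scores
mean_rt = np.array([0.45, 0.61, 0.78, 0.95, 1.20])  # hypothetical seconds

# Ordinary least squares: T = a + b*log2(n + 1) + c*familiarity
X = np.column_stack([np.ones_like(mean_rt),
                     np.log2(n_alternatives + 1),
                     familiarity])
coef, *_ = np.linalg.lstsq(X, mean_rt, rcond=None)
print(dict(zip(["intercept", "hick_slope", "familiarity_effect"], coef)))
```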

  14. Kinetics of the Dynamical Information Shannon Entropy for Complex Systems

    International Nuclear Information System (INIS)

    Yulmetyev, R.M.; Yulmetyeva, D.G.

    1999-01-01

    Kinetic behaviour of dynamical information Shannon entropy is discussed for complex systems: physical systems with non-Markovian property and memory in correlation approximation, and biological and physiological systems with sequences of the Markovian and non-Markovian random noises. For the stochastic processes, a description of the information entropy in terms of normalized time correlation functions is given. The influence and important role of two mutually dependent channels of the entropy change, correlation (creation or generation of correlations) and anti-correlation (decay or annihilation of correlation) is discussed. The method developed here is also used in analysis of the density fluctuations in liquid cesium obtained from slow neutron scattering data, fractal kinetics of the long-range fluctuation in the short-time human memory and chaotic dynamics of R-R intervals of human ECG. (author)

  15. Musical beauty and information compression: Complex to the ear but simple to the mind?

    Science.gov (United States)

    Hudson, Nicholas J

    2011-01-20

    The biological origin of music, its universal appeal across human cultures and the cause of its beauty remain mysteries. For example, why is Ludwig Van Beethoven considered a musical genius but Kylie Minogue is not? Possible answers to these questions will be framed in the context of Information Theory. The entire life-long sensory data stream of a human is enormous. The adaptive solution to this problem of scale is information compression, thought to have evolved to better handle, interpret and store sensory data. In modern humans highly sophisticated information compression is clearly manifest in philosophical, mathematical and scientific insights. For example, the Laws of Physics explain apparently complex observations with simple rules. Deep cognitive insights are reported as intrinsically satisfying, implying that at some point in evolution, the practice of successful information compression became linked to the physiological reward system. I hypothesise that the establishment of this "compression and pleasure" connection paved the way for musical appreciation, which subsequently became free (perhaps even inevitable) to emerge once audio compression had become intrinsically pleasurable in its own right. For a range of compositions, empirically determine the relationship between the listener's pleasure and "lossless" audio compression. I hypothesise that enduring musical masterpieces will possess an interesting objective property: despite apparent complexity, they will also exhibit high compressibility. Artistic masterpieces and deep Scientific insights share the common process of data compression. Musical appreciation is a parasite on a much deeper information processing capacity. The coalescence of mathematical and musical talent in exceptional individuals has a parsimonious explanation. Musical geniuses are skilled in composing music that appears highly complex to the ear yet transpires to be highly simple to the mind. The listener's pleasure is influenced
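
    The proposed test is directly computable: compare listeners' pleasure ratings with the lossless compressibility of the audio. A minimal sketch of the compressibility side, using zlib as a stand-in for a dedicated lossless audio codec such as FLAC (function name and choice of codec are illustrative):

```python
import math
import os
import zlib

def compressibility(pcm_bytes: bytes) -> float:
    """Compressed-to-raw size ratio under a lossless coder; the hypothesis
    predicts masterpieces look complex yet compress well (low ratio)."""
    return len(zlib.compress(pcm_bytes, level=9)) / len(pcm_bytes)

# Compare a structured signal with pure noise
tone = bytes(int(127 + 120 * math.sin(i / 20)) for i in range(100_000))
print(compressibility(tone))                  # low ratio: highly compressible
print(compressibility(os.urandom(100_000)))   # ~1.0: incompressible noise
```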

  16. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical viewpoints on the skill of scientific information processing and decomposes this skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction, and document analysis, were used to build up a theoretical framework. Interviews and surveys of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  17. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    Science.gov (United States)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk, which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and defined in terms of its essential components. First, three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed with which to evaluate each dimension separately. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety, originating from cybernetic theory, is suggested for interpreting complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity, using the Simple Multi-Attribute Rating Technique (SMART). Each method of static complexity analysis and the aggregation technique is demonstrated using notional data for four lunar oxygen production processes.
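
    The SMART aggregation step is, at its core, a weighted sum of normalized per-dimension scores. A minimal sketch under that assumption (weights, process names, and scores are illustrative, not from the thesis):

```python
# Each complexity dimension is scored per process, normalized to [0, 1],
# and combined with stakeholder-elicited weights (assumed weighted-sum form).
weights = {"interconnective": 0.4, "strength": 0.35, "variety": 0.25}

def static_complexity(scores):
    """Aggregate per-dimension scores (each already normalized to [0, 1])."""
    return sum(weights[d] * scores[d] for d in weights)

processes = {
    "hydrogen_reduction": {"interconnective": 0.6, "strength": 0.5, "variety": 0.3},
    "molten_electrolysis": {"interconnective": 0.8, "strength": 0.7, "variety": 0.5},
}
ranking = sorted(processes, key=lambda p: static_complexity(processes[p]))
print(ranking)  # least to most statically complex
```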

  18. Information accessibility and cryptic processes

    Energy Technology Data Exchange (ETDEWEB)

    Mahoney, John R; Ellison, Christopher J; Crutchfield, James P [Complexity Sciences Center and Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616 (United States)], E-mail: jrmahoney@ucdavis.edu, E-mail: cellison@cse.ucdavis.edu, E-mail: chaos@cse.ucdavis.edu

    2009-09-11

    We give a systematic expansion of the crypticity, a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite cryptic order, where the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy, the mutual information between a process's infinite past and infinite future, which is finite and exact for finite-order cryptic processes. (fast track communication)

  19. Motivated information processing and group decision-making : Effects of process accountability on information processing and decision quality

    NARCIS (Netherlands)

    Scholten, Lotte; van Knippenberg, Daan; Nijstad, Bernard A.; De Dreu, Carsten K. W.

    Integrating dual-process models [Chaiken, S., & Trope, Y. (Eds.). (1999). Dual-process theories in social psychology. NewYork: Guilford Press] with work on information sharing and group decision-making [Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: biased

  20. Structural Information Inference from Lanthanoid Complexing Systems: Photoluminescence Studies on Isolated Ions

    Science.gov (United States)

    Greisch, Jean Francois; Harding, Michael E.; Chmela, Jiri; Klopper, Willem M.; Schooss, Detlef; Kappes, Manfred M.

    2016-06-01

    The application of lanthanoid complexes ranges from photovoltaics and light-emitting diodes to quantum memories and biological assays. Rationalization of their design requires a thorough understanding of intramolecular processes such as energy transfer, charge transfer, and non-radiative decay involving their subunits. Characterization of the excited states of such complexes benefits considerably from mass spectrometric methods, since the associated optical transitions and processes are strongly affected by stoichiometry, symmetry, and overall charge state. We report herein spectroscopic measurements on ensembles of ions trapped in the gas phase and soft-landed in neon matrices. Their interpretation is considerably facilitated by direct comparison with computations. The combination of energy- and time-resolved measurements on isolated species with density functional as well as ligand-field and Franck-Condon computations enables us to infer structural as well as dynamical information about the species studied. The approach is first illustrated for sets of model lanthanoid complexes whose structure and electronic properties are systematically varied via the substitution of one component (lanthanoid or alkali/alkaline-earth ion): (i) the systematic dependence of ligand-centered phosphorescence on the lanthanoid(III) promotion energy and its impact on sensitization, and (ii) structural changes induced by the substitution of alkali or alkaline-earth ions, in relation to structures inferred using ion mobility spectroscopy. The temperature dependence of sensitization is briefly discussed. The focus is then shifted to measurements involving europium complexes with doxycycline, an antibiotic of the tetracycline family. Besides discussing the complexes' structural and electronic features, we report on their use to monitor enzymatic processes involving hydrogen peroxide or biologically relevant molecules such as adenosine triphosphate (ATP).

  1. Information search and decision making: effects of age and complexity on strategy use.

    Science.gov (United States)

    Queen, Tara L; Hess, Thomas M; Ennis, Gilda E; Dowd, Keith; Grühn, Daniel

    2012-12-01

    The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults' performance. Participants utilized 2 decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants' preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and that this ability may benefit from accrued knowledge and experience.

  2. Information Processing and Limited Liability

    OpenAIRE

    Bartosz Mackowiak; Mirko Wiederholt

    2012-01-01

    Decision-makers often face limited liability and thus know that their loss will be bounded. We study how limited liability affects the behavior of an agent who chooses how much information to acquire and process in order to take a good decision. We find that an agent facing limited liability processes less information than an agent with unlimited liability. The informational gap between the two agents is larger in bad times than in good times and when information is more costly to process.

  3. Information and analytical data system on radioecological impact of the Russian nuclear complex

    International Nuclear Information System (INIS)

    Iskra, A. A.; Serezhnikov, D. A.

    2006-01-01

    The information and analytical system contains data on enterprises of the Russian nuclear complex, from the mining and processing of uranium ores to the processing of spent nuclear fuel (SNF) and ionizing radiation sources (IRS). Radioecological information is presented on radiation-hazardous objects of the civil mission of the Federal Agency for Atomic Energy (Rosatom): underground leaching sites, radioactive waste (RW) storage facilities, tailing dumps, burials, reactors and critical facilities, etc. Radioecological impact is examined using information and regulatory-methodical documents of the Federal Agency on Hydrometeorology and Environmental Monitoring, the Federal Agency for Atomic Energy, the Federal Agency on Ecological, Technological and Atomic Control, and the Federal Agency on Geodesy and Cartography, concerning: radionuclide discharges from the enterprises; radionuclide releases from the enterprises under routine and accidental conditions; contaminated lands; and the radioecological consequences of RW dumped in the Arctic and Far-East seas. The report is accompanied by a demonstration of the operating database.

  4. A Process Mining Based Service Composition Approach for Mobile Information Systems

    Directory of Open Access Journals (Sweden)

    Chengxi Huang

    2017-01-01

    Full Text Available Due to the growing trend of applying big data and cloud computing technologies in information systems, handling the connection between large-scale data and the associated business processes in the Internet of Everything (IoE) environment is becoming an important issue. Service composition, a widely used phase in system development, has limits when the complexity of the relationships among data increases. Considering the expanding scale and the variety of devices in mobile information systems, a process-mining-based service composition approach is proposed in this paper in order to improve the adaptiveness and efficiency of compositions. Firstly, preprocessing is conducted to extract existing service execution information from server-side logs. Then process mining algorithms are applied to discover the overall event sequence from the preprocessed data. After that, a scene-based service composition is applied to aggregate scene information and relocate services of the system. Finally, a case study that applies the work to a mobile medical application shows that the approach is practical and valuable in improving service composition adaptiveness and efficiency.
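
    The discovery step that such an approach relies on can be illustrated with the simplest process mining primitive, the directly-follows graph, which counts how often one activity immediately follows another within a case. A hedged sketch with hypothetical log traces (a real pipeline would extract these from server-side logs in the preprocessing step):

```python
from collections import Counter

def directly_follows(event_log):
    """Core of many discovery algorithms: count how often activity a is
    directly followed by activity b within the same case (service session)."""
    dfg = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Hypothetical traces extracted from server-side logs during preprocessing
log = [
    ["login", "search", "book", "pay"],
    ["login", "search", "search", "book", "pay"],
    ["login", "browse", "book", "pay"],
]
for (a, b), n in sorted(directly_follows(log).items()):
    print(f"{a} -> {b}: {n}")
```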

  5. Real-time monitoring of clinical processes using complex event processing and transition systems.

    Science.gov (United States)

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.
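
    The core mechanism, matching derived events against a transition system and alerting on disallowed transitions in real time, can be sketched as follows; the clinical task names and the allowed-transition set are invented for illustration:

```python
# A transition system defines the allowed order of clinical tasks;
# a stream of events derived from IT-system behaviour is checked against it.
ALLOWED = {
    ("ordered", "administered"),
    ("administered", "documented"),
}

def monitor(event_stream):
    """Yield an alert whenever a case takes a transition not in the model."""
    last_state = {}
    for case_id, event in event_stream:
        prev = last_state.get(case_id)
        if prev is not None and (prev, event) not in ALLOWED:
            yield f"case {case_id}: unexpected transition {prev} -> {event}"
        last_state[case_id] = event

stream = [("p1", "ordered"), ("p1", "documented"), ("p2", "ordered"),
          ("p2", "administered"), ("p2", "documented")]
for alert in monitor(stream):
    print(alert)  # flags p1: medication documented but never administered
```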

  6. Antecedents and Consequences of Consumer's Response to Health Information Complexity

    DEFF Research Database (Denmark)

    Hansen, Torben; Uth Thomsen, Thyra; Beckmann, Suzanne C.

    2013-01-01

    This study develops and empirically tests a model for understanding food consumers' health information seeking behaviour. Data were collected from 504 food consumers using a nationally representative consumer panel. The obtained Lisrel results suggest that consumers' product-specific health information seeking is positively affected by general food involvement and by the usability of product-specific health information. Moreover, product-specific health information seeking and product-specific health information complexity are both positively related to post-purchase health-related dissonance. This link between information complexity and post-purchase dissonance has implications for marketers of food products, since our results suggest that consumers might avoid purchasing the same food item again if post-purchase dissonance is experienced.

  7. Towards a reverse Newman’s theorem in interactive information complexity

    Czech Academy of Sciences Publication Activity Database

    Brody, J.; Buhrman, H.; Koucký, Michal; Loff, B.; Speelman, F.; Vereshchagin, N.K.

    2016-01-01

    Roč. 76, č. 3 (2016), s. 749-781 ISSN 0178-4617 R&D Projects: GA AV ČR IAA100190902 Institutional support: RVO:67985840 Keywords : communication complexity * information complexity * information theory Subject RIV: BA - General Mathematics Impact factor: 0.735, year: 2016 http://link.springer.com/article/10.1007%2Fs00453-015-0112-9

  9. Emotional Picture and Word Processing: An fMRI Study on Effects of Stimulus Complexity

    Science.gov (United States)

    Schlochtermeier, Lorna H.; Kuchinke, Lars; Pehrs, Corinna; Urton, Karolina; Kappelhoff, Hermann; Jacobs, Arthur M.

    2013-01-01

    Neuroscientific investigations regarding aspects of emotional experiences usually focus on one stimulus modality (e.g., pictorial or verbal). Similarities and differences in the processing between the different modalities have rarely been studied directly. The comparison of verbal and pictorial emotional stimuli often reveals a processing advantage of emotional pictures in terms of larger or more pronounced emotion effects evoked by pictorial stimuli. In this study, we examined whether this picture advantage refers to general processing differences or whether it might partly be attributed to differences in visual complexity between pictures and words. We first developed a new stimulus database comprising valence and arousal ratings for more than 200 concrete objects representable in different modalities including different levels of complexity: words, phrases, pictograms, and photographs. Using fMRI we then studied the neural correlates of the processing of these emotional stimuli in a valence judgment task, in which the stimulus material was controlled for differences in emotional arousal. No superiority for the pictorial stimuli was found in terms of emotional information processing with differences between modalities being revealed mainly in perceptual processing regions. While visual complexity might partly account for previously found differences in emotional stimulus processing, the main existing processing differences are probably due to enhanced processing in modality specific perceptual regions. We would suggest that both pictures and words elicit emotional responses with no general superiority for either stimulus modality, while emotional responses to pictures are modulated by perceptual stimulus features, such as picture complexity. PMID:23409009

  10. Toward understanding the thermodynamics of TALSPEAK process. Medium effects on actinide complexation

    International Nuclear Information System (INIS)

    Zalupski, Peter R.; Martin, Leigh R.; Nash, Ken; Nakamura, Yoshinobu; Yamamoto, Masahiko

    2009-01-01

    The ingenious combination of lactate and diethylenetriamine-N,N,N′,N″,N″-pentaacetic acid (DTPA) as an aqueous actinide-complexing medium forms the basis of the successful separation of americium and curium from lanthanides known as the TALSPEAK process. While numerous reports in the prior literature have focused on the optimization of this solvent extraction system, considerably less attention has been devoted to understanding the basic thermodynamic features of the complex fluids responsible for the separation. The available thermochemical information on both lactate and DTPA protonation and metal complexation reactions is representative of the behavior of these ions under idealized conditions. Our previous studies of medium effects on lactate protonation suggest that significant departures from the speciation predicted from the reported thermodynamic values should be expected in the TALSPEAK aqueous environment. Thermodynamic parameters describing the separation chemistry of this process thus require further examination at conditions significantly removed from the conventional ideal systems commonly employed in fundamental solution chemistry. Such thermodynamic characterization is the key to predictive modelling of TALSPEAK. Improved understanding will, in principle, allow process technologists to respond more efficiently to off-normal conditions during large-scale process operation. In this report, the results of calorimetric and potentiometric investigations of the effects of aqueous electrolytes on the thermodynamic parameters for lactate protonation and lactate complexation of americium and neodymium will be presented. Studies on the lactate protonation equilibrium will clearly illustrate distinct thermodynamic differences between strong-electrolyte aqueous systems and the buffered lactate environment.

  11. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  12. Phenylketonuria and Complex Spatial Visualization: An Analysis of Information Processing.

    Science.gov (United States)

    Brunner, Robert L.; And Others

    1987-01-01

    The study of the ability of 16 early treated phenylketonuric (PKU) patients (ages 6-23 years) to solve complex spatial problems suggested that choice of problem-solving strategy, attention span, and accuracy of mental representation may be affected in PKU patients, despite efforts to maintain well-controlled phenylalanine concentrations in the…

  13. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable by manual calibration; thus an automated approach is a must. We will discuss an information-theory-based metric for the evaluation of an algorithm's adaptive characteristics (an "adaptivity criterion"), using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of physical "information restoration" rather than perceived image quality, it helps to reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve a better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).
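
    The paper's exact criterion is not reproduced here, but the flavor of an information-restoration metric can be sketched with mutual information between a reference image and a filtered one, estimated from the joint intensity histogram (a stand-in metric, not the authors' adaptivity criterion):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """MI between two images from their joint intensity histogram. As a
    stand-in 'information restoration' score, MI(reference, filtered)
    should rise as a filter recovers signal rather than inventing detail."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over rows
    py = pxy.sum(axis=0, keepdims=True)   # marginal over columns
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
clean = rng.normal(size=(128, 128)).cumsum(axis=1)   # toy 'scene'
noisy = clean + rng.normal(scale=2.0, size=clean.shape)
print(mutual_information(clean, noisy))  # lower than MI(clean, denoised)
```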

  14. Visual perception of complex shape-transforming processes.

    Science.gov (United States)

    Schmidt, Filipp; Fleming, Roland W

    2016-11-01

    Morphogenesis, or the origin of complex natural form, has long fascinated researchers from practically every branch of science. However, we know practically nothing about how we perceive and understand such processes. Here, we measured how observers visually infer shape-transforming processes. Participants viewed pairs of objects ('before' and 'after' a transformation) and identified points that corresponded across the transformation. This allowed us to map out in spatial detail how perceived shape and space were affected by the transformations. Participants' responses were strikingly accurate and mutually consistent for a wide range of non-rigid transformations including complex growth-like processes. A zero-free-parameter model based on matching and interpolating/extrapolating the positions of high-salience contour features predicts the data surprisingly well, suggesting observers infer spatial correspondences relative to key landmarks. Together, our findings reveal the operation of specific perceptual organization processes that make us remarkably adept at identifying correspondences across complex shape-transforming processes by using salient object features. We suggest that these abilities, which allow us to parse and interpret the causally significant features of shapes, are invaluable for many tasks that involve 'making sense' of shape.

  15. Selected Topics on Managing Complexity and Information Systems Engineering: Editorial Introduction to Issue 8 of CSIMQ

    Directory of Open Access Journals (Sweden)

    Peter Forbrig

    2016-10-01

    Full Text Available Business process models greatly contribute to analyze and understand the activities of enterprises. However, it is still a challenge to cope with the complexity of systems specifications and their requirements. This issue of the journal of Complex Systems Informatics and Modeling (CSIMQ presents papers that discuss topics on managing complexity and information systems engineering. The papers are extended versions of selected papers from the workshop on Continuous Requirements Engineering held at the requirements engineering conference REFSQ 2016 in Gothenburg, the workshop on Managed Complexity held at the business informatics conference BIR 2016 in Prague, and the CAiSE 2016 Forum held in Ljubljana.

  16. Information in general medical practices: the information processing model.

    Science.gov (United States)

    Crowe, Sarah; Tully, Mary P; Cantrill, Judith A

    2010-04-01

    The need for effective communication and handling of secondary care information in general practices is paramount. To explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of practice staff both clinical and administrative. Qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using N-Vivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.

  17. Improving Treatment Response for Paediatric Anxiety Disorders: An Information-Processing Perspective.

    Science.gov (United States)

    Ege, Sarah; Reinholdt-Dunne, Marie Louise

    2016-12-01

    Cognitive behavioural therapy (CBT) is considered the treatment of choice for paediatric anxiety disorders, yet there remains substantial room for improvement in treatment outcomes. This paper examines whether theory and research into the role of information-processing in the underlying psychopathology of paediatric anxiety disorders indicate possibilities for improving treatment response. Using a critical review of recent theoretical, empirical and academic literature, the paper examines the role of information-processing biases in paediatric anxiety disorders, the extent to which CBT targets information-processing biases, and possibilities for improving treatment response. The literature reviewed indicates a role for attentional and interpretational biases in anxious psychopathology. While there is theoretical grounding and limited empirical evidence to indicate that CBT ameliorates interpretational biases, evidence regarding the effects of CBT on attentional biases is mixed. Novel treatment methods including attention bias modification training, attention feedback awareness and control training, and mindfulness-based therapy may hold potential in targeting attentional biases, and thereby in improving treatment response. The integration of novel interventions into an existing evidence-based protocol is a complex issue and faces important challenges with regard to determining the optimal treatment package. Novel interventions targeting information-processing biases may hold potential in improving response to CBT for paediatric anxiety disorders. Many important questions remain to be answered.

  18. The capitalization of the accounting information in the process of stocks analyse

    OpenAIRE

    ciumag, anca

    2009-01-01

    The importance of information is justified by the fact that the detailed aspects it contains can lead to significant economies and to the fusion of stocking operations and procedures when it is used in decision-making. The most complex structure offered to economic analysts as a data basis is accounting, whose capacity to cover economic phenomena and processes, as well as the patrimony, in analytical and synthetic information, ...

  19. When you talk about "Information processing" what actually do you have in mind?

    OpenAIRE

    Diamant, Emanuel

    2012-01-01

    "Information Processing" is a recently launched buzzword whose meaning is vague and obscure even for the majority of its users. The reason for this is the lack of a suitable definition for the term "information". In my attempt to amend this bizarre situation, I have realized that, following the insights of Kolmogorov's Complexity theory, information can be defined as a description of structures observable in a given data set. Two types of structures could be easily distinguished in every data...

  20. Patterns of patient safety culture: a complexity and arts-informed project of knowledge translation.

    Science.gov (United States)

    Mitchell, Gail J; Tregunno, Deborah; Gray, Julia; Ginsberg, Liane

    2011-01-01

    The purpose of this paper is to describe patterns of patient safety culture that emerged from an innovative collaboration among health services researchers and fine arts colleagues. The group engaged in an arts-informed knowledge translation project to produce a dramatic expression of patient safety culture research for inclusion in a symposium. Scholars have called for a deeper understanding of the complex interrelationships among structure, process and outcomes relating to patient safety. Four patterns of patient safety culture--blinding familiarity, unyielding determination, illusion of control and dismissive urgency--are described with respect to how they informed creation of an arts-informed project for knowledge translation.

  1. Musical beauty and information compression: Complex to the ear but simple to the mind?

    Directory of Open Access Journals (Sweden)

    Hudson Nicholas J

    2011-01-01

    Full Text Available Abstract Background The biological origin of music, its universal appeal across human cultures and the cause of its beauty remain mysteries. For example, why is Ludwig Van Beethoven considered a musical genius but Kylie Minogue is not? Possible answers to these questions will be framed in the context of Information Theory. Presentation of the Hypothesis The entire life-long sensory data stream of a human is enormous. The adaptive solution to this problem of scale is information compression, thought to have evolved to better handle, interpret and store sensory data. In modern humans highly sophisticated information compression is clearly manifest in philosophical, mathematical and scientific insights. For example, the Laws of Physics explain apparently complex observations with simple rules. Deep cognitive insights are reported as intrinsically satisfying, implying that at some point in evolution, the practice of successful information compression became linked to the physiological reward system. I hypothesise that the establishment of this "compression and pleasure" connection paved the way for musical appreciation, which subsequently became free (perhaps even inevitable) to emerge once audio compression had become intrinsically pleasurable in its own right. Testing the Hypothesis For a range of compositions, empirically determine the relationship between the listener's pleasure and "lossless" audio compression. I hypothesise that enduring musical masterpieces will possess an interesting objective property: despite apparent complexity, they will also exhibit high compressibility. Implications of the Hypothesis Artistic masterpieces and deep Scientific insights share the common process of data compression. Musical appreciation is a parasite on a much deeper information processing capacity. The coalescence of mathematical and musical talent in exceptional individuals has a parsimonious explanation. Musical geniuses are skilled in composing music

  2. Infochemistry Information Processing at the Nanoscale

    CERN Document Server

    Szacilowski, Konrad

    2012-01-01

    Infochemistry: Information Processing at the Nanoscale, defines a new field of science, and describes the processes, systems and devices at the interface between chemistry and information sciences. The book is devoted to the application of molecular species and nanostructures to advanced information processing. It includes the design and synthesis of suitable materials and nanostructures, their characterization, and finally applications of molecular species and nanostructures for information storage and processing purposes. Divided into twelve chapters; the first three chapters serve as an int

  3. Tidal Analysis Using Time–Frequency Signal Processing and Information Clustering

    Directory of Open Access Journals (Sweden)

    Antonio M. Lopes

    2017-07-01

    Full Text Available Geophysical time series have a complex nature that poses challenges to reaching assertive conclusions, and require advanced mathematical and computational tools to unravel embedded information. In this paper, time–frequency methods and hierarchical clustering (HC techniques are combined for processing and visualizing tidal information. In a first phase, the raw data are pre-processed for estimating missing values and obtaining dimensionless reliable time series. In a second phase, the Jensen–Shannon divergence is adopted for measuring dissimilarities between data collected at several stations. The signals are compared in the frequency and time–frequency domains, and the HC is applied to visualize hidden relationships. In a third phase, the long-range behavior of tides is studied by means of power law functions. Numerical examples demonstrate the effectiveness of the approach when dealing with a large volume of real-world data.
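
    A hedged sketch of the second phase, pairwise Jensen-Shannon distances between station spectra followed by hierarchical clustering (the Dirichlet-sampled "spectra" below are placeholders for the real pre-processed tidal series):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import jensenshannon

# Hypothetical normalized spectral distributions for four tide-gauge stations
rng = np.random.default_rng(0)
spectra = rng.dirichlet(np.ones(128), size=4)  # each row sums to 1

# Pairwise Jensen-Shannon distances in condensed (pdist-style) order
n = len(spectra)
dist = np.array([jensenshannon(spectra[i], spectra[j])
                 for i in range(n) for j in range(i + 1, n)])

tree = linkage(dist, method="average")  # hierarchical clustering of stations
print(tree)  # pass to scipy.cluster.hierarchy.dendrogram to visualize groupings
```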

  4. Gradation of complexity and predictability of hydrological processes

    Science.gov (United States)

    Sang, Yan-Fang; Singh, Vijay P.; Wen, Jun; Liu, Changming

    2015-06-01

    Quantification of the complexity and predictability of hydrological systems is important for evaluating the impact of climate change on hydrological processes, and for guiding water activities. In the literature, the focus seems to have been on describing the complexity of spatiotemporal distribution of hydrological variables, but little attention has been paid to the study of complexity gradation, because the degree of absolute complexity of hydrological systems cannot be objectively evaluated. Here we show that complexity and predictability of hydrological processes can be graded into three ranks (low, middle, and high). The gradation is based on the difference in the energy distribution of hydrological series and that of white noise under multitemporal scales. It reflects different energy concentration levels and contents of deterministic components of the hydrological series in the three ranks. Higher energy concentration level reflects lower complexity and higher predictability, but scattered energy distribution being similar to white noise has the highest complexity and is almost unpredictable. We conclude that the three ranks (low, middle, and high) approximately correspond to deterministic, stochastic, and random hydrological systems, respectively. The result of complexity gradation can guide hydrological observations and modeling, and identification of similarity patterns among different hydrological systems.
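
    Under the stated idea (grade complexity by how far the series' multi-scale energy distribution departs from the flat distribution of white noise), a minimal illustrative sketch could look as follows; the band definitions and the gap measure are assumptions, not the authors' exact procedure:

```python
import numpy as np

def energy_distribution(x, n_bands=6):
    """Relative spectral energy per equal-width frequency band, a proxy for
    the multi-temporal-scale energy distribution used in the gradation."""
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
    e = np.array([b.sum() for b in np.array_split(spec[1:], n_bands)])
    return e / e.sum()

def concentration_gap(x, n_bands=6):
    """Distance from the flat (white-noise) distribution: a larger gap means
    more concentrated energy, hence lower complexity and higher predictability."""
    e = energy_distribution(x, n_bands)
    return float(np.abs(e - 1.0 / n_bands).sum())

rng = np.random.default_rng(1)
print(concentration_gap(rng.standard_normal(4096)))                 # ~0: random
print(concentration_gap(np.sin(np.linspace(0, 60 * np.pi, 4096))))  # large: deterministic
```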

  5. Information Center Complex publications and presentations, 1971-1980

    International Nuclear Information System (INIS)

    Gill, A.B.; Hawthorne, S.W.

    1981-08-01

    This indexed bibliography lists publications and presentations of the Information Center Complex, Information Division, Oak Ridge National Laboratory, from 1971 through 1980. The 659 entries cover such topics as toxicology, air and water pollution, management and transportation of hazardous wastes, energy resources and conservation, and information science. Publications range in length from 1 page to 3502 pages and include topical reports, books, journal articles, fact sheets, and newsletters. Author, title, and group indexes are provided. Annual updates are planned

  7. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    Directory of Open Access Journals (Sweden)

    O. V. Fedorova

    2018-01-01

    Full Text Available The article is devoted to the application of the SADT and ARIS methodologies for the modeling and management of business processes of information systems. The relevance of this article is beyond doubt, because the design of information system architectures, based on a thorough systems analysis of the subject area, is of paramount importance for the development of information systems in general. The authors conducted a thorough analysis of the application of the SADT and ARIS methodologies for modeling and managing the business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and application of the CASE tool) and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system; the second, the interaction of the business analyst and the customer. The SADT methodology underlies many modern methodologies for modeling business processes. Using the methodologies of the IDEF family, it is possible to efficiently display and analyze activity models of a wide range of complex information systems in various aspects. The CASE tool ARIS is a suite of tools for the analysis and modeling of an organization's activities. The methodological basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling the business processes of information systems. In addition, the article can inform curricula for students in information and management specialities, supporting an update of the content and structure of courses on modeling the architecture of information systems and on organization management using models.

  8. Quantum information processing

    National Research Council Canada - National Science Library

    Leuchs, Gerd; Beth, Thomas

    2003-01-01

    [Table of contents fragment] 1.5 Simulation of Hamiltonians; References; 2. Quantum Information Processing and Error Correction with Jump Codes (G. Alber, M. Mussinger, ...)

  9. Complex Systems and Dependability

    CERN Document Server

    Zamojski, Wojciech; Sugier, Jaroslaw

    2012-01-01

    A typical contemporary complex system is a multifaceted amalgamation of technical, information, organization, software and human (users, administrators and management) resources. The complexity of such a system comes not only from its involved technical and organizational structure but mainly from the complexity of the information processes that must be implemented in the operational environment (data processing, monitoring, management, etc.). In such cases, traditional methods of reliability analysis, focused mainly on the technical level, are usually insufficient for performance evaluation, and more innovative meth

  10. The process of urban regeneration in context of information society

    Directory of Open Access Journals (Sweden)

    Bazik Dragana

    2006-01-01

    Full Text Available This paper deals with the concept of innovating the urban regeneration process in the context of the transformations generated by information-communication technologies. On the one hand, Serbia has exceptional human potential: 13,000 graduates each year and a 42% share of the population speaking English, the largest among all Eastern and Central European countries. This forms a basis for formulating strategies for the development of the information society in Serbia, for economic adjustment based upon knowledge, and for tracing the way to a future knowledge society, i.e. eEurope 2020. On the other hand, we are witnessing the intensive development of huge complexes of mega- and hypermarkets as the currently dominant mode of regenerating our city spaces. At the same time, experience from other locations points to the deterioration of cities' urban identity as a consequence of the infiltration of global capital and of the development within the urban tissue of huge complexes of multi-national companies. Aiming to overcome the mistakes portrayed by international experience, as well as potential oversights that may occur because of routine and mismatch between certain phases of the sustainable development process, this paper emphasizes the importance of an integral evaluation of information society development trends and the spatial aspects of urban regeneration. It is essential to adjust devastated urban spaces, as artifacts of one technological era, to the actual information era, with an eye to the future digital knowledge era, i.e. to plan, design and develop according to new technological requirements and possibilities, for new working places and a new quality of living.

  11. [Neurophysiological investigations of information processing in the somato-sensory system].

    Science.gov (United States)

    Kunesch, E

    2009-08-01

    The ability of the human hand to perform complex sensorimotor tasks such as tactile exploration and grasping is based on (1) exact encoding of somatosensory information by cutaneous mechanoreceptors, (2) elaborate processing of afferent signals in somatosensory relay stations and cortical fields, (3) rapid and effective interaction of sensory feedback with motor programs, and (4) different modes of sensory control that can be switched between. (c) Georg Thieme Verlag KG Stuttgart-New York.

  12. PREFACE: Quantum information processing

    Science.gov (United States)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  13. Process Knowledge Summary Report for Materials and Fuels Complex Contact-Handled Transuranic Debris Waste

    Energy Technology Data Exchange (ETDEWEB)

    R. P. Grant; P. J. Crane; S. Butler; M. A. Henry

    2010-02-01

    This Process Knowledge Summary Report summarizes the information collected to satisfy the transportation and waste acceptance requirements for the transfer of transuranic (TRU) waste between the Materials and Fuels Complex (MFC) and the Advanced Mixed Waste Treatment Project (AMWTP). The information collected includes documentation that addresses the requirements for AMWTP and the applicable portion of their Resource Conservation and Recovery Act permits for receipt and treatment of TRU debris waste in AMWTP. This report has been prepared for contact-handled TRU debris waste generated by the Idaho National Laboratory at MFC. The TRU debris waste will be shipped to AMWTP for purposes of supercompaction. This Process Knowledge Summary Report includes information regarding, but not limited to, the generation process, the physical form, radiological characteristics, and chemical contaminants of the TRU debris waste, prohibited items, and packaging configuration. This report, along with the referenced supporting documents, will create a defensible and auditable record for waste originating from MFC.

  14. A trade-off between local and distributed information processing associated with remote episodic versus semantic memory.

    Science.gov (United States)

    Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R

    2014-01-01

    Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provide a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.
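
    The local/distributed contrast can be made concrete with a toy computation. The sketch below is illustrative only, using synthetic signals and plain histogram entropy in place of the multiscale measures used in the study: per-channel entropy stands in for "local" complexity, and the entropy of the joint sign-pattern across channels stands in for "distributed" complexity.

```python
import numpy as np

def shannon_entropy(samples, bins=32):
    """Histogram-based Shannon entropy (bits) of a 1-D sample."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
meg = rng.standard_normal((5, 10_000))          # 5 hypothetical channels

# "Local" entropy: average complexity of each channel taken on its own.
local = np.mean([shannon_entropy(ch) for ch in meg])

# "Distributed" entropy: complexity of the joint sign-pattern across channels,
# a crude proxy for information carried by inter-regional coordination.
patterns = (meg > 0).astype(np.uint8)           # binarize each sample
codes = np.packbits(patterns.T, axis=1, bitorder='little')[:, 0]
distributed = shannon_entropy(codes, bins=2 ** meg.shape[0])

print(f"local: {local:.2f} bits, distributed: {distributed:.2f} bits")
```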

  15. Social Information Processing in Deaf Adolescents

    Science.gov (United States)

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2016-01-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…

  16. Links between attachment and social information processing: examination of intergenerational processes.

    Science.gov (United States)

    Dykas, Matthew J; Ehrlich, Katherine B; Cassidy, Jude

    2011-01-01

    This chapter describes theory and research on intergenerational connections between parents' attachment and children's social information processing, as well as between parents' social information processing and children's attachment. The chapter begins with a discussion of attachment theorists' early insights into the role that social information processing plays in attachment processes. Next, current theory about the mechanisms through which cross-generational links between attachment and social information processing might emerge is presented. The central proposition is that the quality of attachment and/or the social information processing of the parent contributes to the quality of attachment and/or social information processing in the child, and these links emerge through mediating processes related to social learning, open communication, gate-keeping, emotion regulation, and joint attention. A comprehensive review of the literature is then presented. The chapter ends with the presentation of a current theoretical perspective and suggestions for future empirical and clinical endeavors.

  17. Can Intrinsic Fluctuations Increase Efficiency in Neural Information Processing?

    Science.gov (United States)

    Liljenström, Hans

    2003-05-01

    All natural processes are accompanied by fluctuations, characterized as noise or chaos. Biological systems, which have evolved over billions of years, are likely to have adapted not only to cope with such fluctuations, but also to make use of them. We investigate how the complex dynamics of the brain, including oscillations, chaos and noise, can affect the efficiency of neural information processing. In particular, we consider the amplification and functional role of internal fluctuations. Using computer simulations of a neural network model of the olfactory cortex and hippocampus, we demonstrate how microscopic fluctuations can result in global effects at the network level. We show that the rate of information processing in associative memory tasks can be maximized at optimal noise levels, analogous to stochastic resonance phenomena. Noise can also induce transitions between different dynamical states, which could be of significance for learning and memory. A chaotic-like behavior, induced by noise or by an increase in neuronal excitability, can enhance system performance if it is transient and converges to a limit-cycle memory state. We speculate that this dynamical behavior could perhaps be related to (creative) thinking.
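
    The noise optimum described here is the classic stochastic resonance effect, which is easy to reproduce with a threshold detector. A minimal sketch, assuming a sub-threshold sinusoid plus Gaussian noise (all parameters invented, not taken from the cortical model in the abstract):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 5000)
signal = 0.4 * np.sin(2 * np.pi * 1.0 * t)     # sub-threshold input
threshold = 1.0

def detector_output(noise_std):
    """Threshold 'neuron': emits 1 whenever signal + noise crosses the threshold."""
    noise = rng.normal(0.0, noise_std, size=t.shape)
    return (signal + noise > threshold).astype(float)

for noise_std in (0.1, 0.3, 0.5, 1.0, 2.0):
    out = detector_output(noise_std)
    # Correlation between the spike train and the hidden signal; zero if silent.
    corr = 0.0 if out.std() == 0 else float(np.corrcoef(out, signal)[0, 1])
    print(f"noise std {noise_std:4.1f} -> output/signal correlation {corr:.3f}")
```

    At intermediate noise levels the sub-threshold signal is pushed over the threshold mostly near its peaks, so the output/signal correlation first rises with noise and then falls again as noise swamps the signal.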

  18. Knowledge-based inspection:modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    Increased levels of complexity in almost every discipline and operation today raise the demand for knowledge needed to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems, rich in data structures and complex algorithms, continues to hinder the ability to turn concepts into operations swiftly. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is, ever more, a matter of great importance to world safety and security. Demand for knowledge about nuclear processes and the verification activities used to offset potential misuse of nuclear technology will intensify with the growth of the technology itself. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections; the work also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles; this fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. Through the use of the graphical and mathematical constructs of the theory, process structures are formalized, enhancing

  19. Inclusive Education as Complex Process and Challenge for School System

    Directory of Open Access Journals (Sweden)

    Al-Khamisy Danuta

    2015-08-01

    Full Text Available Education may be considered as a number of processes, actions and effects affecting the human being; as the state or level of the results of these processes; or as the modification of the functions, roles of institutions and social practices which, as a result of inclusion, become a new, integrated system. It is thus a very complex process. Nowadays complexity appears to be one of the most significant terms both in science and in philosophy: despite the search for simple rules, strategies and solutions, everything turns out to be ever more complex. The environment is complex, as is the organism living in it and exploring it, and the exploration itself is a complex phenomenon, much more so than might initially seem to be the case.

  20. Interpreting complex data by methods of recognition and classification in an automated system of aerogeophysical material processing

    Energy Technology Data Exchange (ETDEWEB)

    Koval' , L.A.; Dolgov, S.V.; Liokumovich, G.B.; Ovcharenko, A.V.; Priyezzhev, I.I.

    1984-01-01

    The ASOM-AGS/YeS system for automated processing of aerogeophysical data is equipped with complex interpretation of multichannel measurements. Algorithms of factor analysis and automatic classification, together with an apparatus of a priori specified (selected) decision rules, are used. The areas of effect of these procedures can be initially limited by the specified geological information. The possibilities of the method are demonstrated by the results of automated processing of airborne gamma-spectrometric measurements in the region of a known porphyry copper occurrence in Kazakhstan. After principal-component processing, this ore occurrence was clearly marked by a composite halo of independent factors: U (sharp increase), Th (noticeable increase) and K (decrease).
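
    The "method of main components" is what is now called principal-component analysis. A minimal sketch of reducing a multichannel U/Th/K survey to factor scores and screening for anomalies (synthetic data; channel statistics and the flagging threshold are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic airborne gamma-spectrometry points: columns U, Th, K (hypothetical).
background = rng.normal([2.0, 8.0, 1.5], [0.3, 1.0, 0.2], size=(1000, 3))
anomaly = rng.normal([3.5, 9.5, 1.0], [0.3, 1.0, 0.2], size=(50, 3))  # U up, Th up, K down
data = np.vstack([background, anomaly])

# Principal components via SVD of the centred, standardized data matrix.
X = data - data.mean(axis=0)
X /= X.std(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt.T                        # factor scores per survey point

# Flag points that are extreme on the leading component as a candidate halo.
flagged = np.abs(scores[:, 0]) > 2.5
print(f"{flagged.sum()} of {len(data)} points flagged as anomalous")
print("component loadings (rows = PCs, cols = U, Th, K):\n", np.round(Vt, 2))
```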

  1. SBGNViz: A Tool for Visualization and Complexity Management of SBGN Process Description Maps.

    Directory of Open Access Journals (Sweden)

    Mecit Sari

    Full Text Available Information about cellular processes and pathways is becoming increasingly available in detailed, computable standard formats such as BioPAX and SBGN. Effective visualization of this information is a key recurring requirement for biological data analysis, especially for -omic data. Biological data analysis is rapidly migrating to web based platforms; thus there is a substantial need for sophisticated web based pathway viewers that support these platforms and other use cases. Towards this goal, we developed a web based viewer named SBGNViz for process description maps in SBGN (SBGN-PD). SBGNViz can visualize both BioPAX and SBGN formats. Unique features of SBGNViz include the ability to nest nodes to arbitrary depths to represent molecular complexes and cellular locations, automatic pathway layout, editing and highlighting facilities to enable focus on sub-maps, and the ability to inspect pathway members for detailed information from EntrezGene. SBGNViz can be used within a web browser without any installation and can be readily embedded into web pages. SBGNViz has two editions built with ActionScript and JavaScript. The JavaScript edition, which also works on touch-enabled devices, introduces novel methods for managing and reducing the complexity of large SBGN-PD maps for more effective analysis. SBGNViz fills an important gap by making the large and fast-growing corpus of rich pathway information accessible to web based platforms. SBGNViz can be used in a variety of contexts and in multiple scenarios ranging from visualization of the results of a single study in a web page to building data analysis platforms.

  2. Process information systems in nuclear reprocessing

    International Nuclear Information System (INIS)

    Jaeschke, A.; Keller, H.; Orth, H.

    1987-01-01

    On the production management level, a process information system in a nuclear reprocessing plant (NRP) has to fulfill conventional operating functions and functions for nuclear material surveillance (safeguards). Given today's state of the art of on-line process control technology, progress in hardware and software makes it possible to introduce more process-specific intelligence into process information systems. Taking an expert-system-aided laboratory management system, a component of an NRP process information system, as an example, the paper demonstrates that these technologies can already be applied. (DG)

  3. Technical Characteristics of the Process Information System - Nuclear Power Plant Krsko

    International Nuclear Information System (INIS)

    Mandic, D.; Smolej, M.

    1998-01-01

    The Process Information System (PIS) of the Krsko Nuclear Power Plant (NEK) is a newly installed, distributed and redundant process computer system built at NEK (Phase I: 1991-1995) to integrate the following main functions: signal data acquisition from the technological processes and the environment; implementation of the basic SCADA functions on the real-time process signal database; execution of complex plant-specific application programs; advanced MMI (Man-Machine Interface) features for users in the Main Control Room (MCR); transfer of process data to locations other than the MCR; and process data archiving with the capability to retrieve the data for later analysis. The PIS NEK architecture consists of three hierarchically interconnected hardware platforms: PIS Level 1, the DAS (Data Acquisition System) level; PIS Level 2, the level for MMI, application programs and process data archiving; and PIS Level 3, the level for distribution of process data to remote users of PIS data. (author)

  4. Altered Topology in Information Processing of a Narrated Story in Older Adults with Mild Cognitive Impairment.

    Science.gov (United States)

    Yogev-Seligmann, Galit; Oren, Noga; Ash, Elissa L; Hendler, Talma; Giladi, Nir; Lerner, Yulia

    2016-05-03

    The ability to store, integrate, and manipulate information declines with aging. These changes occur earlier, faster, and to a greater degree as a result of neurodegeneration. One of the most common and early characteristics of cognitive decline is difficulty with comprehension of information. The neural mechanisms underlying this breakdown of information processing are poorly understood. Using functional MRI and natural stimuli (e.g., stories), we mapped the neural mechanisms by which the human brain accumulates and processes information with increasing duration and complexity in participants with amnestic mild cognitive impairment (aMCI) and healthy older adults. To explore the mechanisms of information processing, we measured the reliability of brain responses elicited by listening to different versions of a narrated story created by segmenting the story into words, sentences, and paragraphs and then scrambling the segments. Comparing healthy older adults and participants with aMCI revealed that in both groups, all types of stimuli similarly recruited primary auditory areas. However, prominent differences between groups were found at the level of processing long and complex stimuli. In healthy older adults, parietal and frontal regions demonstrated highly synchronized responses in both the paragraph and full story conditions, as has been previously reported in young adults. Participants with aMCI, however, exhibited a robust functional shift of long time scale processing to the pre- and post-central sulci. Our results suggest that participants with aMCI experienced a functional shift of higher order auditory information processing, possibly reflecting a functional response to concurrent or impending neuronal or synaptic loss. This observation might assist in understanding mechanisms of cognitive decline in aMCI.

  5. Combining complexity measures of EEG data: multiplying measures reveal previously hidden information.

    Science.gov (United States)

    Burns, Thomas; Rajan, Ramesh

    2015-01-01

    Many studies have noted significant differences among human electroencephalograph (EEG) results when participants or patients are exposed to different stimuli, undertake different tasks, or are affected by conditions such as epilepsy or Alzheimer's disease. Such studies often use only one or two measures of complexity and rarely justify their choice of measure beyond the fact that it has been used in previous studies. If more measures were added to such studies, however, more complete information might be found about the reported differences: information that could be useful in confirming the existence or extent of such differences, or in understanding their physiological bases. In this study we analysed publicly available EEG data using a range of complexity measures to determine how well the measures correlated with one another. The complexity measures did not all significantly correlate, suggesting that different measures capture unique features of the EEG signals and thus reveal information which other measures are unable to detect. The results of this analysis therefore suggest that combinations of complexity measures reveal information beyond what any single measure captures. For this reason, researchers using individual complexity measures on EEG data should consider using combinations of measures to account more completely for any differences they observe and to ensure the robustness of any relationships identified.
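
    As an illustration of comparing complexity measures, the sketch below computes two standard ones, Lempel-Ziv (LZ76) complexity and spectral entropy, on synthetic traces and correlates them; the particular measures analysed in the study may differ.

```python
import numpy as np

def lz_complexity(bits):
    """Lempel-Ziv (LZ76) phrase count of a binary sequence (Kaspar & Schuster)."""
    s = ''.join('1' if b else '0' for b in bits)
    n = len(s)
    i, k, l, k_max, c = 0, 1, 1, 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:                  # no longer prefix match possible
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def spectral_entropy(x):
    """Shannon entropy (bits) of the normalized power spectrum."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(3)
# 20 synthetic "EEG" traces mixing an alpha-like rhythm with increasing noise.
t = np.arange(2000) / 250.0
signals = [np.sin(2 * np.pi * 10 * t) + a * rng.standard_normal(t.size)
           for a in np.linspace(0.1, 2.0, 20)]

lz = [lz_complexity(x > np.median(x)) for x in signals]
se = [spectral_entropy(x) for x in signals]
print("correlation between the two measures:", np.round(np.corrcoef(lz, se)[0, 1], 3))
```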

  6. A Cognition-based View of Decision Processes in Complex Social-Ecological Systems

    Directory of Open Access Journals (Sweden)

    Kathi K. Beratan

    2007-06-01

    Full Text Available This synthesis paper is intended to provide an overview of individual and collective decision-making processes that might serve as a theoretical foundation for a complexity-based approach to environmental policy design and natural resource management planning. Human activities are the primary drivers of change in the Earth's biosphere today, so efforts to shift the trajectory of social-ecological systems must focus on changes in individual and collective human behavior. Recent advances in understanding the biological basis of thought and memory offer insights of use in designing management and planning processes. The human brain has evolved ways of dealing with complexity and uncertainty, and is particularly attuned to social information. Changes in an individual's schemas, reflecting changes in the patterns of neural connections that are activated by particular stimuli, occur primarily through nonconscious processes in response to experiential learning during repeated exposure to novel situations, ideas, and relationships. Discourse is an important mechanism for schema modification, and thus for behavior change. Through discourse, groups of people construct a shared story - a collective model - that is useful for predicting likely outcomes of actions and events. In effect, good stories are models that filter and organize distributed knowledge about complex situations and relationships in ways that are readily absorbed by human cognitive processes. The importance of discourse supports the view that collaborative approaches are needed to effectively deal with environmental problems and natural resource management challenges. Methods derived from the field of mediation and dispute resolution can help us take advantage of the distinctly human ability to deal with complexity and uncertainty. This cognitive view of decision making supports fundamental elements of resilience management and adaptive co-management, including fostering social learning

  7. Process-aware information systems : lessons to be learned from process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Jensen, K.; Aalst, van der W.M.P.

    2009-01-01

    A Process-Aware Information System (PAIS) is a software system that manages and executes operational processes involving people, applications, and/or information sources on the basis of process models. Example PAISs are workflow management systems, case-handling systems, and enterprise information systems.

  8. Information structure and reference tracking in complex sentences

    CERN Document Server

    Gijn, Rik van; Matic, Dejan

    2014-01-01

    This paper discusses argument marking and reference tracking in Mekens complex clauses and their correlation to information structure. The distribution of pronominal arguments in Mekens simple clauses follows an absolutive pattern with main verbs. Complex clauses maintain the morphological absolutive argument marking, but show a nominative pattern with respect to argument reference tracking, since transitive and intransitive subjects function as syntactic pivots. The language extends the use of argument-marking verb morphology to control the reference of discourse participants across clauses.

  9. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  10. Using evaluation to adapt health information outreach to the complex environments of community-based organizations.

    Science.gov (United States)

    Olney, Cynthia A

    2005-10-01

    After arguing that most community-based organizations (CBOs) function as complex adaptive systems, this white paper describes the evaluation goals, questions, indicators, and methods most important at different stages of community-based health information outreach. This paper presents the basic characteristics of complex adaptive systems and argues that the typical CBO can be considered this type of system. It then presents evaluation as a tool for helping outreach teams adapt their outreach efforts to the CBO environment and thus maximize success. Finally, it describes the goals, questions, indicators, and methods most important or helpful at each stage of evaluation (community assessment, needs assessment and planning, process evaluation, and outcomes assessment). Literature from complex adaptive systems as applied to health care, business, and evaluation settings is presented. Evaluation models and applications, particularly those based on participatory approaches, are presented as methods for maximizing the effectiveness of evaluation in dynamic CBO environments. If one accepts that CBOs function as complex adaptive systems, characterized by dynamic relationships among many agents, influences, and forces, then effective evaluation at the stages of community assessment, needs assessment and planning, process evaluation, and outcomes assessment is critical to outreach success.

  11. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  12. Bridging the Operational Divide: An Information-Processing Model of Internal Supply Chain Integration

    Science.gov (United States)

    Rosado Feger, Ana L.

    2009-01-01

    Supply Chain Management, the coordination of upstream and downstream flows of product, services, finances, and information from a source to a customer, has risen in prominence over the past fifteen years. The delivery of a product to the consumer is a complex process requiring action from several independent entities. An individual firm consists…

  13. Atmospheric processes over complex terrain

    Science.gov (United States)

    Banta, Robert M.; Berri, G.; Blumen, William; Carruthers, David J.; Dalu, G. A.; Durran, Dale R.; Egger, Joseph; Garratt, J. R.; Hanna, Steven R.; Hunt, J. C. R.

    1990-06-01

    A workshop on atmospheric processes over complex terrain, sponsored by the American Meteorological Society, was convened in Park City, Utah from 24 to 28 October 1988. The overall objective of the workshop was one of interaction and synthesis: interaction among atmospheric scientists carrying out research on a variety of orographic flow problems, and a synthesis of their results and points of view into an assessment of the current status of topical research problems. The final day of the workshop was devoted to an open discussion of the research directions that could be anticipated in the next decade because of new and planned instrumentation and observational networks, the recent emphasis on the development of mesoscale numerical models, and continuing theoretical investigations of thermally forced flows, orographic waves, and stratified turbulence. This monograph represents an outgrowth of the Park City Workshop; the authors have contributed chapters based on their lecture material. Workshop discussions indicated interest in both the remote sensing and the predictability of orographic flows, and chapters on these topics were solicited following the workshop in order to provide a more balanced view of current progress and future directions in research on atmospheric processes over complex terrain.

  14. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  15. BRICS and Quantum Information Processing

    DEFF Research Database (Denmark)

    Schmidt, Erik Meineche

    1998-01-01

    BRICS is a research centre and international PhD school in theoretical computer science, based at the University of Aarhus, Denmark. The centre has recently become engaged in quantum information processing in cooperation with the Department of Physics, also University of Aarhus. This extended abstract surveys activities at BRICS with special emphasis on the activities in quantum information processing.

  16. Complexity in Evolutionary Processes

    International Nuclear Information System (INIS)

    Schuster, P.

    2010-01-01

    Darwin's principle of evolution by natural selection is readily cast into a mathematical formalism. Molecular biology revealed the mechanism of mutation and provides the basis for a kinetic theory of evolution that models correct reproduction and mutation as parallel chemical reaction channels. A result of the kinetic theory is the existence of a phase transition in evolution occurring at a critical mutation rate, which represents a localization threshold for the population in sequence space. The occurrence and nature of such phase transitions depend critically on fitness landscapes. The fitness landscape, tantamount to a mapping from sequence or genotype space into phenotype space, is identified as the true source of complexity in evolution. Modeling evolution as a stochastic process is discussed, and neutrality with respect to selection is shown to provide a major challenge for understanding evolutionary processes. (author)
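
    The localization threshold can be demonstrated with a small stochastic simulation of selection and mutation on a single-peak fitness landscape. The sketch below is a toy version (population size, sequence length and the tenfold fitness advantage are arbitrary choices, not taken from the paper); for a single peak with advantage sigma, the population is expected to stay localized at the master sequence only for per-site mutation rates below roughly ln(sigma)/L, here about 0.11.

```python
import numpy as np

rng = np.random.default_rng(4)
L, POP, GENS = 20, 500, 200
master = np.zeros(L, dtype=int)

def master_frequency(mu):
    """Track the master sequence under replication (10x advantage) and mutation."""
    pop = np.tile(master, (POP, 1))          # start localized at the fitness peak
    for _ in range(GENS):
        fit = np.where((pop == master).all(axis=1), 10.0, 1.0)
        parents = rng.choice(POP, size=POP, p=fit / fit.sum())
        pop = pop[parents]                   # replicate proportionally to fitness
        flips = rng.random(pop.shape) < mu   # per-site mutation
        pop = np.where(flips, 1 - pop, pop)
    return (pop == master).all(axis=1).mean()

for mu in (0.02, 0.08, 0.12, 0.16):
    print(f"per-site mutation rate {mu:.2f}: master frequency {master_frequency(mu):.2f}")
```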

  17. Simulation and Analysis of Complex Biological Processes: an Organisation Modelling Perspective

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2005-01-01

    This paper explores how the dynamics of complex biological processes can be modelled and simulated as an organisation of multiple agents. This modelling perspective identifies organisational structure occurring in complex decentralised processes and handles the complexity of analysing their dynamics.

  18. RNACompress: Grammar-based compression and informational complexity measurement of RNA secondary structure

    Directory of Open Access Journals (Sweden)

    Chen Chun

    2008-03-01

    Full Text Available Abstract Background With the rapid emergence of RNA databases and newly identified non-coding RNAs, an efficient compression algorithm for RNA sequence and structural information is needed for the storage and analysis of such data. Although several algorithms for compressing DNA sequences have been proposed, none of them is suitable for compressing RNA sequences together with their secondary structures. This kind of compression not only facilitates the maintenance of RNA data, but also supplies a novel way to measure the informational complexity of RNA structural data, raising the possibility of studying the relationship between the functional activities of RNA structures and their complexities, as well as various structural properties of RNA, on the basis of compression. Results RNACompress employs an efficient grammar-based model to compress RNA sequences and their secondary structures. The goals of this algorithm are twofold: (1) to provide a robust and effective way to compress RNA structural data; (2) to design a suitable model to represent RNA secondary structure and to derive the informational complexity of the structural data from compression. Our extensive tests have shown that RNACompress achieves a universally better compression ratio than other sequence-specific or common text-specific compression algorithms, such as GenCompress, winrar and gzip. Moreover, a test of the activities of distinct GTP-binding RNAs (aptamers) compared with their structural complexity shows that our defined informational complexity can describe how complexity varies with activity. These results lead to an objective means of comparing the functional properties of heteropolymers from the information perspective. Conclusion A universal algorithm for the compression of RNA secondary structure, as well as the evaluation of its informational complexity, is discussed in this paper. We have developed RNACompress as a useful tool
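
    A compression-based complexity measure of this kind can be sketched with any general-purpose compressor. The example below uses zlib rather than RNACompress's grammar-based coder, so the absolute numbers differ, but the principle that more regular structure compresses better is the same; the dot-bracket strings are hypothetical.

```python
import random
import zlib

def compression_complexity(s: str) -> float:
    """Compressed size per input byte: lower = more regular, higher = more complex."""
    raw = s.encode()
    return len(zlib.compress(raw, level=9)) / len(raw)

random.seed(0)
regular = "((((((....))))))" * 8                            # a repeated hairpin motif
irregular = "".join(random.choice("().") for _ in range(128))  # unstructured string

print(f"regular  : {compression_complexity(regular):.2f}")
print(f"irregular: {compression_complexity(irregular):.2f}")
```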

  19. On the Intensification of Information Protection Processes

    Directory of Open Access Journals (Sweden)

    A. A. Malyuk

    2011-03-01

    Full Text Available The features of solving the information protection task in its modern statement, as a complex problem encompassing all aspects of information technology development, are discussed. Such an interpretation inevitably increases the role of systemic problems, the solution of which relies on an advanced scientific and methodological basis: the so-called intensification of information protection processes.

  20. Risk perception and information processing: the development and validation of a questionnaire to assess self-reported information processing.

    Science.gov (United States)

    Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K

    2012-01-01

    The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire on information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition, after which they completed a manipulation check and the initial 15-item questionnaire, and did so again two weeks later. The questionnaire was subjected to factor, reliability, and validity analyses at both measurement times for purposes of cross-validation of the results. A two-factor solution was observed, representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.

  1. Compliance with Environmental Regulations through Complex Geo-Event Processing

    Directory of Open Access Journals (Sweden)

    Federico Herrera

    2017-11-01

    Full Text Available In an e-government context, there are usually regulatory compliance requirements that support systems must monitor, control and enforce. These requirements may come from environmental laws and regulations that aim to protect the natural environment and mitigate the effects of pollution on human health and ecosystems. Monitoring compliance with these requirements involves processing a large volume of data from different sources, which is a major challenge. This volume is further increased by data coming from autonomous sensors (e.g. carbon-emission readings in protected areas) and from citizens voluntarily providing information (e.g. reports of illegal dumping). Complex Event Processing (CEP) technologies allow processing large amounts of event data and detecting patterns in them. However, they do not provide native support for the geographic dimension of events, which is essential for monitoring requirements that apply to specific geographic areas. This paper proposes a geospatial extension for CEP that allows monitoring environmental requirements while considering the geographic location of the processed data. We extend an existing platform-independent, model-driven approach for CEP, adding geographic location to events and specifying patterns using geographic operators. The use and technical feasibility of the proposal are shown through the development of a case study and the implementation of a prototype.
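
    A minimal sketch of the core idea, pairing CEP-style pattern matching with a geographic operator, assuming a bounding-box region, a sliding time window and invented event fields (the actual proposal is model-driven and considerably more general):

```python
from dataclasses import dataclass
from collections import deque

@dataclass
class Event:
    kind: str      # e.g. "co2_reading"
    value: float
    lat: float
    lon: float
    ts: float      # seconds

def inside(box, ev):
    """Crude geographic operator: is the event inside a lat/lon bounding box?"""
    (lat0, lon0), (lat1, lon1) = box
    return lat0 <= ev.lat <= lat1 and lon0 <= ev.lon <= lon1

PROTECTED_AREA = ((-34.95, -56.25), (-34.85, -56.10))   # hypothetical region

def monitor(stream, window_s=600, limit=400.0, n=3):
    """Fire an alert when n high readings fall inside the area within window_s."""
    window = deque()
    for ev in stream:
        if ev.kind == "co2_reading" and inside(PROTECTED_AREA, ev) and ev.value > limit:
            window.append(ev)
            while window and ev.ts - window[0].ts > window_s:
                window.popleft()          # evict readings outside the time window
            if len(window) >= n:
                yield f"ALERT: {len(window)} readings > {limit} ppm at t={ev.ts}"

stream = [Event("co2_reading", 420 + i, -34.9, -56.2, 60.0 * i) for i in range(5)]
for alert in monitor(stream):
    print(alert)
```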

  2. Vulnerability of complex networks under intentional attack with incomplete information

    International Nuclear Information System (INIS)

    Wu, J; Deng, H Z; Tan, Y J; Zhu, D Z

    2007-01-01

    We study the vulnerability of complex networks under intentional attack with incomplete information, meaning that one can preferentially attack only the most important nodes within a local region of the network. Known random failure and the fully informed intentional attack are the two extreme cases of our study. Using the generating function method, we derive the exact value of the critical removal fraction f_c of nodes for the disintegration of networks, and the size of the giant component. To validate our model and method, we perform simulations of intentional attack with incomplete information in scale-free networks. We show that the attack information has an important effect on the vulnerability of scale-free networks. We also demonstrate that hiding a fraction of the nodes' information is a cost-efficient strategy for enhancing the robustness of complex networks.
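
    The attack model can also be simulated directly: at each step, remove the highest-degree node among a random sample, with the sample size standing in for the amount of attack information. A sketch using networkx (graph size and parameters are arbitrary; sample size 1 approximates random failure, a large sample approximates a fully informed attack):

```python
import random
import networkx as nx

random.seed(5)
N0 = 2000
G = nx.barabasi_albert_graph(N0, 3)     # synthetic scale-free network

def giant_fraction(g):
    """Size of the largest connected component relative to the original N0 nodes."""
    return max(len(c) for c in nx.connected_components(g)) / N0

def attack(g, frac_removed, sample_size):
    """Repeatedly delete the highest-degree node among a random local sample."""
    g = g.copy()
    for _ in range(int(frac_removed * N0)):
        sample = random.sample(list(g.nodes), min(sample_size, g.number_of_nodes()))
        g.remove_node(max(sample, key=g.degree))
    return giant_fraction(g)

for k in (1, 10, 100):
    print(f"sample size {k:3d}: giant component fraction {attack(G, 0.05, k):.2f}")
```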

  3. Process of making decisions on loan currency: Influence of representativeness on information processing and coherence with consumption motives

    Directory of Open Access Journals (Sweden)

    Anđelković Dragan

    2016-01-01

    Full Text Available The rationality of a decision maker is often reduced by heuristics and biases, as well as by various external stimuli. In decision making, individuals simplify the phases of information selection and information processing by using heuristics: simple rules that focus on one aspect of a complex problem and ignore others, thereby 'speeding up' the decision-making process. This way of making decisions, although efficient for simple decisions, can lead to mistakes in probability assessment, diminish the decision maker's rationality, and thus drastically influence the outcome of the transaction for which the decision is made. The subject of this study is the influence of the representativeness heuristic on individuals' financial decisions, and the influence of consumption motives on stereotypical elements in the information processing phase. The study was conducted by determining respondents' attitudes toward currencies and then carrying out experiments to analyze how decisions on loan currency are made. The aim was to determine whether, and to what extent, representativeness influences the choice of currency in loan decisions. The results of the behavioral experiments show that, contrary to the rational model, respondents do not assess probability by processing the available information in accordance with their preferences, but by comparing decision objects with other objects sharing the same attributes, showing a moderate positive correlation between stereotypical attitudes and the choice of loan currency. The experiments showed that the instrumental motive significantly moderates the representativeness heuristic; that is, individuals process information with a diminished influence of stereotypical attitudes, caused by external stimuli, in situations where there is no so-called 'hedonistic decision-making'. Respondents made more efficient decisions if they had a motive which does

  4. Adaptive Channel Estimation based on Soft Information Processing in Broadband Spatial Multiplexing Receivers

    Directory of Open Access Journals (Sweden)

    P. Beinschob

    2010-11-01

    Full Text Available In this paper we present a novel approach to Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) channel estimation, based on a Decision-Directed Recursive Least Squares (RLS) algorithm in which no pilot symbols need to be embedded in the data after a short initial preamble. The novelty and key concept of the proposed technique is the block-wise causal and anti-causal RLS processing, which yields two independent RLS passes along with the associated decisions. Owing to the use of a low-density parity check (LDPC) channel code, the receiver operates with soft information, which enables us to introduce a new modification of the Turbo principle as well as a simple information-combining approach based on approximated a posteriori log-likelihood ratios (LLRs). Although the computational complexity is increased by both of our approaches, the latter is relatively less complex than the former. Simulation results show that these implementations outperform the simple RLS-DDCE algorithm and yield lower bit error rates (BER) and more accurate channel estimates.
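
    Decision-directed RLS estimation can be sketched for a single flat-fading subcarrier, where after a one-symbol preamble the receiver's own QPSK decisions serve as the regressor in the standard RLS update. This toy scalar version (invented noise level and forgetting factor) omits the block-wise causal/anti-causal processing and the LDPC soft information that are the paper's actual contributions:

```python
import numpy as np

rng = np.random.default_rng(6)
N, lam = 400, 0.98                       # symbols per subcarrier, forgetting factor
h_true = 0.8 * np.exp(1j * 0.6)         # flat channel tap for one subcarrier

qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, N)))
noise = 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
y = h_true * qpsk + noise

h_est, P = y[0] / qpsk[0], 1.0           # initialize from a single known preamble symbol
for n in range(1, N):
    z = y[n] / h_est                     # equalize with the current estimate
    k_idx = np.round((np.angle(z) - np.pi / 4) / (np.pi / 2))
    dec = np.exp(1j * (np.pi / 4 + np.pi / 2 * k_idx))   # hard QPSK decision
    # Scalar RLS update using the decision (not a pilot) as the regressor.
    g = P * np.conj(dec) / (lam + P * abs(dec) ** 2)
    h_est = h_est + g * (y[n] - h_est * dec)
    P = (P - g * dec * P) / lam

print(f"true h      : {h_true:.3f}")
print(f"estimated h : {h_est:.3f}")
```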

  5. 1 SUPPLEMENTARY INFORMATION A novel zinc(II) complex ...

    Indian Academy of Sciences (India)

    Yilmaz, Hakan; Andac, Omer

    Supplementary information for "A novel zinc(II) complex containing square pyramidal, octahedral and tetrahedral geometries on the same polymeric chain constructed from pyrazine-2,3-dicarboxylic acid and 1-vinylimidazole", by Hakan Yilmaz* and Omer Andac, Department of Chemistry, Ondokuz Mayis University, …

  6. Information processing among high-performance managers

    Directory of Open Access Journals (Sweden)

    S.C. Garcia-Santos

    2010-01-01

    Full Text Available The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework considers three models: Henry Mintzberg's theory of managerial roles, the theory of information processing, and John Exner's process model of response to the Rorschach. The participants were evaluated with the Rorschach method. The results show that these managers are able to collect data, evaluate them and establish rankings properly, while remaining objective and accurate in assessing problems. This information processing style permits an interpretation of the surrounding world on the basis of a very personal and characteristic way of processing, or cognitive style.

  7. New levels of language processing complexity and organization revealed by granger causation.

    Science.gov (United States)

    Gow, David W; Caplan, David N

    2012-01-01

    Granger causation analysis of high spatiotemporal resolution reconstructions of brain activation offers a new window on the dynamic interactions between brain areas that support language processing. Premised on the observation that causes both precede and uniquely predict their effects, this approach provides an intuitive, model-free means of identifying directed causal interactions in the brain. It requires the analysis of all non-redundant potentially interacting signals, and has shown that even "early" processes such as speech perception involve interactions of many areas in a strikingly large network, extending well beyond the traditional left-hemisphere perisylvian cortex, that play out over hundreds of milliseconds. In this paper we describe this technique and review several general findings that reframe the way we think about language processing and brain function in general. These include the extent and complexity of language processing networks, the central role of interactive processing dynamics, the role of processing hubs where the inputs from many distinct brain regions are integrated, and the degree to which task requirements and stimulus properties influence processing dynamics and inform our understanding of "language-specific" localized processes.
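
    In the linear bivariate case, the premise that causes precede and uniquely predict their effects reduces to comparing prediction-error variances with and without the other signal's past. A minimal numpy sketch (synthetic signals; real MEG/EEG analyses add model-order selection and significance testing):

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 2000, 2                          # samples, autoregressive model order

# Simulate y "Granger-causing" x: x depends on past y, but not vice versa.
x, y = np.zeros(n), np.zeros(n)
for t in range(p, n):
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()
    x[t] = 0.4 * x[t - 1] + 0.5 * y[t - 1] + rng.standard_normal()

def residual_var(target, predictors):
    """Least-squares residual variance of target regressed on lagged predictors."""
    rows = [np.concatenate([[1.0]] + [s[t - p:t] for s in predictors])
            for t in range(p, n)]
    A, b = np.array(rows), target[p:n]
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.var(b - A @ coef)

# Granger statistic: log ratio of restricted to full residual variance.
gc_y_to_x = np.log(residual_var(x, [x]) / residual_var(x, [x, y]))
gc_x_to_y = np.log(residual_var(y, [y]) / residual_var(y, [y, x]))
print(f"GC y->x: {gc_y_to_x:.3f}   GC x->y: {gc_x_to_y:.3f}")
```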

  8. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

    Full Text Available Efficient matching of incoming mass events to persistent queries is fundamental to complex event processing systems, and event matching based on pattern rules is an important feature of a complex event processing engine. However, the intrinsic uncertainty in pattern rules, which are predecided by experts, increases the difficulty of effective complex event processing: it inevitably involves various types of uncertainty, such as imprecision, fuzziness and incompleteness, owing to the limits of human subjective judgment. D numbers are a new mathematical tool for modeling uncertainty, since they drop the condition that elements of the frame of discernment must be mutually exclusive. To address the above issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed, based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). The novel method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
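
    Classical TOPSIS, the backbone of the proposed method, is compact enough to sketch. The D-number and fuzzy extensions are omitted here, and the decision matrix of candidate pattern rules is invented:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Classical TOPSIS ranking: closeness to ideal / anti-ideal solutions."""
    M = np.asarray(matrix, dtype=float)
    # 1. Vector-normalize each criterion column, then apply the weights.
    V = weights * M / np.linalg.norm(M, axis=0)
    # 2. Ideal and anti-ideal points (max/min depending on criterion type).
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 3. Euclidean distances and relative closeness in [0, 1].
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)

# Hypothetical pattern rules scored on match strength, latency (a cost), coverage.
rules = [[0.9, 120, 0.7],
         [0.6,  40, 0.9],
         [0.8,  80, 0.8]]
score = topsis(rules, weights=np.array([0.5, 0.2, 0.3]),
               benefit=np.array([True, False, True]))
print("closeness scores:", np.round(score, 3))
```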

  9. Proprioceptive information processing in schizophrenia

    DEFF Research Database (Denmark)

    Arnfred, Sidse M H

    Rado (1890-1972) suggested that one of two irreducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event-related potentials, and event-related EEG can be analyzed as conventional time … of the left somatosensory cortex, and it was suggested to be in accordance with two theories of schizophrenic information processing: the theory of deficiency of corollary discharge and the theory of weakening of the influence of past regularities. No gating deficiency was observed, and the imprecision … and amplitude attenuation was not a general phenomenon across the entire brain response. Summing up, in support of Rado's hypothesis, schizophrenia spectrum patients demonstrated abnormalities in proprioceptive information processing. Future work needs to extend the findings in larger un-medicated, non…

  10. Information, complexity and efficiency: The automobile model

    Energy Technology Data Exchange (ETDEWEB)

    Allenby, B. [Lucent Technologies (United States)]|[Lawrence Livermore National Lab., CA (United States)

    1996-08-08

    The new, rapidly evolving field of industrial ecology - the objective, multidisciplinary study of industrial and economic systems and their linkages with fundamental natural systems - provides strong ground for believing that a more environmentally and economically efficient economy will be more information intensive and complex. Information and intellectual capital will be substituted for the more traditional inputs of materials and energy in producing a desirable, yet sustainable, quality of life. While at this point this remains a strong hypothesis, the evolution of the automobile industry can be used to illustrate how such substitution may, in fact, already be occurring in an environmentally and economically critical sector.

  11. Dependability problems of complex information systems

    CERN Document Server

    Zamojski, Wojciech

    2014-01-01

    This monograph presents original research results on selected problems of dependability in contemporary Complex Information Systems (CIS). The ten chapters are concentrated around the following three aspects: methods for modelling the system and its components; tasks (or, in a more generic and more adequate interpretation, functionalities) accomplished by the system; and conditions for their correct realization in the dynamic operational environment. While the main focus is on theoretical advances and roadmaps for implementations of new technologies, a much-needed forum for sharing of the best

  12. Ethnographic process evaluation in primary care: explaining the complexity of implementation.

    Science.gov (United States)

    Bunce, Arwen E; Gold, Rachel; Davis, James V; McMullen, Carmit K; Jaworski, Victoria; Mercer, MaryBeth; Nelson, Christine

    2014-12-05

    The recent growth of implementation research in care delivery systems has led to a renewed interest in methodological approaches that deliver not only intervention outcome data but also deep understanding of the complex dynamics underlying the implementation process. We suggest that an ethnographic approach to process evaluation, when informed by and integrated with quantitative data, can provide this nuanced insight into intervention outcomes. The specific methods used in such ethnographic process evaluations are rarely presented in detail; our objective is to stimulate a conversation around the successes and challenges of specific data collection methods in health care settings. We use the example of a translational clinical trial among 11 community clinics in Portland, OR that are implementing an evidence-based, health-information technology (HIT)-based intervention focused on patients with diabetes. Our ethnographic process evaluation employed weekly diaries by clinic-based study employees, observation, informal and formal interviews, document review, surveys, and group discussions to identify barriers and facilitators to implementation success, provide insight into the quantitative study outcomes, and uncover lessons potentially transferable to other implementation projects. These methods captured the depth and breadth of factors contributing to intervention uptake, while minimizing disruption to clinic work and supporting mid-stream shifts in implementation strategies. A major challenge is the amount of dedicated researcher time required. The deep understanding of the 'how' and 'why' behind intervention outcomes that can be gained through an ethnographic approach improves the credibility and transferability of study findings. We encourage others to share their own experiences with ethnography in implementation evaluation and health services research, and to consider adapting the methods and tools described here for their own research.

  13. Hierarchical process memory: memory as an integral component of information processing

    Science.gov (United States)

    Hasson, Uri; Chen, Janice; Honey, Christopher J.

    2015-01-01

    Models of working memory commonly focus on how information is encoded into and retrieved from storage at specific moments. However, in the majority of real-life processes, past information is used continuously to process incoming information across multiple timescales. Considering single unit, electrocorticography, and functional imaging data, we argue that (i) virtually all cortical circuits can accumulate information over time, and (ii) the timescales of accumulation vary hierarchically, from early sensory areas with short processing timescales (tens to hundreds of milliseconds) to higher-order areas with long processing timescales (many seconds to minutes). In this hierarchical systems perspective, memory is not restricted to a few localized stores, but is intrinsic to information processing that unfolds throughout the brain on multiple timescales. "The present contains nothing more than the past, and what is found in the effect was already in the cause." (Henri L. Bergson) PMID:25980649

  14. Disentangling brain activity related to the processing of emotional visual information and emotional arousal.

    Science.gov (United States)

    Kuniecki, Michał; Wołoszyn, Kinga; Domagalik, Aleksandra; Pilarczyk, Joanna

    2018-05-01

    Processing of emotional visual information engages cognitive functions and induces arousal. We aimed to examine the modulatory role of emotional valence on brain activations linked to the processing of visual information and those linked to arousal. Participants were scanned and their pupil size was measured while viewing negative and neutral images. The visual noise was added to the images in various proportions to parametrically manipulate the amount of visual information. Pupil size was used as an index of physiological arousal. We show that arousal induced by the negative images, as compared to the neutral ones, is primarily related to greater amygdala activity while increasing visibility of negative content to enhanced activity in the lateral occipital complex (LOC). We argue that more intense visual processing of negative scenes can occur irrespective of the level of arousal. It may suggest that higher areas of the visual stream are fine-tuned to process emotionally relevant objects. Both arousal and processing of emotional visual information modulated activity within the ventromedial prefrontal cortex (vmPFC). Overlapping activations within the vmPFC may reflect the integration of these aspects of emotional processing. Additionally, we show that emotionally-evoked pupil dilations are related to activations in the amygdala, vmPFC, and LOC.

  15. Selective perception of novel science: how definitions affect information processing about nanotechnology

    Science.gov (United States)

    Kim, Jiyoun; Akin, Heather; Brossard, Dominique; Xenos, Michael; Scheufele, Dietram A.

    2017-05-01

    This study examines how familiarity with an issue—nanotechnology—moderates the effect of exposure to science information on how people process mediated messages about a complex issue. In an online experiment, we provide a nationally representative sample three definitions of nanotechnology (technical, technical applications, and technical risk/benefit definitions). We then ask them to read an article about the topic. We find significant interactions between perceived nano-familiarity and the definition received in terms of how respondents perceive favorable information conveyed in the stimulus. People less familiar with nanotechnology were more significantly affected by the type of definition they received.

  16. The Process of Solving Complex Problems

    Science.gov (United States)

    Fischer, Andreas; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…

  17. The Effect of Positive Mood on Flexible Processing of Affective Information.

    Science.gov (United States)

    Grol, Maud; De Raedt, Rudi

    2017-07-17

    Recent efforts have been made to understand the cognitive mechanisms underlying psychological resilience. Cognitive flexibility in the context of affective information has been related to individual differences in resilience. However, it is unclear whether flexible affective processing is sensitive to mood fluctuations. Furthermore, it remains to be investigated how effects on flexible affective processing interact with the affective valence of information that is presented. To fill this gap, we tested the effects of positive mood and individual differences in self-reported resilience on affective flexibility, using a task switching paradigm (N = 80). The main findings showed that positive mood was related to lower task switching costs, reflecting increased flexibility, in line with previous findings. In line with this effect of positive mood, we showed that greater resilience levels, specifically levels of acceptance of self and life, also facilitated task set switching in the context of affective information. However, the effects of resilience on affective flexibility seem more complex. Resilience tended to relate to more efficient task switching when negative information was preceded by positive information, possibly because the presentation of positive information, as well as positive mood, can facilitate task set switching. Positive mood also influenced costs associated with switching affective valence of the presented information. This latter effect was indicative of a reduced impact of no longer relevant negative information and more impact of no longer relevant positive information. Future research should confirm these effects of individual differences in resilience on affective flexibility, considering the affective valence of the presented information. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. The geosystems of complex geographical atlases

    Directory of Open Access Journals (Sweden)

    Jovanović Jasmina

    2012-01-01

    Complex geographical atlases represent geosystems of different hierarchical rank, complexity, diversity, scale and connection. They bring together a large number of different pieces of information about geospace and contain systematized, correlated pieces of information about space, presented in an explicit form. The degree of information revealed in an atlas is determined by its content structure and form of presentation. The quality of an atlas depends on the method of visualization and on the quality of the geodata. Cartographic visualization is a cognitive process: analysis converts geospatial data into knowledge. A complex geographical atlas is an information complex of spatio-temporally coordinated databases on geosystems of different complexity and territorial scope. Each geographical atlas defines a concrete geosystem; its systemic organization (structural and contextual) determines its complexity and concreteness. In complex atlases, the attributes of geosystems are modeled and pieces of information are given in a systematized, graphically unified form, so the atlas can be considered a database. In composing a database, semantic analysis of the data is important. The result of semantic modeling is expressed in the structuring of information, in emphasizing logical connections between phenomena and processes, and in defining their classes according to degree of similarity; this makes it efficient to find the needed pieces of information when the database is used. An atlas map has a special power to integrate sets of geodata and to present information content in a user-friendly, understandable visual and tactile way. Composing an atlas by systemic cartography requires pieces of information on concretely defined geosystems of different hierarchical levels, the application of scientific methods and the making of an adequate number of analytical, synthetic…

  19. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on (i) developing a model of the information processing of NPP operators and (ii) quantifying that model. To overcome the limitations of previous information-theoretic approaches, i.e., single-channel approaches, we first develop an information processing model with multiple stages, containing information flows. The uncertainty of the information is then quantified using Conant's model, a form of information theory
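
    The record does not spell out the quantification, so as a point of reference, here is a minimal sketch of the basic Shannon quantities such an approach rests on: one stage of operator processing is treated as a channel, and its throughput is the mutual information between displayed states and responses. The joint distribution below is hypothetical, and Conant's full model partitions information rates in a more elaborate way than this.

    ```python
    import numpy as np

    def entropy(p):
        """Shannon entropy H(p) in bits; zero-probability entries are skipped."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def mutual_information(joint):
        """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint probability table."""
        px = joint.sum(axis=1)
        py = joint.sum(axis=0)
        return entropy(px) + entropy(py) - entropy(joint.ravel())

    # Hypothetical joint distribution of displayed alarm states (rows) and
    # operator responses (columns), e.g. estimated from simulator logs.
    joint = np.array([[0.30, 0.05],
                      [0.10, 0.55]])

    print(f"information throughput: {mutual_information(joint):.3f} bits per event")
    ```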

  20. Extrinsic and intrinsic complexities of the Los Alamos plutonium processing facility

    International Nuclear Information System (INIS)

    Bearse, R.C.; Roberts, N.J.; Longmire, V.L.

    1985-01-01

    Analysis of the data obtained in one year of plutonium accounting at Los Alamos reveals significant complexity. Much of this complexity arises from the complexity of the processes themselves. Additional complexity is induced by errors in the data entry process. It is important to note that there is no evidence that this complexity is adversely affecting the accounting in the plant. The authors have been analyzing transaction data from fiscal year 1983 processing. This study involved 62,595 transactions. The data have been analyzed using the relational database program INGRES on a VAX 11/780 computer. This software allows easy manipulation of the original data and subsets drawn from it. The authors have been attempting for several years to understand the global features of the TA-55 accounting data. This project has underscored several of the system's complexities

  1. The Influence of Information Acquisition on the Complex Dynamics of Market Competition

    Science.gov (United States)

    Guo, Zhanbing; Ma, Junhai

    In this paper, we build a dynamical game model with three boundedly rational players (firms) to study the influence of information on the complex dynamics of market competition, where the useful information concerns a rival's actual decision. In this model, two firms form an information-sharing team: they acquire and share information about their common competitor but make their own decisions separately, and the amount of information acquired by the team determines the accuracy of their estimate of the rival's actual decision. Based on this dynamical game model and a set of 3D diagrams, the influence of the amount of information on the complex dynamics of market competition, such as local dynamics, global dynamics and profits, is studied. These results have significant theoretical and practical value for understanding the influence of information.
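
    The abstract does not give the demand or cost specification, so the sketch below uses the textbook bounded-rationality gradient-adjustment rule for a triopoly with linear inverse demand; all parameter values (a, b, cost, alpha) are assumptions for illustration, not values from the paper.

    ```python
    import numpy as np

    # Hypothetical triopoly: linear inverse demand p = a - b*(q1+q2+q3) and
    # constant marginal costs. Each firm adjusts its output in proportion to
    # its marginal profit (the usual bounded-rationality gradient rule).
    a, b = 10.0, 1.0
    cost = np.array([1.0, 1.0, 1.2])
    alpha = 0.1                                   # speed of adjustment (assumed)

    def step(q):
        total = q.sum()
        marginal = a - b * total - b * q - cost   # d(profit_i)/d(q_i)
        return q + alpha * q * marginal           # q_i(t+1) = q_i(t)(1 + alpha*marginal_i)

    q = np.array([0.5, 0.6, 0.7])
    for _ in range(500):
        q = step(q)
    print("long-run outputs:", np.round(q, 4))    # converges to the Nash point here
    ```

    With larger adjustment speeds the same map loses stability and can produce the period-doubling and chaotic regimes that papers of this kind study.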

  2. Defining information need in health - assimilating complex theories derived from information science.

    Science.gov (United States)

    Ormandy, Paula

    2011-03-01

    Key policy drivers worldwide include optimizing patients' roles in managing their care; focusing services around patients' needs and preferences; and providing information to support patients' contributions and choices. The term information need penetrates many policy documents. Information need is espoused as the foundation from which to develop patient-centred or patient-led services. Yet there is no clear definition as to what the term means or how patients' information needs inform and shape information provision and patient care. The assimilation of complex theories originating from information science has much to offer considerations of patient information need within the context of health care. Health-related research often focuses on the content of information patients prefer, not why they need information. This paper extends and applies knowledge of information behaviour to considerations of information need in health, exposing a working definition for patient information need that reiterates the importance of considering the patient's goals and understanding the patient's context/situation. A patient information need is defined as 'recognition that their knowledge is inadequate to satisfy a goal, within the context/situation that they find themselves at a specific point in time'. This typifies the key concepts of national/international health policy: the centrality and importance of the patient. The proposed definition of patient information need provides a conceptual framework to guide health-care practitioners on what to consider and why when meeting the information needs of patients in practice. This creates a solid foundation from which to inform future research. © 2010 The Author. Health Expectations © 2010 Blackwell Publishing Ltd.

  3. ETANA-DL: Managing Complex Information Applications - an Archaeology Digital Library

    OpenAIRE

    Ravindranathan, Unni; Shen, Rao; Goncalves, Marcos A.; Fan, Weiguo; Fox, Edward A.; Flanagan, James

    2004-01-01

    Archaeological research results in the generation of large quantities of heterogeneous information managed by different projects using custom information systems. We will demonstrate a prototype Digital Library (DL) for integrating and managing archaeological data and providing services useful to various user communities. ETANA-DL is a model-based, componentized, extensible, archaeological DL that manages complex information sources using the client-server paradigm of the Open Archives Initia...

  4. 40 CFR 68.65 - Process safety information.

    Science.gov (United States)

    2010-07-01

    40 CFR 68.65 (Chemical Accident Prevention Provisions, Program 3 Prevention Program), Title 40 Protection of Environment, 2010-07-01 edition: requires the compilation of written process safety information before conducting any process hazard analysis required by the rule.

  5. Information Design for “Weak Signal” detection and processing in Economic Intelligence: A case study on Health resources

    Directory of Open Access Journals (Sweden)

    Sahbi Sidhom

    2011-12-01

    This research covers all phases of “Information Design” applied to detecting and profiting from weak signals in economic intelligence (EI) or business intelligence (BI). Information design (ID) is the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach combining skills in graphic design (writing, analysis, processing and editing), human performance technology and human factors. Applied in the context of an information system, it allows end users to easily detect implicit topics known as “weak signals” (WS). In our approach to implementing ID, the process covers the development of a knowledge management (KM) process in the context of EI. A case study on monitoring information about health resources is presented, using ID processes to outline weak signals. Both French and American bibliographic databases were used to connect multilingual concepts in the health watch process.

  6. Information maximization explains the emergence of complex cell-like neurons

    Directory of Open Access Journals (Sweden)

    Takuma eTanaka

    2013-11-01

    We propose models and a method to qualitatively explain the receptive field properties of complex cells in the primary visual cortex. We apply a learning method based on the information maximization principle to a feedforward network comprising an input layer of image patches, simple cell-like first-output-layer neurons, and second-output-layer neurons (Model 1). Information maximization results in the emergence of complex cell-like receptive field properties in the second-output-layer neurons. After learning, second-output-layer neurons receive connection weights of equal magnitude from two first-output-layer neurons with sign-inverted receptive fields. The second-output-layer neurons replicate phase invariance and iso-orientation suppression. On the basis of these results, we then examine a simplified model showing the emergence of complex cell-like receptive fields (Model 2). We show that after learning, the output neurons of this model exhibit iso-orientation suppression, cross-orientation facilitation, and end stopping, similar to those found in complex cells. These properties suggest that complex cells in the primary visual cortex become selective to features composed of edges in order to increase the variability of the output.

  7. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate a system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes, and it is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One problem with this approach is that it treats the model as a "black box" and focuses on explaining model behavior through the input-output relationship. Since these models are highly nonlinear, understanding how an input affects an output can be an extremely difficult task. Operationally, applying the technique can also be challenging because complex process-based models are generally characterized by a large parameter space. To overcome some of these difficulties, we propose a method of sensitivity analysis applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes. Once the processes that exert the major influence in…
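
    As a point of reference for the idea of process-level (rather than parameter-level) sensitivity analysis, here is a minimal one-at-a-time sketch: each process in a toy model is exposed as a rate multiplier, perturbed in isolation, and ranked by its effect on the output. The model and the process names are hypothetical, not the authors'.

    ```python
    import numpy as np

    # A toy process-based model: the output emerges from three interacting
    # "processes", each exposed here as a scalable rate multiplier.
    def model(growth=1.0, decay=1.0, transport=1.0, steps=100):
        biomass, pool = 1.0, 5.0
        for _ in range(steps):
            uptake = 0.05 * transport * pool
            biomass += growth * 0.1 * biomass * uptake - decay * 0.02 * biomass
            pool -= uptake
        return biomass

    baseline = model()
    for process in ("growth", "decay", "transport"):
        perturbed = model(**{process: 1.1})   # +10% on this process only
        sensitivity = (perturbed - baseline) / baseline
        print(f"{process:9s}: {100 * sensitivity:+.1f}% change in output")
    ```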

  8. Hanford Central Waste Complex: Waste Receiving and Processing Facility dangerous waste permit application

    International Nuclear Information System (INIS)

    1991-10-01

    The Hanford Central Waste Complex is an existing and planned series of treatment, storage, and/or disposal (TSD) units that will centralize the management of solid waste operations at a single location on the Hanford Facility. The Complex includes two units: the WRAP Facility and the Radioactive Mixed Waste Storage Facility (RMW Storage Facility). This Part B permit application addresses the WRAP Facility. The Facility will be a treatment and storage unit providing the capability to examine, sample, characterize, treat, repackage, store, and certify radioactive and/or mixed waste. Waste treated and stored will include radioactive and/or mixed waste received from both onsite and offsite sources. Certification will be designed to ensure and demonstrate compliance with the waste acceptance criteria set forth by the onsite disposal units and/or offsite facilities that will subsequently receive waste from the WRAP Facility. This permit application discusses the following: facility description and general provisions; waste characterization; process information; groundwater monitoring; procedures to prevent hazards; contingency plan; personnel training; exposure information report; waste minimization plan; closure and postclosure requirements; reporting and recordkeeping; other relevant laws; and certification.

  9. Information management in process planning

    NARCIS (Netherlands)

    Lutters, Diederick; Wijnker, T.C.; Kals, H.J.J.

    1999-01-01

    A recently proposed reference model indicates the use of structured information as the basis for the control of design and manufacturing processes. The model is used as a basis to describe the integration of design and process planning. A differentiation is made between macro- and micro process

  10. From DTCA-PD to patient information to health information: the complex politics and semantics of EU health policy.

    Science.gov (United States)

    Brooks, Eleanor; Geyer, Robert

    2012-12-01

    Between 2001 and 2011 the pharmaceutical industry, supported by DG Enterprise, was engaged in an ongoing campaign to repeal/amend the European Union (EU) ban on direct-to-consumer advertising of prescription drugs (DTCA-PD). As it became increasingly clear that the ban would not be repealed, DTCA-PD supporters tried to shift the debate away from advertising and towards the provision of 'patient information' and the rights of patients to access such information. Meanwhile, a variety of national and European health organizations, supported by DG SANCO, sought to maintain the ban and oppose the industry-supported 'patient information' campaign. Instead, they promoted a concept of 'health information' that included all aspects of citizens' health, not just pharmaceuticals. This article aims to analyse the transition from DTCA-PD to patient information to health information and examine its implications for EU health policy as a complex policy space. The article examines the emergence and development of EU health policy and the evolution of the DTCA-PD debate through the lens of complexity theory. It analyses the nature of the semantic, political and policy transition and asks why it occurred, what it tells us about EU health policy and future EU health legislation and how it may be understood from a complexity perspective. The article concludes that the complexity framework is ideally suited for the field of public health and, in particular, the DTCA-PD debate. Having successfully shifted the policy-focus of the debate to patients' rights and health information, opponents of the legislation are likely to face their next battle in the realm of cyberspace, where regulatory issues change the nature of advertising. © 2012 Blackwell Publishing Ltd.

  11. Extrinsic and intrinsic complexities of the Los Alamos Plutonium Processing Facility

    International Nuclear Information System (INIS)

    Bearse, R.C.; Longmire, V.L.; Roberts, N.J.

    1985-01-01

    Analysis of the data obtained in one year of plutonium accounting at Los Alamos reveals significant complexity. Much of this complexity arises from the complexity of the processes themselves. Additional complexity is induced by errors in the data entry process. It is important to note that there is no evidence that this complexity is adversely affecting the accounting in the plant. We have been analyzing transaction data from fiscal year 1983 processing. This study involved 62,595 transactions. The data have been analyzed using the relational database program INGRES on a VAX 11/780 computer. This software allows easy manipulation of the original data and subsets drawn from it. We have been attempting for several years to understand the global features of the TA-55 accounting data. This project has underscored several of the system's complexities. Examples reported here include audit trails, lot-name multiplicity, etc.

  12. Aligning Business Process Quality and Information System Quality

    OpenAIRE

    Heinrich, Robert

    2013-01-01

    Business processes and information systems mutually affect each other in non-trivial ways. Frequently, the business process design and the information system design are not well aligned. This means that business processes are designed without taking the information system impact into account, and vice versa. Missing alignment at design time often results in quality problems at runtime, such as large response times of information systems, large process execution times, overloaded information s...

  13. Pure sources and efficient detectors for optical quantum information processing

    Science.gov (United States)

    Zielnicki, Kevin

    Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on…

  14. Science-based information processing in the process control of power stations

    International Nuclear Information System (INIS)

    Weisang, C.

    1992-01-01

    Through the application of specialized systems, future-oriented information processing integrates knowledge of processes, control systems, process control strategies, user behaviour and ergonomics. Improvements in process control can be attained, inter alia, by condensing the information presented (e.g., by suppressing the raw flow of signals and replacing it with substance-based signals) and by an ergonomically designed representation of the process. (orig.) [de]

  15. Moral Judgment as Information Processing: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Steve eGuglielmo

    2015-10-01

    This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.

  16. Information properties of morphologically complex words modulate brain activity during word reading.

    Science.gov (United States)

    Hakala, Tero; Hultén, Annika; Lehtonen, Minna; Lagus, Krista; Salmelin, Riitta

    2018-06-01

    Neuroimaging studies of the reading process point to functionally distinct stages in word recognition. Yet, current understanding of the operations linked to those various stages is mainly descriptive in nature. Approaches developed in the field of computational linguistics may offer a more quantitative approach for understanding brain dynamics. Our aim was to evaluate whether a statistical model of morphology, with well-defined computational principles, can capture the neural dynamics of reading, using the concept of surprisal from information theory as the common measure. The Morfessor model, created for unsupervised discovery of morphemes, is based on the minimum description length principle and attempts to find optimal units of representation for complex words. In a word recognition task, we correlated brain responses to word surprisal values derived from Morfessor and from other psycholinguistic variables that have been linked with various levels of linguistic abstraction. The magnetoencephalography data analysis focused on spatially, temporally and functionally distinct components of cortical activation observed in reading tasks. The early occipital and occipito-temporal responses were correlated with parameters relating to visual complexity and orthographic properties, whereas the later bilateral superior temporal activation was correlated with whole-word based and morphological models. The results show that the word processing costs estimated by the statistical Morfessor model are relevant for brain dynamics of reading during late processing stages. © 2018 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
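
    Surprisal is the quantity linking the model to the brain data: s(w) = -log2 P(w). A minimal sketch, assuming a Morfessor-style segmentation into morphs and invented unigram morph probabilities (the study's actual probabilities come from a trained model):

    ```python
    import math

    # Hypothetical unigram probabilities over morphs, of the kind a trained
    # Morfessor segmentation model would deliver for a corpus.
    morph_prob = {"un": 0.02, "help": 0.01, "ful": 0.03, "ly": 0.05}

    def surprisal(morphs):
        """Surprisal in bits: -log2 P(word), with P(word) taken as the
        product of the probabilities of its morph segments."""
        return -sum(math.log2(morph_prob[m]) for m in morphs)

    # Higher surprisal = higher predicted processing cost, the quantity
    # correlated with late superior temporal responses in the study.
    print(f"surprisal(un+help+ful+ly) = {surprisal(['un', 'help', 'ful', 'ly']):.2f} bits")
    ```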

  17. New levels of language processing complexity and organization revealed by Granger causation

    Directory of Open Access Journals (Sweden)

    David W Gow

    2012-11-01

    Granger causation analysis of high spatiotemporal resolution reconstructions of brain activation offers a new window on the dynamic interactions between brain areas that support language processing. Premised on the observation that causes both precede and uniquely predict their effects, this approach provides an intuitive, model-free means of identifying directed causal interactions in the brain. It requires the analysis of all nonredundant potentially interacting signals, and it has shown that even early processes such as speech perception involve interactions among many areas in a strikingly large network, extending well beyond traditional left-hemisphere perisylvian cortex, that play out over hundreds of milliseconds. In this paper we describe the technique and review several general findings that reframe the way we think about language processing and brain function in general. These include the extent and complexity of language processing networks, the central role of interactive processing dynamics, the role of processing hubs where input from many distinct brain regions is integrated, and the degree to which task requirements and stimulus properties influence processing dynamics and inform our understanding of language-specific localized processes.
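
    A minimal bivariate sketch of the underlying test: x is said to Granger-cause y when past values of x improve the prediction of y beyond what past values of y alone achieve. The sketch compares residual variances of two lagged least-squares models on synthetic signals; the published analyses are of course multivariate and far more elaborate.

    ```python
    import numpy as np

    def granger_stat(x, y, lags=2):
        """Log-ratio of residual variances from two least-squares models of
        y[t]: restricted (past y only) vs. full (past y and past x).
        Positive => past x improves prediction, i.e. x "Granger-causes" y."""
        n = len(y)
        past_y = np.array([y[t - lags:t] for t in range(lags, n)])
        past_x = np.array([x[t - lags:t] for t in range(lags, n)])
        target = y[lags:]
        def resid_var(design):
            design = np.column_stack([np.ones(len(target)), design])
            beta, *_ = np.linalg.lstsq(design, target, rcond=None)
            return np.var(target - design @ beta)
        return np.log(resid_var(past_y) / resid_var(np.hstack([past_y, past_x])))

    # Synthetic pair of signals where x drives y with a one-step delay.
    rng = np.random.default_rng(0)
    x = rng.normal(size=2000)
    y = np.zeros(2000)
    for t in range(1, 2000):
        y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()
    print(f"x -> y: {granger_stat(x, y):.3f}   y -> x: {granger_stat(y, x):.3f}")
    ```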

  18. Information and complexity measures in the interface of a metal and a superconductor

    Science.gov (United States)

    Moustakidis, Ch. C.; Panos, C. P.

    2018-06-01

    Fisher information, Shannon information entropy and statistical complexity are calculated for the interface of a normal metal and a superconductor, as a function of temperature for several materials. The order parameter Ψ(r) derived from Ginzburg-Landau theory is used as input, together with experimental values of the critical transition temperature Tc and the superconducting coherence length ξ0. Analytical expressions are obtained for the information and complexity measures, so that Tc is related directly and simply to disorder and complexity. An analytical relation is also found between the Fisher information and the energy profile of superconductivity, i.e., the ratio of the surface free energy to the bulk free energy. We verify that a simple relation holds between Shannon and Fisher information, i.e., a decomposition of a global information quantity (Shannon) in terms of two local ones (Fisher information), previously derived and verified for atoms and molecules by Liu et al. Finally, we find analytical expressions for generalized information measures such as the Tsallis entropy and Fisher information. We conclude that the proper value of the non-extensivity parameter is q ≃ 1, in agreement with previous work using a different model, where q ≃ 1.005.
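
    The two basic measures named here are easy to state for any normalized density ρ(x): S = -∫ ρ ln ρ dx and F = ∫ (ρ')²/ρ dx. A numerical sketch using a Ginzburg-Landau-style interface profile; the coherence length and box size below are illustrative values, not the paper's material parameters.

    ```python
    import numpy as np

    # Ginzburg-Landau-style interface profile psi(x) = tanh(x / (sqrt(2)*xi)),
    # turned into a normalized probability density rho(x) on a finite box.
    xi = 1.0
    x = np.linspace(-10.0, 10.0, 20001)
    dx = x[1] - x[0]
    psi = np.tanh(x / (np.sqrt(2.0) * xi))
    rho = psi**2 / (np.sum(psi**2) * dx)

    # S = -integral rho ln rho dx   (Shannon entropy, in nats)
    shannon = -np.sum(rho * np.log(rho + 1e-300)) * dx
    # F = integral (rho')^2 / rho dx   (Fisher information)
    grad = np.gradient(rho, dx)
    fisher = np.sum(grad**2 / (rho + 1e-300)) * dx
    print(f"S = {shannon:.4f} nats, F = {fisher:.4f}")
    ```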

  19. Aridity and decomposition processes in complex landscapes

    Science.gov (United States)

    Ossola, Alessandro; Nyman, Petter

    2015-04-01

    Decomposition of organic matter is a key biogeochemical process contributing to nutrient cycles, carbon fluxes and soil development. The activity of decomposers depends on microclimate, with temperature and rainfall being major drivers. In complex terrain, the fine-scale variation in microclimate (and hence water availability) as a result of slope orientation is caused by differences in incoming radiation and surface temperature. Aridity, measured as the long-term balance between net radiation and rainfall, is a metric that can be used to represent variations in water availability within the landscape. Since aridity metrics can be obtained at fine spatial scales, they could theoretically be used to investigate how decomposition processes vary across complex landscapes. In this study, four research sites were selected in tall open sclerophyll forest along an aridity gradient (Budyko dryness index ranging from 1.56 to 2.22), where microclimate, litter moisture and soil moisture were monitored continuously for one year. Litter bags were packed to estimate decomposition rates (k) using leaves of a tree species not present in the study area (Eucalyptus globulus) in order to avoid home-field advantage effects. Litter mass loss was measured to assess the activity of macro-decomposers (6 mm litter bag mesh size), meso-decomposers (1 mm mesh), microbes above ground (0.2 mm mesh) and microbes below ground (2 cm depth, 0.2 mm mesh). Four replicates for each set of bags were installed at each site, and bags were collected at 1, 2, 4, 7 and 12 months after installation. We first tested whether differences in microclimate due to slope orientation have significant effects on decomposition processes. The dryness index was then related to decomposition rates to evaluate whether small-scale variation in decomposition can be predicted using readily available information on rainfall and radiation. Decomposition rates (k), calculated by fitting single-pool negative exponential models, generally…
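
    The single-pool negative exponential model referred to is m(t) = m0·e^(-kt), so k can be recovered as the negative slope of a log-linear fit. A sketch using the study's collection times with hypothetical remaining-mass fractions:

    ```python
    import numpy as np

    # Collection times (months) from the study design; the remaining-mass
    # fractions are hypothetical litter-bag measurements, not the paper's data.
    t = np.array([1, 2, 4, 7, 12], dtype=float)
    mass_fraction = np.array([0.91, 0.84, 0.71, 0.55, 0.38])

    # m(t) = m0 * exp(-k t)  =>  ln m = ln m0 - k t,
    # so k is the negative slope of a straight-line fit to ln(mass).
    slope, intercept = np.polyfit(t, np.log(mass_fraction), 1)
    k = -slope
    print(f"decomposition rate k = {k:.3f} per month (m0 = {np.exp(intercept):.3f})")
    ```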

  20. The order of information processing alters economic gain-loss framing effects.

    Science.gov (United States)

    Kwak, Youngbin; Huettel, Scott

    2018-01-01

    Adaptive decision making requires analysis of available information during the process of choice. In many decisions that information is presented visually - which means that variations in visual properties (e.g., salience, complexity) can potentially influence the process of choice. In the current study, we demonstrate that variation in the left-right positioning of risky and safe decision options can influence the canonical gain-loss framing effect. Two experiments were conducted using an economic framing task in which participants chose between gambles and certain outcomes. The first experiment demonstrated that the magnitude of the gain-loss framing effect was greater when the certain option signaling the current frame was presented on the left side of the visual display. Eye-tracking data during task performance showed a left-gaze bias for initial fixations, suggesting that the option presented on the left side was processed first. Combination of eye-tracking and choice data revealed that there was a significant effect of direction of first gaze (i.e. left vs. right) as well as an interaction between gaze direction and identity of the first fixated information (i.e. certain vs. gamble) regardless of frame. A second experiment presented the gamble and certain options in a random order, with a temporal delay between their presentations. We found that the magnitude of gain-loss framing was larger when the certain option was presented first, regardless of left and right positioning, only in individuals with lower risk-taking tendencies. The effect of presentation order on framing was not present in high risk-takers. These results suggest that the sequence of visual information processing as well as their left-right positioning can bias choices by changing the impact of the presented information during risky decision making. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Information interfaces for process plant diagnosis

    International Nuclear Information System (INIS)

    Lind, M.

    1984-02-01

    The paper describes a systematic approach to the design of information interfaces that support operators in diagnosing complex system faults. The need to interpret primary measured plant variables within the framework of different system representations, organized into an abstraction hierarchy, is identified from an analysis of the problem of diagnosing complex systems. A formalized approach to the modelling of production systems, called Multilevel Flow Modelling (MFM), is described. An MFM model specifies plant control requirements and the associated need for plant information, and provides a consistent context for the interpretation of real-time plant signals in the diagnosis of malfunctions. The use of MFM models as a basis for the functional design of the plant instrumentation system is outlined, and the use of Knowledge-Based (Expert) Systems for the design of man-machine interfaces is discussed. Such systems would allow active user participation in diagnosis and thus provide the basis for cooperative problem solving. 14 refs. (author)

  2. [Complex automatic data processing in multi-profile hospitals].

    Science.gov (United States)

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process and improving the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that the automated system improves the quality of data processing, supports a high standard of patient examination, speeds the training of young specialists, and creates conditions for the continuing education of physicians through the analysis of their own activity. In large hospitals, the most promising form of computerization is the integrated solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide terminal network and a hospital-wide data bank.

  3. Processing Complex Sounds Passing through the Rostral Brainstem: The New Early Filter Model

    Science.gov (United States)

    Marsh, John E.; Campbell, Tom A.

    2016-01-01

    The rostral brainstem receives both “bottom-up” input from the ascending auditory system and “top-down” descending corticofugal connections. Speech information passing through the inferior colliculus of elderly listeners reflects the periodicity envelope of a speech syllable. This information arguably also reflects a composite of temporal-fine-structure (TFS) information from the higher frequency vowel harmonics of that repeated syllable. The amplitude of those higher frequency harmonics, bearing even higher frequency TFS information, correlates positively with the word recognition ability of elderly listeners under reverberatory conditions. Also relevant is that working memory capacity (WMC), which is subject to age-related decline, constrains the processing of sounds at the level of the brainstem. Turning to the effects of a visually presented sensory or memory load on auditory processes, there is a load-dependent reduction of that processing, as manifest in the auditory brainstem responses (ABR) evoked by to-be-ignored clicks. Wave V decreases in amplitude with increases in the visually presented memory load. A visually presented sensory load also produces a load-dependent reduction of a slightly different sort: The sensory load of visually presented information limits the disruptive effects of background sound upon working memory performance. A new early filter model is thus advanced whereby systems within the frontal lobe (affected by sensory or memory load) cholinergically influence top-down corticofugal connections. Those corticofugal connections constrain the processing of complex sounds such as speech at the level of the brainstem. Selective attention thereby limits the distracting effects of background sound entering the higher auditory system via the inferior colliculus. Processing TFS in the brainstem relates to perception of speech under adverse conditions. Attentional selectivity is crucial when the signal heard is degraded or masked: e

  4. Moral judgment as information processing: an integrative review.

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  6. Process-aware information systems : design, enactment and analysis

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Wah, B.W.

    2009-01-01

    Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process

  7. Information paths within the new product development process

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    2007-01-01

    collection platform to obtain measurements from within the NPD process. 42 large, international companies participated in the data-collecting simulation. Results revealed five different information paths that did not connect all stages of the NPD process. Moreover, results show that the front-end is not driving information acquisition through the stages of the NPD process, and that environmental turbulence disconnects stages from the information paths in the NPD process. This implies that information is at once a key to success and a key to entrapment in the NPD process.

  8. Information Processing Features Can Detect Behavioral Regimes of Dynamical Systems

    Directory of Open Access Journals (Sweden)

    Rick Quax

    2018-01-01

    In dynamical systems, local interactions between dynamical units generate correlations which are stored and transmitted throughout the system, generating the macroscopic behavior. However, a framework to quantify exactly how these correlations are stored, transmitted, and combined at the microscopic scale is missing. Here we propose to characterize the notion of “information processing” based on all possible Shannon mutual information quantities between a future state and all possible sets of initial states. We apply this to the 256 elementary cellular automata (ECA), the simplest possible dynamical systems, which exhibit behaviors ranging from simple to complex. Our main finding is that only a few information features are needed for full predictability of the systemic behavior, and that the “information synergy” feature is always the most predictive. Finally, we apply the idea to foreign exchange (FX) and interest-rate swap (IRS) time-series data. When applied to the information features, as opposed to the data itself, we find an effective “slowing down” leading indicator in all three markets for the 2008 financial crisis. Our work suggests that the proposed characterization of the local information processing of units may be a promising direction for predicting emergent systemic behaviors.
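
    A minimal sketch of the kind of quantity involved: simulate an ECA rule and estimate the Shannon mutual information between a small set of initial cells and one cell's future state, sampled over random initial conditions. The paper's information features (including the synergy feature) are richer than this single estimate; rule, width and horizon below are arbitrary choices.

    ```python
    import numpy as np
    from collections import Counter

    def eca_step(state, rule):
        """One synchronous update of an elementary CA with periodic boundaries."""
        left, right = np.roll(state, 1), np.roll(state, -1)
        idx = 4 * left + 2 * state + right          # Wolfram neighborhood code
        table = np.array([(rule >> i) & 1 for i in range(8)])
        return table[idx]

    def mutual_info(samples):
        """I(X;Y) in bits from (x, y) samples; x may be a tuple of cells."""
        n = len(samples)
        pxy = Counter(samples)
        px = Counter(x for x, _ in samples)
        py = Counter(y for _, y in samples)
        return sum(c / n * np.log2((c / n) / (px[x] / n * py[y] / n))
                   for (x, y), c in pxy.items())

    rng = np.random.default_rng(1)
    rule, width, horizon, trials = 110, 16, 4, 20000
    samples = []
    for _ in range(trials):
        init = rng.integers(0, 2, width)
        state = init.copy()
        for _ in range(horizon):
            state = eca_step(state, rule)
        # one feature: I(pair of central initial cells ; future central cell)
        samples.append((tuple(init[7:9]), int(state[8])))
    print(f"I(initial pair; future cell) = {mutual_info(samples):.4f} bits")
    ```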

  9. Photonic single nonlinear-delay dynamical node for information processing

    Science.gov (United States)

    Ortín, Silvia; San-Martín, Daniel; Pesquera, Luis; Gutiérrez, José Manuel

    2012-06-01

    An electro-optical system with a delay loop based on semiconductor lasers is investigated for information processing by performing numerical simulations. This system can replace a complex network of many nonlinear elements in the implementation of Reservoir Computing. We show that a single nonlinear-delay dynamical system has the basic properties required to perform as a reservoir: short-term memory and the separation property. The computing performance of this system is evaluated on two prediction tasks: the Lorenz chaotic time series and the nonlinear auto-regressive moving average (NARMA) model. We sweep the parameters of the system to find the best performance. The results achieved on the Lorenz and NARMA-10 tasks are comparable to those obtained by other machine learning methods.
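
    The NARMA-10 task mentioned here is a standard benchmark with a fixed recursion. A sketch generating the target series under the commonly used formulation (input u uniform on [0, 0.5]); any reservoir implementation would then be trained to predict y from u.

    ```python
    import numpy as np

    def narma10(n, seed=0):
        """Generate the NARMA-10 benchmark series:
        y[t+1] = 0.3 y[t] + 0.05 y[t] * sum(y[t-9..t]) + 1.5 u[t-9] u[t] + 0.1,
        with inputs u drawn uniformly from [0, 0.5]."""
        rng = np.random.default_rng(seed)
        u = rng.uniform(0.0, 0.5, n)
        y = np.zeros(n)
        for t in range(9, n - 1):
            y[t + 1] = (0.3 * y[t]
                        + 0.05 * y[t] * y[t - 9:t + 1].sum()
                        + 1.5 * u[t - 9] * u[t]
                        + 0.1)
        return u, y

    u, y = narma10(2000)
    print(f"NARMA-10 target: mean {y.mean():.3f}, std {y.std():.3f}")
    ```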

  10. Process-aware information systems : bridging people and software through process technology

    NARCIS (Netherlands)

    Dumas, M.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.

    2005-01-01

    A unifying foundation to design and implement process-aware information systems. This publication takes on the formidable task of establishing a unifying foundation and set of common underlying principles to effectively model, design, and implement process-aware information systems. Authored by…

  11. Managing information security in a process industrial environment; Gestao de seguranca da informacao em processos industriais

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Raphael Gomes; Aguiar, Leandro Pfleger de [Siemens Company (Brazil)

    2008-07-01

    With the recent expansion of globalization, the exploration of energy resources is crossing country boundaries, resulting in worldwide companies exploring oil and gas fields available in any place of the world. For government bodies, information about those fields should be treated as a matter of national security, with adequate management and protection of all important and critical information and assets, while preserving, at the same time, freedom and transparency in bidding processes. This creates a complex security context to be managed, where information disruption might, for instance, result in a breach of integrity in public auction processes through the use of privileged information. Furthermore, with the problem of terrorism, the process itself becomes an attractive target for different kinds of attacks, motivated by the opportunity to exploit the known inability of large industries to adequately manage their large and complex environments. Among the transformations taking place in productive processes, the growing use of the TCP/IP protocol, the adoption of Windows operating systems in SCADA systems, and the integration of industrial and business networks are factors that contribute to an imminent landscape of problems. This landscape demonstrates the need for organizations and countries operating in energy resource exploration to renew their risk management areas, establishing a single, integrated process to protect the information security infrastructure. This work presents a study of the challenges organizations face while rebuilding their internal processes to integrate the risk management and information security areas, as well as a set of essential steps to establish effective corporate governance of risk management and compliance. Moreover, the work presents the necessary points of government involvement to improve the regulatory aspects…

  12. Scaling the Information Processing Demands of Occupations

    Science.gov (United States)

    Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin

    2011-01-01

    The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…

  13. Web mapping system for complex processing and visualization of environmental geospatial datasets

    Science.gov (United States)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed on the basis of SOA and may be considered complexes of interconnected software tools for working with geospatial data. The report presents a complex web mapping system, including a GIS web client and corresponding OGC services, for working with a geospatial (NetCDF, PostGIS) dataset archive. The GIS web client has three basic tiers:
    1. a tier of geospatial metadata, retrieved from a central MySQL repository and represented in JSON format;
    2. a tier of JavaScript objects implementing methods for handling NetCDF metadata, a Task XML object for configuring user calculations and input and output formats, and OGC WMS/WFS cartographic services;
    3. a graphical user interface (GUI) tier of JavaScript objects implementing the web application business logic.
    The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects implementing methods for handling geospatial…

  14. Physics as Information Processing

    International Nuclear Information System (INIS)

    D'Ariano, Giacomo Mauro

    2011-01-01

    I review some recent advances in foundational research by the Pavia QUIT group. The general idea is that there is only Quantum Theory, without quantization rules, and the whole of Physics - including space-time and relativity - is emergent from quantum-information processing. And since Quantum Theory itself is axiomatized solely on informational principles, the whole of Physics must be reformulated in information-theoretical terms: this is the It from bit of J. A. Wheeler. The review is divided into four parts: a) the informational axiomatization of Quantum Theory; b) how space-time and relativistic covariance emerge from quantum computation; c) the information-theoretical meaning of inertial mass and of ħ, and how the quantum field emerges; d) an observational consequence of the new quantum field theory: a mass-dependent refraction index of the vacuum. I conclude with the research lines to be followed in the immediate future.

  15. A language for information commerce processes

    NARCIS (Netherlands)

    Aberer, Karl; Wombacher, Andreas

    Automating information commerce requires languages to represent typical information commerce processes. Existing languages and standards either cover only very specific types of business models or are too general to capture concisely the specific properties of information commerce…

  16. Stroop performance in multiple sclerosis: information processing, selective attention, or executive functioning?

    Science.gov (United States)

    Macniven, J A B; Davis, C; Ho, M-Y; Bradshaw, C M; Szabadi, E; Constantinescu, C S

    2008-09-01

    Cognitive impairments in information processing speed, attention and executive functioning are widely reported in patients with multiple sclerosis (MS). Several studies have identified impaired performance on the Stroop test in people with MS, yet uncertainty remains about the cause of this phenomenon. In this study, 25 patients with MS were assessed with a neuropsychological test battery including a computerized Stroop test and a computerized test of information processing speed, the Graded Conditional Discrimination Tasks (GCDT). The patient group was compared with a healthy control group individually matched for age, sex and estimated premorbid IQ. The patients' reaction times (RTs) were significantly longer than those of the controls on all Stroop test trials, and there was a significantly enhanced absolute (RT(incongruent)-RT(neutral)) and relative (100 x [RT(incongruent)-RT(neutral)]/RT(neutral)) Stroop interference effect in the MS group. The linear function relating RT to stimulus complexity in the GCDT was significantly steeper in the patient group, indicating slowed information processing. The results are discussed with reference to the difference engine model, a theory of diversity in speeded cognition. It is concluded that, in the assessment of people with MS, great caution must be used in interpreting performance on neuropsychological tests that rely on RT as the primary measure.

  17. Identification of Functional Information Subgraphs in Complex Networks

    International Nuclear Information System (INIS)

    Bettencourt, Luis M. A.; Gintautas, Vadas; Ham, Michael I.

    2008-01-01

    We present a general information theoretic approach for identifying functional subgraphs in complex networks. We show that the uncertainty in a variable can be written as a sum of information quantities, where each term is generated by successively conditioning mutual informations on new measured variables in a way analogous to a discrete differential calculus. The analogy to a Taylor series suggests efficient optimization algorithms for determining the state of a target variable in terms of functional groups of other nodes. We apply this methodology to electrophysiological recordings of cortical neuronal networks grown in vitro. Each cell's firing is generally explained by the activity of a few neurons. We identify these neuronal subgraphs in terms of their redundant or synergetic character and reconstruct neuronal circuits that account for the state of target cells
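
    The decomposition described is the chain rule for mutual information: H(X) = I(X;Y1) + I(X;Y2|Y1) + ... + H(X|Y1,...,Yn), each term obtained by conditioning on one more measured variable. A sketch verifying the identity on a small synthetic joint distribution, where the variables stand in for a target neuron and two candidate explainers:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic joint distribution over a target X and two candidate
    # predictors Y1, Y2 (all binary); axes of p are (X, Y1, Y2).
    p = rng.random((2, 2, 2))
    p /= p.sum()

    def H(axes_kept):
        """Entropy (bits) of the marginal of p over the given axes."""
        marg = p.sum(axis=tuple(i for i in range(3) if i not in axes_kept))
        marg = marg[marg > 0]
        return float(-(marg * np.log2(marg)).sum())

    # Chain-rule decomposition: H(X) = I(X;Y1) + I(X;Y2|Y1) + H(X|Y1,Y2)
    I_x_y1 = H((0,)) + H((1,)) - H((0, 1))
    I_x_y2_given_y1 = H((0, 1)) + H((1, 2)) - H((1,)) - H((0, 1, 2))
    H_x_given_both = H((0, 1, 2)) - H((1, 2))
    print(f"H(X)                 = {H((0,)):.4f} bits")
    print(f"sum of decomposition = {I_x_y1 + I_x_y2_given_y1 + H_x_given_both:.4f} bits")
    ```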

  18. A process framework for information security management

    Directory of Open Access Journals (Sweden)

    Knut Haufe

    2016-01-01

    Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach for establishing, implementing, operating, monitoring, reviewing, maintaining and improving an organization's information security. Key elements of the operation of an ISMS are its processes. In spite of their importance, however, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analyses of information security investments, whether for single protective measures or for ISMS processes, are likewise not the focus of current research, which concentrates mostly on economics. This article aims to fill this research gap by proposing such an ISMS process framework as its main contribution, based on a set of agreed-upon ISMS processes found in existing standards like the ISO 27000 series, COBIT and ITIL. Within the framework, the identified processes are described and their interactions and interfaces are specified. The framework helps to focus on the operation of the ISMS instead of on individual measures and controls. As a main finding, this strengthens the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS.

  19. The Importance of Water for High Fidelity Information Processing and for Life

    Science.gov (United States)

    Hoehler, Tori M.; Pohorille, Andrew

    2011-01-01

    Is water an absolute prerequisite for life? Life depends on a variety of non-covalent interactions among molecules, the nature of which is determined as much by the solvent in which they occur as by the molecules themselves. Catalysis and information processing, two essential functions of life, require non-covalent molecular recognition with very high specificity. For example, to correctly reproduce a string consisting of 600,000 units of information (e.g., 600 kilobases, equivalent to the genome of the smallest free-living terrestrial organisms) with a 90% success rate requires specificity > 10^7:1 for the target molecule vs. incorrect alternatives. Such specificity requires (i) that the correct molecular association is energetically stabilized by at least 40 kJ/mol relative to alternatives, and (ii) that the system is able to sample among possible states (alternative molecular associations) rapidly enough to allow the system to fall under thermodynamic control and express the energetic stabilization. We argue that electrostatic interactions are required to confer the necessary energetic stabilization vs. a large library of molecular alternatives, and that a solvent with polarity and dielectric properties comparable to water is required for the system to sample among possible states and express thermodynamic control. Electrostatic associations can be made in non-polar solvents, but the resulting complexes are too stable to be "unmade" with sufficient frequency to confer thermodynamic control on the system. An electrostatic molecular complex representing 3 units of information (e.g., 3 base pairs) with specificity > 10^7 per unit has a stability in non-polar solvent comparable to that of a carbon-carbon bond at room temperature. These considerations suggest that water, or a solvent with properties very like water, is necessary to support high-fidelity information processing, and can therefore be considered a critical prerequisite for life.
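
    The abstract's figures can be checked directly: the per-unit error tolerated by a 90% overall success rate fixes the required specificity, and dG = RT ln(specificity) gives the needed stabilization. A sketch, with T = 298 K assumed:

    ```python
    import math

    units = 600_000          # information units to copy (e.g., bases)
    overall_success = 0.90   # target probability of a fully correct copy

    # Per-unit success p must satisfy p**units = overall_success.
    p = overall_success ** (1 / units)
    specificity = 1 / (1 - p)            # correct vs. incorrect selections
    print(f"required specificity ~ {specificity:.2e} : 1")

    # Energetic stabilization needed to express this selectivity at
    # equilibrium: dG = R*T*ln(specificity), with T = 298 K assumed.
    R, T = 8.314, 298.0
    dG = R * T * math.log(specificity) / 1000.0   # kJ/mol
    print(f"required stabilization ~ {dG:.0f} kJ/mol")
    ```

    This lands within rounding of the abstract's order-of-magnitude claims: a specificity of roughly 10^6 to 10^7 per selection and about 40 kJ/mol of stabilization.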

  20. Understanding the implementation of complex interventions in health care: the normalization process model

    Directory of Open Access Journals (Sweden)

    Rogers Anne

    2007-09-01

    Full Text Available Abstract Background The Normalization Process Model is a theoretical model that assists in explaining the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and also for comparative studies of complex interventions. It focuses on the factors that promote or inhibit the routine embedding of complex interventions in health care practice. Methods A formal theory structure is used to define the model, and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate in generating accurate description, systematic explanation, and the production of rational knowledge claims about the workability and integration of complex interventions. Results The model explains the normalization of complex interventions by reference to four factors demonstrated to promote or inhibit the operationalization and embedding of complex interventions (interactional workability, relational integration, skill-set workability, and contextual integration). Conclusion The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, and policy-makers who call for a proper understanding of implementation processes, emphasize the value of conceptual tools like the Normalization Process Model.

  1. Procurement of complex performance in public infrastructure: a process perspective

    OpenAIRE

    Hartmann, Andreas; Roehrich, Jens; Davies, Andrew; Frederiksen, Lars; Davies, J.; Harrington, T.; Kirkwood, D.; Holweg, M.

    2011-01-01

    The paper analyzes the process of transitioning from procuring single products and services to procuring complex performance in public infrastructure. The aim is to examine the change in the interactions between buyer and supplier, the emergence of value co-creation and the capability development during the transition process. Based on a multiple, longitudinal case study the paper proposes three generic transition stages towards increased performance and infrastructural complexity. These stag...

  2. Habitat Complexity in Aquatic Microcosms Affects Processes Driven by Detritivores.

    Directory of Open Access Journals (Sweden)

    Lorea Flores

    Full Text Available Habitat complexity can influence predation rates (e.g., by providing refuge), but other ecosystem processes and species interactions might also be modulated by the properties of habitat structure. Here, we focussed on how complexity of artificial habitat (plastic plants) in microcosms influenced short-term processes driven by three aquatic detritivores. The effects of habitat complexity on leaf decomposition, production of fine organic matter and pH levels were explored by measuring complexity in three ways: 1. as the presence vs. absence of habitat structure; 2. as the amount of structure (3 or 4.5 g of plastic plants); and 3. as the spatial configuration of structures (measured as fractal dimension). The experiment also addressed potential interactions among the consumers by running all possible species combinations. In the experimental microcosms, habitat complexity influenced how species performed, especially when comparing structure present vs. structure absent. Treatments with structure showed higher fine particulate matter production and lower pH compared to treatments without structures, and this was probably due to higher digestion and respiration when structures were present. When we explored the effects of the different complexity levels, we found that the amount of structure added explained more than the fractal dimension of the structures. We give a detailed overview of the experimental design, statistical models and R codes, because our statistical analysis can be applied to other study systems (and disciplines such as restoration ecology). We further make suggestions of how to optimise statistical power when artificially assembling, and analysing, 'habitat complexity' by not confounding complexity with the amount of structure added. In summary, this study highlights the importance of habitat complexity for energy flow and the maintenance of ecosystem processes in aquatic ecosystems.

  3. The Application of Karl Popper's Three Worlds Schema to Questions about Information in the Fields of Complexity, Cybernetics, and Informatics

    Directory of Open Access Journals (Sweden)

    Paul D. Nugent

    2015-04-01

    Full Text Available More technically leaning disciplines such as informatics, complexity theory, and cybernetics often make simplifying assumptions about human beings and their causal/informational roles within larger techno-social systems. This paper employs the philosopher Karl Popper's three worlds schema to explore in depth the unique ways in which conscious human subjects process and create knowledge and information. The three worlds represent the physical world, the subjective world of the conscious subject, and the world of language, models, and schemas. The works of major philosophers are invoked to consider what makes conscious human subjects unique in the context of information systems. Context-based understandings, the expressive facet of consciousness, and experience-based valuing emerge as key themes that we believe could strengthen the fields of informatics, complexity theory, and cybernetics.

  4. Cerebro-cerebellar interactions underlying temporal information processing.

    Science.gov (United States)

    Aso, Kenji; Hanakawa, Takashi; Aso, Toshihiko; Fukuyama, Hidenao

    2010-12-01

    The neural basis of temporal information processing remains unclear, but it is proposed that the cerebellum plays an important role through its internal clock or feed-forward computation functions. In this study, fMRI was used to investigate the brain networks engaged in perceptual and motor aspects of subsecond temporal processing without accompanying coprocessing of spatial information. Direct comparison between perceptual and motor aspects of time processing was made with a categorical-design analysis. The right lateral cerebellum (lobule VI) was active during a time discrimination task, whereas the left cerebellar lobule VI was activated during a timed movement generation task. These findings were consistent with the idea that the cerebellum contributed to subsecond time processing in both perceptual and motor aspects. The feed-forward computational theory of the cerebellum predicted increased cerebro-cerebellar interactions during time information processing. In fact, a psychophysiological interaction analysis identified the supplementary motor and dorsal premotor areas, which had a significant functional connectivity with the right cerebellar region during a time discrimination task and with the left lateral cerebellum during a timed movement generation task. The involvement of cerebro-cerebellar interactions may provide supportive evidence that temporal information processing relies on the simulation of timing information through feed-forward computation in the cerebellum.

  5. Information-Theoretic Approaches for Evaluating Complex Adaptive Social Simulation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Ganguly, Auroop R [ORNL; Jiao, Yu [ORNL

    2009-01-01

    In this paper, we propose information-theoretic approaches for comparing and evaluating complex agent-based models. In information-theoretic terms, entropy and mutual information are two measures of system complexity. We used entropy as a measure of the regularity of the number of agents in a social class, and mutual information as a measure of information shared by two social classes. Using our approaches, we compared two analogous agent-based (AB) models developed for a regional-scale social-simulation system. The first AB model, called ABM-1, is a complex AB model built with 10,000 agents on a desktop environment that used aggregate data; the second AB model, ABM-2, was built with 31 million agents on a high-performance computing framework located at Oak Ridge National Laboratory and used fine-resolution data from the LandScan Global Population Database. The initializations were slightly different, with ABM-1 using samples from a probability distribution and ABM-2 using polling data from Gallup for a deterministic initialization. The geographical and temporal domain was present-day Afghanistan, and the end result was the number of agents with one of three behavioral modes (pro-insurgent, neutral, and pro-government) corresponding to the population mindshare. The theories embedded in each model were identical, and the test simulations focused on a test of three leadership theories - legitimacy, coercion, and representative - and two social mobilization theories - social influence and repression. The theories are tied together using the Cobb-Douglas utility function. Based on our results, the hypothesis that performance measures can be developed to compare and contrast AB models appears to be supported. Furthermore, we observed significant bias in the two models. Even so, further tests and investigations are required, not only with a wider class of theories and AB models, but also with additional observed or simulated data and more comprehensive performance measures.
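    Entropy and mutual information as used above are standard estimators. A minimal sketch of how they might be computed from two models' discretized class-count series; the data here are synthetic stand-ins, not the ORNL simulation output:

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of a discrete sample sequence."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete sequences."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

# Illustrative stand-ins for per-timestep agent counts in a social class,
# discretized into bins, for two analogous models.
rng = np.random.default_rng(0)
abm1 = rng.integers(0, 10, size=1000)                # hypothetical ABM-1 series
abm2 = (abm1 + rng.integers(0, 3, size=1000)) % 10   # correlated ABM-2 series

print(f"H(ABM-1)       = {entropy(abm1):.3f} bits")
print(f"H(ABM-2)       = {entropy(abm2):.3f} bits")
print(f"I(ABM-1;ABM-2) = {mutual_information(abm1, abm2):.3f} bits")
```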

  6. PHYSICAL RESOURCES OF INFORMATION PROCESSES AND TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Mikhail O. Kolbanev

    2014-11-01

    Full Text Available Subject of study. The paper describes basic information technologies for automating the information processes of data storage, distribution and processing in terms of the physical resources they require. It is shown that studying these processes only through such traditional objectives of modern computer science as knowledge transfer, degree of automation, information security, coding and reliability is not sufficient. The reasons are, on the one hand, the increase in the volume and intensity of information exchange in the subject areas of human activity and, on the other hand, the approach of semiconductor-based information systems to their efficiency limit. Creating technologies that not only support information interaction but also consume a rational amount of physical resources has become a pressing problem of modern engineering development. Thus, basic information technologies for storage, distribution and processing of information to support the interaction between people are the object of study, and the physical temporal, spatial and energy resources required for implementation of these technologies are the subject of study. Approaches. An attempt is made to extend the traditional methodology of cybernetics, which replaces consideration of the material component of information with a search over the states of information objects, by explicitly taking into account the amount of physical resources required to change the states of information media. Purpose of study. The paper works out a common approach to the comparison and subsequent selection of basic information technologies for storage, distribution and processing of data, taking into account not only the requirements for the quality of information exchange in a particular subject area and the degree of technology application, but also the amounts of physical resources consumed. Main findings. Classification of resources

  7. Conjoint Management of Business Processes and Information Technologies

    DEFF Research Database (Denmark)

    Siurdyban, Artur

    and improve business processes. As a consequence, there is a growing need to address managerial aspects of the relationships between information technologies and business processes. The aim of this PhD study is to investigate how the practice of conjoint management of business processes and information...... technologies can be supported and improved. The study is organized into five research papers and this summary. Each paper addresses a different aspect of conjoint management of business processes and information technologies, i.e. problem development and managerial practices on software...... and information technologies in a project environment. It states that both elements are intrinsically related and should be designed and considered together. The second case examines the relationships between information technology management and business process management. It discusses the multi-faceted role...

  8. Tailored information for cancer patients on the Internet: effects of visual cues and language complexity on information recall and satisfaction.

    NARCIS (Netherlands)

    Weert, J.C.M. van; Noort, G. van; Bol, N.; Dijk, L. van; Tates, K.; Jansen, J.

    2011-01-01

    Objective: This study was designed to investigate the effects of visual cues and language complexity on satisfaction and information recall using a personalised website for lung cancer patients. In addition, age effects were investigated. Methods: An experiment using a 2 (complex vs. non-complex

  9. Tailored information for cancer patients on the Internet: effects of visual cues and language complexity on information recall and satisfaction

    NARCIS (Netherlands)

    van Weert, J.C.M.; van Noort, G.; Bol, N.; van Dijk, L.; Tates, K.; Jansen, J.

    2011-01-01

    Objective This study was designed to investigate the effects of visual cues and language complexity on satisfaction and information recall using a personalised website for lung cancer patients. In addition, age effects were investigated. Methods An experiment using a 2 (complex vs. non-complex

  10. Synergistic Information Processing Encrypts Strategic Reasoning in Poker.

    Science.gov (United States)

    Frey, Seth; Albino, Dominic K; Williams, Paul L

    2018-06-14

    There is a tendency in decision-making research to treat uncertainty only as a problem to be overcome. But it is also a feature that can be leveraged, particularly in social interaction. Comparing the behavior of profitable and unprofitable poker players, we reveal a strategic use of information processing that keeps decision makers unpredictable. To win at poker, a player must exploit public signals from others. But using public inputs makes it easier for an observer to reconstruct that player's strategy and predict his or her behavior. How should players trade off between exploiting profitable opportunities and remaining unexploitable themselves? Using a recent multivariate approach to information theoretic data analysis and 1.75 million hands of online two-player No-Limit Texas Hold'em, we find that the important difference between winning and losing players is not in the amount of information they process, but how they process it. In particular, winning players are better at integrative information processing: creating new information from the interaction between their cards and their opponents' signals. We argue that integrative information processing does not just produce better decisions, it makes decision-making harder for others to reverse engineer, as an expert poker player's cards act like the private key in public-key cryptography. Poker players encrypt their reasoning with the way they process information. The encryption function of integrative information processing makes it possible for players to exploit others while remaining unexploitable. By recognizing the act of information processing as a strategic behavior in its own right, we offer a detailed account of how experts use endemic uncertainty to conceal their intentions in high-stakes competitive environments, and we highlight new opportunities between cognitive science, information theory, and game theory.
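    The synergy described above can be illustrated with interaction information, one multivariate information measure; XOR is the textbook synergistic relationship. A toy sketch (ours, not the paper's analysis pipeline, which uses a finer-grained multivariate decomposition):

```python
import numpy as np
from collections import Counter

def H(*cols):
    """Joint Shannon entropy (bits) of one or more discrete columns."""
    joint = list(zip(*cols))
    counts = np.array(list(Counter(joint).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
cards = rng.integers(0, 2, 100_000)   # stand-in for private information
signal = rng.integers(0, 2, 100_000)  # stand-in for an opponent's public signal
action = cards ^ signal               # a decision built from their *interaction*

# Information each source carries about the action, alone and jointly.
i_cards = H(action) + H(cards) - H(action, cards)                  # ~0 bits
i_signal = H(action) + H(signal) - H(action, signal)               # ~0 bits
i_joint = H(action) + H(cards, signal) - H(action, cards, signal)  # ~1 bit

print(f"synergy ~ {i_joint - i_cards - i_signal:.3f} bits")  # ~1 bit for XOR
```

    Each input alone carries no information about the action, yet together they determine it completely; this is the sense in which an observer who sees only one input cannot reverse engineer the decision.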

  11. Animal models for information processing during sleep

    NARCIS (Netherlands)

    Coenen, A.M.L.; Drinkenburg, W.H.I.M.

    2002-01-01

    Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced during sleep compared to wakefulness. The process controlling this reduction is called `sensory' gating and evidence exists that the underlying neurophysiological processes take

  12. Quantum information processing beyond ten ion-qubits

    International Nuclear Information System (INIS)

    Monz, T.

    2011-01-01

    Successful processing of quantum information is, to a large degree, based on two aspects: a) the implementation of high-fidelity quantum gates, as well as b) avoiding or suppressing decoherence processes that destroy quantum information. The presented work shows our progress in the field of experimental quantum information processing over the last years: the implementation and characterisation of several quantum operations, amongst others the first realisation of the quantum Toffoli gate in an ion-trap based quantum computer. The creation of entangled states with up to 14 qubits serves as basis for investigations of decoherence processes. Based on the realised quantum operations as well as the knowledge about dominant noise processes in the employed apparatus, entanglement swapping as well as quantum operations within a decoherence-free subspace are demonstrated. (author) [de

  13. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    Full Text Available This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far but many of them lack a meaningful interpretation, e.g., we want to examine which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between some selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure called the topological information content of a graph and some others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, a high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.
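    The topological information content mentioned above is classically defined as the entropy of the partition of vertices into automorphism orbits. As a hedged sketch, the same entropy formula can be evaluated over degree classes, a cheap proxy partition (networkx assumed available; the function name is ours):

```python
import math
import networkx as nx

def partition_information_content(G):
    """Entropy (bits) of a vertex partition. Degree classes are used here as
    a cheap stand-in for the automorphism orbits of the classical
    topological information content."""
    classes = {}
    for v in G:
        d = G.degree(v)
        classes[d] = classes.get(d, 0) + 1
    n = G.number_of_nodes()
    return -sum((c / n) * math.log2(c / n) for c in classes.values())

# A path has distinguishable end and interior vertices; a cycle does not.
print(partition_information_content(nx.path_graph(6)))   # ~0.918 bits
print(partition_information_content(nx.cycle_graph(6)))  # 0.0 (fully symmetric)
```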

  14. Characterization of ESIPT reactions with instant spectra of fluorescence and complexation processes

    International Nuclear Information System (INIS)

    Tomin, Vladimir I.; Ushakou, Dzmitryi V.

    2016-01-01

    Proton transfer processes, and especially excited-state intramolecular proton transfer (ESIPT), are of interest not only in physical studies but in a wide range of biological and chemical research, since they play an important role in different fundamental reactions. Moreover, the occurrence of ESIPT very often produces two-band emission spectra corresponding to the normal and photoproduct (tautomer) forms of the molecular structure. This makes it possible to measure microcharacteristics in chemical and biological research by using substances with ESIPT as molecular probes, because their dual emission is very sensitive to the parameters of the microenvironment. The dual fluorescence signal is also very convenient for two-wavelength ratiometric measurements, as these are more sensitive to variations of sample characteristics. Recently, a new approach for revealing the type of excited-state reaction, based on the analysis of dynamic changes of relative intensities in the instant fluorescence spectra of ESIPT solutes, was suggested and tested for neat solutions. Here we generalize this method to solutions in which the ESIPT solute may also form fluorescent complexes. We demonstrate that the relative intensities of instant fluorescence spectra registered with high time resolution yield valuable information on the type of excited-state reaction when the dye can also undergo complexation reactions with ions in the solvent. In addition, we show how complexation characteristics, such as the stability constant and the complexation efficiency, can be determined in this case.

  15. The Effects of Using Multimedia Presentations and Modular Worked-Out Examples as Instructional Methodologies to Manage the Cognitive Processing Associated with Information Literacy Instruction at the Graduate and Undergraduate Levels of Nursing Education

    Science.gov (United States)

    Calhoun, Shawn P.

    2012-01-01

    Information literacy is a complex knowledge domain. Cognitive processing theory describes the effects an instructional subject and the learning environment have on working memory. Essential processing is one component of cognitive processing theory that explains the inherent complexity of knowledge domains such as information literacy. Prior…

  16. Mission informed needed information: discoverable, available sensing sources (MINI-DASS): the operators and process flows the magic rabbits must negotiate

    Science.gov (United States)

    Kolodny, Michael A.

    2017-05-01

    Today's battlefield space is extremely complex, dealing with an enemy that is neither well-defined nor well-understood. Adversaries are comprised of widely-distributed, loosely-networked groups engaging in nefarious activities. Situational understanding is needed by decision makers; understanding of adversarial capabilities and intent is essential. Information needed at any time is dependent on the mission/task at hand. Information sources potentially providing mission-relevant information are disparate and numerous; they include sensors, social networks, fusion engines, internet, etc. Management of these multi-dimensional informational sources is critical. This paper will present a new approach being undertaken to answer the challenge of enhancing battlefield understanding by optimizing the utilization of available informational sources (means) to required missions/tasks as well as determining the "goodness" of the information acquired in meeting the capabilities needed. Requirements are usually expressed in terms of a presumed technology solution (e.g., imagery). A metaphor of the "magic rabbits" was conceived to remove presumed technology solutions from requirements by claiming the "required" technology is obsolete. Instead, intelligent "magic rabbits" are used to provide needed information. The question then becomes: "WHAT INFORMATION DO YOU NEED THE RABBITS TO PROVIDE YOU?" This paper will describe a new approach called Mission-Informed Needed Information - Discoverable, Available Sensing Sources (MINI-DASS) that designs a process that builds information acquisition missions and determines what the "magic rabbits" need to provide in a manner that is machine understandable. Also described is the Missions and Means Framework (MMF) model used, the process flow utilized, the approach to developing an ontology of information source means and the approach for determining the value of the information acquired.

  17. Introduction to information processing

    CERN Document Server

    Dietel, Harvey M

    2014-01-01

    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers for the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte

  18. Environmental Complexity Related Information for the Assessment of Country Logistics Environments

    DEFF Research Database (Denmark)

    Kinra, Aseem

    2015-01-01

    logistics assessment generates some of this information, its relevance for the decision makers, and relationship to their unpredictability from foreign national logistics systems remains indefinite. This paper identifies and categorises the relevant, available information on country logistics environments...... by using a content analysis approach. We demonstrate the immensity and nature of this information, are able to confirm the changing spatial transaction cost structures, and to reflect upon the overall conditions of information-related complexity and globalisation in the environment. Besides making...

  19. Wavelet analysis of molecular dynamics: Efficient extraction of time-frequency information in ultrafast optical processes

    International Nuclear Information System (INIS)

    Prior, Javier; Castro, Enrique; Chin, Alex W.; Almeida, Javier; Huelga, Susana F.; Plenio, Martin B.

    2013-01-01

    New experimental techniques based on nonlinear ultrafast spectroscopies have been developed over the last few years, and have been demonstrated to provide powerful probes of quantum dynamics in different types of molecular aggregates, including both natural and artificial light harvesting complexes. Fourier transform-based spectroscopies have been particularly successful, yet “complete” spectral information normally necessitates the loss of all information on the temporal sequence of events in a signal. This information, though, is particularly important in transient or multi-stage processes, in which the spectral decomposition of the data evolves in time. By going through several examples of ultrafast quantum dynamics, we demonstrate that the use of wavelets provides an efficient and accurate way to simultaneously acquire both temporal and frequency information about a signal, and argue that this greatly aids the elucidation and interpretation of the physical processes responsible for non-stationary spectroscopic features, such as those encountered in coherent excitonic energy transport
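    The point about simultaneous time and frequency information can be made in a few lines of numpy: a continuous wavelet transform with a complex Morlet wavelet localizes a chirp's frequency content in time, which a plain Fourier transform cannot. This is an illustrative sketch, not the authors' analysis code:

```python
import numpy as np

def morlet_cwt(signal, dt, freqs, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet
    (admissibility correction omitted for brevity).
    Returns an array of shape (len(freqs), len(signal))."""
    n = len(signal)
    t = (np.arange(n) - n / 2) * dt
    out = np.empty((len(freqs), n), dtype=complex)
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)                # scale matching frequency f
        psi = np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2) / np.sqrt(s)
        # Correlate the signal with the scaled wavelet.
        out[i] = np.convolve(signal, np.conj(psi)[::-1], mode="same") * dt
    return out

# A chirp: 5 Hz in the first second, 20 Hz in the second.
dt = 1e-3
t = np.arange(0, 2, dt)
x = np.where(t < 1, np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 20 * t))
power = np.abs(morlet_cwt(x, dt, freqs=np.linspace(2, 30, 57))) ** 2
# Each row of `power` localizes one frequency in time: the 5 Hz row is
# strong only early, the 20 Hz row only late, unlike a plain FFT spectrum.
```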

  20. Simulating Complex, Cold-region Process Interactions Using a Multi-scale, Variable-complexity Hydrological Model

    Science.gov (United States)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2017-12-01

    Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated due to frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally-increased soil moisture, thus increasing plant growth that in turn subsequently impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations sourced from long term process studies and the current cold regions literature allows for comparison of process representations and, importantly, of their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner can yield important insights and aid in the development of improved process representations.

  1. Video Analysis and Remote Digital Ethnography: Approaches to understanding user perspectives and processes involving healthcare information technology.

    Science.gov (United States)

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues health professionals and consumers encounter while using systems as well as potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into redesign and optimization of improved healthcare information systems.

  2. Genetic effects on information processing speed are moderated by age--converging results from three samples.

    Science.gov (United States)

    Ising, M; Mather, K A; Zimmermann, P; Brückl, T; Höhne, N; Heck, A; Schenk, L A; Rujescu, D; Armstrong, N J; Sachdev, P S; Reppermund, S

    2014-06-01

    Information processing is a cognitive trait forming the basis of complex abilities like executive function. The Trail Making Test (TMT) is a well-established test of information processing with moderate to high heritability. Age of the individual also plays an important role. A number of genetic association studies with the TMT have been performed, which, however, did not consider age as a moderating factor. We report the results of genome-wide association studies (GWASs) on age-independent and age-dependent TMT performance in two population-representative community samples (Munich Antidepressant Response Signature, MARS: N1 = 540; Ludwig Maximilians University, LMU: N2 = 350). Age-dependent genome-wide findings were then evaluated in a third sample of healthy elderly subjects (Sydney Memory and Ageing Study, Sydney MAS: N3 = 448). While a meta-analysis on the GWAS findings did not reveal age-independent TMT associations withstanding correction for multiple testing, we found a genome-wide significant age-moderated effect between variants in the DSG1 gene region and TMT-A performance predominantly reflecting visual processing speed (rs2199301, P(meta-analysis) = 1.3 × 10^-7). The direction of the interaction suggests for the minor allele a beneficial effect in younger adults turning into a detrimental effect in older adults. The detrimental effect of the missense single nucleotide polymorphism rs1426310 within the same DSG1 gene region could be replicated in Sydney MAS participants aged 70-79, but not in those aged 80 years and older, presumably a result of survivor bias. Our findings demonstrate opposing effects of DSG1 variants on information processing speed depending on age, which might be related to the complex processes that DSG1 is involved with, including cell adhesion and apoptosis.

  3. Uncertainty Reduction for Stochastic Processes on Complex Networks

    Science.gov (United States)

    Radicchi, Filippo; Castellano, Claudio

    2018-05-01

    Many real-world systems are characterized by stochastic dynamical rules where a complex network of interactions among individual elements probabilistically determines their state. Even with full knowledge of the network structure and of the stochastic rules, the ability to predict system configurations is generally characterized by a large uncertainty. Selecting a fraction of the nodes and observing their state may help to reduce the uncertainty about the unobserved nodes. However, choosing these points of observation in an optimal way is a highly nontrivial task, depending on the nature of the stochastic process and on the structure of the underlying interaction pattern. In this paper, we introduce a computationally efficient algorithm to determine quasioptimal solutions to the problem. The method leverages network sparsity to reduce computational complexity from exponential to almost quadratic, thus allowing the straightforward application of the method to mid-to-large-size systems. Although the method is exact only for equilibrium stochastic processes defined on trees, it turns out to be effective also for out-of-equilibrium processes on sparse loopy networks.
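    As a toy illustration of the problem setting (not the authors' algorithm, which exploits network sparsity analytically), the value of observing a node can be scored by how much it reduces the entropy of the full configuration, estimated from Monte Carlo samples of a simple stochastic process; the process and graph below are our own stand-ins:

```python
from collections import Counter

import networkx as nx
import numpy as np

def sample_states(G, beta=0.7, n_samples=20_000, seed=0):
    """Toy stochastic process: activate one random seed node, then let
    activity spread across each edge with probability beta (one sweep)."""
    rng = np.random.default_rng(seed)
    nodes = list(G)
    samples = []
    for _ in range(n_samples):
        state = dict.fromkeys(nodes, 0)
        state[nodes[rng.integers(len(nodes))]] = 1
        for u, v in G.edges():
            if (state[u] or state[v]) and rng.random() < beta:
                state[u] = state[v] = 1
        samples.append(tuple(state[v] for v in nodes))
    return nodes, samples

def H(samples):
    """Empirical Shannon entropy (bits) of sampled configurations."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Score each single observer by the uncertainty left about the rest:
# H(unobserved | observed) = H(all) - H(observed).
G = nx.path_graph(5)
nodes, samples = sample_states(G)
h_all = H(samples)
for i, v in enumerate(nodes):
    print(f"observe {v}: residual uncertainty "
          f"{h_all - H([s[i] for s in samples]):.3f} bits")
```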

  4. Certain aspects of the reactivity of carotenoids. Redox processes and complexation

    International Nuclear Information System (INIS)

    Polyakov, Nikolay E; Leshina, Tatyana V

    2006-01-01

    The published data on the redox reactions of carotenoids, their supramolecular inclusion complexes and the composition, properties and practical application of these complexes are generalised. Special attention is given to the effect of complexation on radical processes involving carotenoids and on the antioxidant activity of carotenoids.

  5. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages that captures the flow of information. The uncertainty of the information is then quantified using Conant's model, an information-theoretic approach. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload
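    Conant's model partitions the information flow through a system into quantities such as throughput, blockage and noise; as a loose, hedged illustration of the throughput part only, the information an operator actually transmits about an input stream can be estimated as I(input; response). The overload model and numbers below are ours:

```python
import numpy as np
from collections import Counter

def entropy(xs):
    """Shannon entropy (bits) of a discrete sample sequence."""
    counts = np.array(list(Counter(xs).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_information(x, y):
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

rng = np.random.default_rng(2)
inputs = rng.integers(0, 8, 10_000)        # 3 bits/symbol of plant signals
# Overloaded operator: correct classification with p = 0.6, else a guess.
correct = rng.random(10_000) < 0.6
responses = np.where(correct, inputs, rng.integers(0, 8, 10_000))

print(f"offered load: {entropy(inputs):.2f} bits/symbol")
print(f"throughput:   {mutual_information(inputs, responses):.2f} bits/symbol")
```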

  6. Information Processing of Trauma.

    Science.gov (United States)

    Hartman, Carol R.; Burgess, Ann W.

    1993-01-01

    This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

  7. Novel Complexity Indicator of Manufacturing Process Chains and Its Relations to Indirect Complexity Indicators

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2017-01-01

    Full Text Available Manufacturing systems can be considered as a network of machines/workstations, where parts are produced in a flow shop or job shop environment, respectively. Such a network of machines/workstations can be depicted as a graph, with machines as nodes and material flow between the nodes as links. The aim of this paper is to use sequences of operations and the machine network to measure the static complexity of manufacturing processes. To this end, existing approaches to measuring the static complexity of manufacturing systems are analyzed and subsequently compared. For this purpose, the analyzed competing complexity indicators were tested on two different manufacturing layout examples. A subsequent analysis showed the relevant potential of the proposed method.
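    The paper's own indicator is not reproduced here, but the general recipe it builds on (operation sequences inducing a machine network whose links carry material flow) admits a simple entropy-style static complexity measure. A sketch with illustrative process chains of our own invention:

```python
import math
from collections import Counter

# Operation sequences (process chains): each part visits machines in order.
chains = [
    ["M1", "M2", "M4"],
    ["M1", "M3", "M4"],
    ["M1", "M2", "M4"],
    ["M2", "M3"],
]

# Count the material flow on each machine-to-machine link.
flows = Counter()
for chain in chains:
    for a, b in zip(chain, chain[1:]):
        flows[(a, b)] += 1

# Entropy-style static complexity: many evenly loaded links = more complex.
total = sum(flows.values())
H = -sum((c / total) * math.log2(c / total) for c in flows.values())
print(f"{len(flows)} links, flow-entropy complexity = {H:.3f} bits")
```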

  8. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.

  9. Strategic Information Processing from Behavioural Data in Iterated Games

    Directory of Open Access Journals (Sweden)

    Michael S. Harré

    2018-01-01

    Full Text Available Iterated games are an important framework of economic theory and application, at least since the original work of Axelrod's computational tournaments of the early 1980s. Recent theoretical results have shown that games (the economic context) and game theory (the decision-making process) are both formally equivalent to computational logic gates. Here these results are extended to behavioural data obtained from an experiment in which rhesus monkeys sequentially played thousands of rounds of the "matching pennies" game, an empirical example similar to Axelrod's tournaments in which algorithms played against one another. The results show that the monkeys exhibit a rich variety of behaviours, both between and within subjects, when playing opponents of varying complexity. Despite earlier suggestions, there is no clear evidence that the win-stay, lose-switch strategy is used; however, there is evidence of non-linear strategy-based interactions between the predictors of future choices. It is also shown that there is consistent evidence, across protocols and across individuals, that the monkeys extract non-Markovian information, i.e., information from more than just the most recent state of the game. This work shows that the use of information theory in game theory can test important hypotheses that would otherwise be more difficult to extract using traditional statistical methods.
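    Whether a player uses more than the most recent state of the game can be tested by asking how much extra information deeper history adds, via conditional mutual information. A sketch on synthetic choice data (not the monkey dataset), where the simulated player genuinely depends on the last two moves:

```python
import numpy as np
from collections import Counter

def H(*cols):
    """Joint Shannon entropy (bits) of one or more discrete columns."""
    counts = np.array(list(Counter(zip(*cols)).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(3)
n = 50_000
prev2 = rng.integers(0, 2, n)
prev1 = rng.integers(0, 2, n)
# Synthetic player whose choice depends on the last *two* moves plus noise.
choice = (prev1 ^ prev2) | (rng.random(n) < 0.1).astype(int)

i_one_step = H(choice) + H(prev1) - H(choice, prev1)
# Extra information one step deeper: I(choice; t-2 | t-1).
i_deeper = (H(choice, prev1) + H(prev2, prev1)
            - H(choice, prev2, prev1) - H(prev1))
print(f"I(choice; t-1)       = {i_one_step:.3f} bits")   # ~0
print(f"I(choice; t-2 | t-1) = {i_deeper:.3f} bits (> 0: non-Markovian)")
```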

  10. Testing an alternate informed consent process.

    Science.gov (United States)

    Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra

    2009-01-01

    One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.

  11. Occurrence reporting and processing of operations information

    International Nuclear Information System (INIS)

    1997-01-01

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations (''Reportable Occurrences''); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations

  12. Occurrence reporting and processing of operations information

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-21

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  13. Levels of Information Processing in a Fitts law task (LIPFitts)

    Science.gov (United States)

    Mosier, K. L.; Hart, S. G.

    1986-01-01

    State-of-the-art flight technology has restructured the task of human operators, decreasing the need for physical and sensory resources, and increasing the quantity of cognitive effort required, changing it qualitatively. Recent technological advances have the most potential for impacting a pilot in two areas: performance and mental workload. In an environment in which timing is critical, additional cognitive processing can cause performance decrements, and increase a pilot's perception of the mental workload involved. The effects of stimulus processing demands on motor response performance and subjective mental workload are examined, using different combinations of response selection and target acquisition tasks. The information processing demands of the response selection were varied (e.g., Sternberg memory set tasks, math equations, pattern matching), as was the difficulty of the response execution. Response latency as well as subjective workload ratings varied in accordance with the cognitive complexity of the task. Movement times varied according to the difficulty of the response execution task. Implications in terms of real-world flight situations are discussed.
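    For reference, Fitts's law predicts movement time from an index of difficulty measured in bits, MT = a + b * log2(2D/W). A small sketch; the coefficients a and b are illustrative placeholders that would normally be fit per subject:

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Fitts's law: MT = a + b * log2(2D / W).
    a (seconds) and b (seconds/bit) are illustrative regression coefficients."""
    index_of_difficulty = math.log2(2 * distance / width)  # bits
    return a + b * index_of_difficulty

# Doubling distance or halving target width adds one bit of difficulty.
print(f"{fitts_movement_time(distance=16, width=2):.3f} s")  # ID = 4 bits
print(f"{fitts_movement_time(distance=32, width=2):.3f} s")  # ID = 5 bits
```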

  14. Processing of complex shapes with single-mode resonant frequency microwave applicators

    International Nuclear Information System (INIS)

    Fellows, L.A.; Delgado, R.; Hawley, M.C.

    1994-01-01

    Microwave processing is an alternative to conventional composite processing techniques. Single-mode microwave applicators efficiently couple microwave energy into the composite. The application of the microwave energy is greatly affected by the geometry of the composite. In the single mode microwave applicator, two types of modes are available. These modes are best suited to processing flat planar samples or cylindrical samples with geometries that align with the electric fields. Mode-switching is alternating between different electromagnetic modes with the intelligent selection of the modes to alleviate undesirable temperature profiles. This method has improved the microwave heating profiles of materials with complex shapes that do not align with either type of electric field. Parts with two different complex geometries were fabricated from a vinyl toluene/vinyl ester resin with a continuous glass fiber reinforcement by autoclaving and by microwave techniques. The flexural properties of the microwave processed samples were compared to the flexural properties of autoclaved samples. The trends of the mechanical properties for the complex shapes were consistent with the results of experiments with flat panels. This demonstrated that mode-switching techniques are as applicable for the complex shapes as they are for the simpler flat panel geometry

  15. The Complexity of Developmental Predictions from Dual Process Models

    Science.gov (United States)

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  16. New Product Development (Npd) Process In Subsidiary: Information Perspectives

    OpenAIRE

    Firmanzah

    2008-01-01

    Information is an important resource for the new product development (NPD) process in a subsidiary. However, research analyzing the NPD process from an information perspective in the subsidiary context is still lacking. This research is exploratory and draws on 8 cases of NPD processes in consumer goods subsidiaries operating in the Indonesian market. Three types of information were identified and analyzed in the NPD process: global, regional and local information. The result of this research ...

  17. Characteristics analysis of acupuncture electroencephalograph based on mutual information Lempel-Ziv complexity

    International Nuclear Information System (INIS)

    Luo Xi-Liu; Wang Jiang; Deng Bin; Wei Xi-Le; Bian Hong-Rui; Han Chun-Xiao

    2012-01-01

    As a convenient approach to the characterization of cerebral cortex electrical information, electroencephalograph (EEG) has potential clinical application in monitoring the acupuncture effects. In this paper, a method composed of the mutual information method and Lempel-Ziv complexity method (MILZC) is proposed to investigate the effects of acupuncture on the complexity of information exchanges between different brain regions based on EEGs. In the experiments, eight subjects are manually acupunctured at ‘Zusanli’ acupuncture point (ST-36) with different frequencies (i.e., 50, 100, 150, and 200 times/min) and the EEGs are recorded simultaneously. First, MILZC values are compared in general. Then average brain connections are used to quantify the effectiveness of acupuncture under the above four frequencies. Finally, significance index P values are used to study the spatiality of the acupuncture effect on the brain. Three main findings are obtained: (i) MILZC values increase during the acupuncture; (ii) manual acupunctures (MAs) with 100 times/min and 150 times/min are more effective than with 50 times/min and 200 times/min; (iii) contralateral hemisphere activation is more prominent than ipsilateral hemisphere's. All these findings suggest that acupuncture contributes to the increase of brain information exchange complexity and the MILZC method can successfully describe these changes. (interdisciplinary physics and related areas of science and technology)
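    The exact MILZC construction is specific to the paper, but its two ingredients are standard. A hedged sketch of a Lempel-Ziv phrase-counting complexity (an LZ78-style parsing of a binarized signal) and histogram-based mutual information between two channels, run on synthetic data rather than acupuncture EEG:

```python
import numpy as np

def lempel_ziv_complexity(bits):
    """Count distinct phrases in an LZ78-style incremental parsing."""
    phrases, phrase = set(), ""
    for ch in "".join(map(str, bits)):
        phrase += ch
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

def binarize(x):
    """Threshold a signal at its median, as is common before LZ analysis."""
    x = np.asarray(x)
    return (x > np.median(x)).astype(int)

def mutual_information_bits(x, y, bins=8):
    """I(X;Y) in bits from a joint histogram of two continuous signals."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(4)
ch1 = rng.standard_normal(5000)                     # stand-in EEG channel
ch2 = 0.6 * ch1 + 0.8 * rng.standard_normal(5000)   # correlated channel
print(lempel_ziv_complexity(binarize(ch1)))
print(f"I(ch1; ch2) = {mutual_information_bits(ch1, ch2):.3f} bits")
```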

  18. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices based on biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from the information processing view.

  19. Information Processing in Nursing Information Systems: An Evaluation Study from a Developing Country.

    Science.gov (United States)

    Samadbeik, Mahnaz; Shahrokhi, Nafiseh; Saremian, Marzieh; Garavand, Ali; Birjandi, Mahdi

    2017-01-01

    In recent years, information technology has been introduced in the nursing departments of many hospitals to support their daily tasks. Nurses are the largest end-user group of Hospital Information Systems (HISs). This study was designed to evaluate data processing in the Nursing Information Systems (NISs) utilized in many university hospitals in Iran. This was a cross-sectional study. The population comprised all nurse managers and NIS users of the five training hospitals in Khorramabad city (N = 71). The nursing subset of the HIS-Monitor questionnaire was used to collect the data. Data were analyzed by the descriptive-analytical method and inductive content analysis. The results indicated that the nurses participating in the study did not make desirable use of paper (2.02) or computerized (2.34) information processing tools to perform nursing tasks. Moreover, the less work experience nurses have, the more they utilize computer tools for processing patient discharge information. The "readability of patient information" and "repetitive and time-consuming documentation" were stated as the most important expectations and problems regarding the HIS by the participating nurses, respectively. The nurses participating in the present study utilized paper and computerized information processing tools together to perform nursing practices. Therefore, it is recommended that the redesign of the nursing process coincide with NIS implementation in the health care centers.

  20. Evidence of complex contagion of information in social media: An experiment using Twitter bots.

    Directory of Open Access Journals (Sweden)

    Bjarke Mønsted

    Full Text Available It has recently become possible to study the dynamics of information diffusion in techno-social systems at scale, due to the emergence of online platforms, such as Twitter, with millions of users. One question that systematically recurs is whether information spreads according to simple or complex dynamics: does each exposure to a piece of information have an independent probability of a user adopting it (simple contagion), or does this probability depend instead on the number of sources of exposure, increasing above some threshold (complex contagion)? Most studies to date are observational and, therefore, unable to disentangle the effects of confounding factors such as social reinforcement, homophily, limited attention, or network community structure. Here we describe a novel controlled experiment that we performed on Twitter using 'social bots' deployed to carry out coordinated attempts at spreading information. We propose two Bayesian statistical models describing simple and complex contagion dynamics, and test the competing hypotheses. We provide experimental evidence that the complex contagion model describes the observed information diffusion behavior more accurately than simple contagion. Future applications of our results include more effective defenses against malicious propaganda campaigns on social media, improved marketing and advertisement strategies, and design of effective network intervention techniques.

  1. Evidence of complex contagion of information in social media: An experiment using Twitter bots.

    Science.gov (United States)

    Mønsted, Bjarke; Sapieżyński, Piotr; Ferrara, Emilio; Lehmann, Sune

    2017-01-01

    It has recently become possible to study the dynamics of information diffusion in techno-social systems at scale, due to the emergence of online platforms, such as Twitter, with millions of users. One question that systematically recurs is whether information spreads according to simple or complex dynamics: does each exposure to a piece of information have an independent probability of a user adopting it (simple contagion), or does this probability depend instead on the number of sources of exposure, increasing above some threshold (complex contagion)? Most studies to date are observational and, therefore, unable to disentangle the effects of confounding factors such as social reinforcement, homophily, limited attention, or network community structure. Here we describe a novel controlled experiment that we performed on Twitter using 'social bots' deployed to carry out coordinated attempts at spreading information. We propose two Bayesian statistical models describing simple and complex contagion dynamics, and test the competing hypotheses. We provide experimental evidence that the complex contagion model describes the observed information diffusion behavior more accurately than simple contagion. Future applications of our results include more effective defenses against malicious propaganda campaigns on social media, improved marketing and advertisement strategies, and design of effective network intervention techniques.
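    The simple/complex distinction tested above is easy to state concretely: under simple contagion each exposure is an independent chance to adopt, while under complex contagion the adoption probability depends on the number of distinct sources, jumping above a threshold. A toy sketch of the two hypotheses with made-up parameters (not the paper's Bayesian models):

```python
def p_adopt_simple(n_sources, p=0.15):
    """Simple contagion: each exposure is an independent Bernoulli trial."""
    return 1 - (1 - p) ** n_sources

def p_adopt_complex(n_sources, threshold=3, low=0.02, high=0.6):
    """Complex contagion: adoption probability jumps at a source threshold."""
    return high if n_sources >= threshold else low

for k in range(1, 6):
    print(f"{k} distinct sources: simple={p_adopt_simple(k):.2f}, "
          f"complex={p_adopt_complex(k):.2f}")
```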

  2. Processing of complex auditory patterns in musicians and nonmusicians.

    Science.gov (United States)

    Boh, Bastiaan; Herholz, Sibylle C; Lappe, Claudia; Pantev, Christo

    2011-01-01

    In the present study we investigated the capacity of the memory store underlying the mismatch negativity (MMN) response in musicians and nonmusicians for complex tone patterns. While previous studies have focused either on the kind of information that can be encoded or on the decay of the memory trace over time, we studied capacity in terms of the length of tone sequences, i.e., the number of individual tones that can be fully encoded and maintained. By means of magnetoencephalography (MEG) we recorded MMN responses to deviant tones that could occur at any position of standard tone patterns composed of four, six or eight tones during passive, distracted listening. Whereas there was a reliable MMN response to deviant tones in the four-tone pattern in both musicians and nonmusicians, only some individuals showed MMN responses to the longer patterns. This finding of a reliable capacity of the short-term auditory store underlying the MMN response is in line with estimates of a three to five item capacity of the short-term memory trace from behavioural studies, although pitch and contour complexity covaried with sequence length, which might have led to an understatement of the reported capacity. Whereas there was a tendency for an enhancement of the pattern MMN in musicians compared to nonmusicians, a strong advantage for musicians could be shown in an accompanying behavioural task of detecting the deviants while attending to the stimuli for all pattern lengths, indicating that long-term musical training differentially affects the memory capacity of auditory short-term memory for complex tone patterns with and without attention. Also, a left-hemispheric lateralization of MMN responses in the six-tone pattern suggests that additional networks that help structuring the patterns in the temporal domain might be recruited for demanding auditory processing in the pitch domain.

  3. Processing of complex auditory patterns in musicians and nonmusicians.

    Directory of Open Access Journals (Sweden)

    Bastiaan Boh

    Full Text Available In the present study we investigated the capacity of the memory store underlying the mismatch negativity (MMN) response in musicians and nonmusicians for complex tone patterns. While previous studies have focused either on the kind of information that can be encoded or on the decay of the memory trace over time, we studied capacity in terms of the length of tone sequences, i.e., the number of individual tones that can be fully encoded and maintained. By means of magnetoencephalography (MEG) we recorded MMN responses to deviant tones that could occur at any position of standard tone patterns composed of four, six or eight tones during passive, distracted listening. Whereas there was a reliable MMN response to deviant tones in the four-tone pattern in both musicians and nonmusicians, only some individuals showed MMN responses to the longer patterns. This finding of a reliable capacity of the short-term auditory store underlying the MMN response is in line with estimates of a three to five item capacity of the short-term memory trace from behavioural studies, although pitch and contour complexity covaried with sequence length, which might have led to an underestimation of the reported capacity. Whereas there was a tendency for an enhancement of the pattern MMN in musicians compared to nonmusicians, a strong advantage for musicians could be shown in an accompanying behavioural task of detecting the deviants while attending to the stimuli for all pattern lengths, indicating that long-term musical training differentially affects the memory capacity of auditory short-term memory for complex tone patterns with and without attention. Also, a left-hemispheric lateralization of MMN responses in the six-tone pattern suggests that additional networks that help structuring the patterns in the temporal domain might be recruited for demanding auditory processing in the pitch domain.

  4. Quantum teleportation for continuous variables and related quantum information processing

    International Nuclear Information System (INIS)

    Furusawa, Akira; Takei, Nobuyuki

    2007-01-01

    Quantum teleportation is one of the most important subjects in quantum information science. This is because quantum teleportation can be regarded as not only quantum information transfer but also a building block for universal quantum information processing. Furthermore, deterministic quantum information processing is very important for efficient processing and it can be realized with continuous-variable quantum information processing. In this review, quantum teleportation for continuous variables and related quantum information processing are reviewed from these points of view

  5. Information Processing in Auto-regulated Systems

    Directory of Open Access Journals (Sweden)

    Karl Javorszky

    2003-06-01

    Full Text Available We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection and once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out what is the case from among the alternatives that could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. This new approach allows a mathematical treatment of some poignant biomathematical problems. The concepts presented in this treatise may also have relevance and applications within the fields of information processing and the theory of language.

  6. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    Science.gov (United States)

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  7. Design Process Control for Improved Surface Finish of Metal Additive Manufactured Parts of Complex Build Geometry

    Directory of Open Access Journals (Sweden)

    Mikdam Jamal

    2017-12-01

    Full Text Available Metal additive manufacturing (AM is increasingly used to create complex 3D components at near net shape. However, the surface finish (SF of the metal AM part is uneven, with surface roughness being variable over the facets of the design. Standard post-processing methods such as grinding and linishing often meet with major challenges in finishing parts of complex shape. This paper reports on research that demonstrated that mass finishing (MF processes are able to deliver high-quality surface finishes (Ra and Sa on AM-generated parts of a relatively complex geometry (both internal features and external facets under select conditions. Four processes were studied in this work: stream finishing, high-energy (HE centrifuge, drag finishing and disc finishing. Optimisation of the drag finishing process was then studied using a structured design of experiments (DOE. The effects of a range of finishing parameters were evaluated and optimal parameters and conditions were determined. The study established that the proposed method can be successfully applied in drag finishing to optimise the surface roughness in an industrial application and that it is an economical way of obtaining the maximum amount of information in a short period of time with a small number of tests. The study has also provided an important step in helping understand the requirements of MF to deliver AM-generated parts to a target quality finish and cycle time.
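
    As a concrete picture of the structured DOE mentioned, here is a hedged sketch: a two-level full-factorial design over invented drag-finishing parameters, fitted with a first-order response model. The factor names, levels, and roughness values are hypothetical, not the paper's data.

    import itertools
    import numpy as np

    factors = {"speed_rpm": (20, 60), "time_min": (30, 90), "media_size_mm": (5, 15)}
    design = np.array(list(itertools.product(*[(-1, 1)] * len(factors))))  # coded units

    ra = np.array([1.8, 1.2, 1.5, 0.9, 1.7, 1.1, 1.4, 0.8])  # hypothetical Ra (um) per run

    X = np.column_stack([np.ones(len(design)), design])  # intercept + main effects
    coef, *_ = np.linalg.lstsq(X, ra, rcond=None)
    for name, c in zip(["intercept"] + list(factors), coef):
        print(f"{name:>14}: {c:+.3f}")  # the most negative coefficients reduce Ra most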

  8. Justification of computational methods to ensure information management systems

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Full Text Available Due to the diversity and complexity of the organizational management tasks of a large enterprise, the construction of an information management system requires the creation of interconnected complexes of means that implement, in the most efficient way, the collection, transfer, accumulation and processing of the information needed by decision makers of different ranks in the governance process. The main trends in the construction of integrated logistics management information systems can be considered to be: the creation of integrated data processing systems by centralizing the storage and processing of data arrays; the organization of computer systems that realize time-sharing; the aggregate-block principle of integrated logistics; and the use of a wide range of peripheral devices with unification of information and hardware communication. The main attention is paid to the application of systematic research of complex technical support, in particular, the definition of quality criteria for the operation of the technical complex, the development of methods for analysing the information base of management information systems and defining the requirements for technical means, as well as methods of structural synthesis of the major subsystems of integrated logistics. Thus, the aim is to study, on the basis of a systematic approach, the integrated logistics management information system and to develop a number of methods for the analysis and synthesis of complex logistics that are suitable for use in the practice of engineering systems design. The objective function of the complex logistics management information system is the task of gathering, transmitting and processing specified amounts of information in the regulated time intervals with the required degree of accuracy while minimizing the reduced costs for the establishment and operation of the technical complex. Achieving the objective function of the complex logistics requires carrying out a certain organization of the interaction of information

  9. Cognitive Structures in Vocational Information Processing and Decision Making.

    Science.gov (United States)

    Nevill, Dorothy D.; And Others

    1986-01-01

    Tested the assumptions that the structural features of vocational schemas affect vocational information processing and career self-efficacy. Results indicated that effective vocational information processing was facilitated by well-integrated systems that processed information along fewer dimensions. The importance of schematic organization on the…

  10. Expectation, information processing, and subjective duration.

    Science.gov (United States)

    Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth

    2018-01-01

    In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing: whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.

  11. Mathematics of Information Processing and the Internet

    Science.gov (United States)

    Hart, Eric W.

    2010-01-01

    The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…

  12. Holledge gauge failure testing using concurrent information processing algorithm

    International Nuclear Information System (INIS)

    Weeks, G.E.; Daniel, W.E.; Edwards, R.E.; Jannarone, R.J.; Joshi, S.N.; Palakodety, S.S.; Qian, D.

    1996-01-01

    For several decades, computerized information processing systems and human information processing models have developed with a good deal of mutual influence. Any comprehensive psychology text in this decade uses terms that originated in the computer industry, such as "cache" and "memory", to describe human information processing. Likewise, many engineers today are using "artificial intelligence" and "artificial neural network" computing tools that originated as models of human thought to solve industrial problems. This paper concerns a recently developed human information processing model, called "concurrent information processing" (CIP), and a related set of computing tools for solving industrial problems. The problem of focus is adaptive gauge monitoring; the application is pneumatic pressure repeaters (Holledge gauges) used to measure liquid level and density in the Defense Waste Processing Facility and the Integrated DWPF Melter System

  13. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    Science.gov (United States)

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  14. Basic disturbances of information processing in psychosis prediction.

    Science.gov (United States)

    Bodatsch, Mitja; Klosterkötter, Joachim; Müller, Ralf; Ruhrmann, Stephan

    2013-01-01

    The basic symptoms (BS) approach provides a valid instrument in predicting psychosis onset and moreover represents a significant heuristic framework for research. The term "basic symptoms" denotes subtle changes of cognition and perception in the earliest and prodromal stages of psychosis development. BS are thought to correspond to disturbances of neural information processing. Following the heuristic implications of the BS approach, the present paper aims at exploring disturbances of information processing, revealed by functional magnetic resonance imaging (fMRI) and electroencephalography, as characteristics of the at-risk state of psychosis. Furthermore, since high-risk studies employing ultra-high-risk criteria revealed non-conversion rates commonly exceeding 50%, thus warranting approaches that increase specificity, the potential contribution of neural information processing disturbances to psychosis prediction is reviewed. In summary, the at-risk state seems to be associated with information processing disturbances. Moreover, fMRI investigations suggested that disturbances of language processing domains might be a characteristic of the prodromal state. Neurophysiological studies revealed that disturbances of sensory processing may assist psychosis prediction in allowing for a quantification of risk in terms of magnitude and time. The latter finding represents a significant advancement, since an estimation of the time to event has not yet been achieved by clinical approaches. Some evidence suggests a close relationship between self-experienced BS and neural information processing. With regard to future research, the relationship between neural information processing disturbances and different clinical risk concepts warrants further investigation. Thereby, a possible time sequence in the prodromal phase might be of particular interest.

  15. Knowledge diffusion in complex networks by considering time-varying information channels

    Science.gov (United States)

    Zhu, He; Ma, Jing

    2018-03-01

    In this article, based on a model of epidemic spreading, we explore the knowledge diffusion process with an innovative mechanism for complex networks by considering time-varying information channels. To cover the knowledge diffusion process in homogeneous and heterogeneous networks, two types of networks (the BA network and the ER network) are investigated. Mean-field theory is used to derive the knowledge diffusion threshold theoretically. Numerical simulation demonstrates that the knowledge diffusion threshold is almost linearly correlated with the mean of the activity rate. In addition, under the influence of the activity rate and distinct from the classic Susceptible-Infected-Susceptible (SIS) model, the density of knowers grows almost linearly with the spreading rate. Finally, in consideration of the ubiquitous mechanism of innovation, we further study the evolution of knowledge in our proposed model. The results suggest that, compared with the effect of the spreading rate, the average knowledge version of the population is affected more by the innovation parameter and the mean of the activity rate. Furthermore, in the BA network, the average knowledge version of individuals with higher degree is always newer than that of those with lower degree.
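
    A minimal simulation in the spirit of the model described, an SIS-style process on a Barabasi-Albert graph where each node's information channel is active only intermittently, is sketched below; all parameter values are illustrative, not the paper's.

    import random
    import networkx as nx

    G = nx.barabasi_albert_graph(n=1000, m=3, seed=1)
    beta, mu = 0.08, 0.02                               # spreading and forgetting rates
    activity = {v: random.uniform(0.1, 0.9) for v in G} # per-node channel activity rate

    knowers = {random.randrange(1000)}
    for _ in range(200):
        new = set(knowers)
        for v in knowers:
            if random.random() > activity[v]:
                continue                                # channel inactive this step
            for u in G.neighbors(v):
                if u not in knowers and random.random() < beta:
                    new.add(u)                          # knowledge transmitted
        for v in list(new):
            if v in knowers and random.random() < mu:
                new.discard(v)                          # knowledge forgotten
        knowers = new
    print("final density of knowers:", len(knowers) / 1000)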

  16. ORGANIZATION OF INFORMATION INTERACTION OF AIRPORT PRODUCTION PROCESSES

    Directory of Open Access Journals (Sweden)

    Yakov Mikhajlovich Dalinger

    2017-01-01

    Full Text Available The organization of service production attributed to airport activity is analyzed. The importance and topicality of solving the problem of information interaction between production processes, as a problem of the organization of modern production, are shown. Possibilities and features of constructing an information interaction system in the form of a multi-level hierarchical structure are shown. The airport is considered as an enterprise aimed at service production, where much information must be analyzed in a limited time-frame. The production schedule often changes under the influence of many factors. This increases the role of computerization and informatization of production processes, which in turn calls for automation of production, the creation of an information environment, and the organization of the information interaction needed for the realization of production processes. The integrated organization form is proposed because it is oriented to the integration of different processes into a universal production system and allows the coordination of the local goals of particular processes in the context of the global purpose of improving the effectiveness of the airport's activity. The main conditions needed for the organization of information interaction between production processes and technological operations are considered, and a list of the corresponding problems is determined. Attention is paid to the necessity of compatibility of the structure and organization of the interaction system with the conditions of the airline, and of its reflection in the information space of the airline. The usefulness of the integrated organization form of information interaction, based on information exchange between processes and service customers according to the network structure, is explained. The multi-level character of this structure confirms its advantage over other forms; however, it also has a series of features, presented in the paper.

  17. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  18. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, a meta-memory is defined to represent a neuron or brain cortex on the basis of biology and graph theory; an intra-modular network is then developed with the modeling algorithm by mapping nodes and edges, and the bi-modular network is delineated from its intra-modular and inter-modular structure. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view. PMID:27876847

  19. Effects of information processing speed on learning, memory, and executive functioning in people living with HIV/AIDS.

    Science.gov (United States)

    Fellows, Robert P; Byrd, Desiree A; Morgello, Susan

    2014-01-01

    It is unclear whether or to what degree literacy, aging, and other neurologic abnormalities relate to cognitive deficits among people living with HIV/AIDS in the combined antiretroviral therapy (CART) era. The primary aim of this study was to simultaneously examine the association of age, HIV-associated motor abnormalities, major depressive disorder, and reading level with information processing speed, learning, memory, and executive functions, and to determine whether processing speed mediated any of the relationships between cognitive and noncognitive variables. Participants were 186 racially and ethnically diverse men and women living with HIV/AIDS who underwent comprehensive neurological, neuropsychological, and medical evaluations. Structural equation modeling was utilized to assess the extent to which information processing speed mediated the relationship between age, motor abnormalities, major depressive disorder, and reading level with other cognitive abilities. Age, motor dysfunction, reading level, and current major depressive disorder were all significantly associated with information processing speed. Information processing speed fully mediated the effects of age on learning, memory, and executive functioning and partially mediated the effect of major depressive disorder on learning and memory. The effect of motor dysfunction on learning and memory was fully mediated by processing speed. These findings provide support for information processing speed as a primary deficit, which may account, at least in part, for many of the other cognitive abnormalities recognized in complex HIV/AIDS populations. The association of age and information processing speed may account for HIV/aging synergies in the generation of CART-era cognitive abnormalities.
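
    The mediation claim has a simple computational core. Below is a toy sketch, not the authors' structural equation model: simulated data in which processing speed fully transmits the effect of age on memory, with the indirect effect recovered as the product of path coefficients.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 186
    age = rng.normal(50, 10, n)
    speed = -0.5 * age + rng.normal(0, 5, n)   # age slows processing speed
    memory = 0.6 * speed + rng.normal(0, 5, n) # speed, not age, drives memory

    def slope(x, y):
        return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

    a = slope(age, speed)                                            # path a: age -> speed
    # Path b: speed -> memory controlling for age (residualize both on age).
    res_speed = speed - speed.mean() - slope(age, speed) * (age - age.mean())
    res_memory = memory - memory.mean() - slope(age, memory) * (age - age.mean())
    b = slope(res_speed, res_memory)                                 # path b: speed -> memory | age
    print("indirect (mediated) effect a*b:", round(a * b, 3))        # expect ~ -0.5 * 0.6 = -0.3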

  20. A tool for filtering information in complex systems

    OpenAIRE

    Tumminello, M.; Aste, T.; Di Matteo, T.; Mantegna, R. N.

    2005-01-01

    We introduce a technique to filter out complex data sets by extracting a subgraph of representative links. Such a filtering can be tuned up to any desired level by controlling the genus of the resulting graph. We show that this technique is especially suitable for correlation-based graphs, giving filtered graphs that preserve the hierarchical organization of the minimum spanning tree but containing a larger amount of information in their internal structure. In particular in the case of planar...
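
    For a feel of this filtering family, the sketch below builds the baseline case, the minimum spanning tree of a correlation-based graph, using the standard distance d_ij = sqrt(2(1 - rho_ij)); the data are random placeholders, and the paper's genus-controlled (e.g., planar) filtered graphs retain more links than the tree.

    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(42)
    returns = rng.normal(size=(500, 20))       # 500 observations, 20 assets (placeholder)
    rho = np.corrcoef(returns, rowvar=False)
    dist = np.sqrt(2.0 * (1.0 - rho))          # correlation-to-distance transform

    G = nx.Graph()
    n = dist.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            G.add_edge(i, j, weight=dist[i, j])
    mst = nx.minimum_spanning_tree(G)          # keeps the strongest hierarchical backbone
    print(mst.number_of_edges(), "edges kept out of", G.number_of_edges())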

  1. Measuring information processing in a client with extreme agitation following traumatic brain injury using the Perceive, Recall, Plan and Perform System of Task Analysis.

    Science.gov (United States)

    Nott, Melissa T; Chapparo, Christine

    2008-09-01

    Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance is assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships between task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation. The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance

  2. Complexity, Methodology and Method: Crafting a Critical Process of Research

    Science.gov (United States)

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  3. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands.

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and either action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments mediated by action processes such as

  4. Complex processing of rubber waste through energy recovery

    Directory of Open Access Journals (Sweden)

    Roman Smelík

    2015-12-01

    Full Text Available This article deals with applied solutions for the complex processing of rubber waste through energy recovery. It deals specifically with a solution that could maximize the use of all rubber waste while creating no additional waste whose disposal would be expensive and dangerous for the environment. The project is economically viable and energy self-sufficient. The outputs of the process could replace natural gas and crude oil products. Another part of the process is the separation of metals, which can be returned to secondary metallurgical production.

  5. Information Systems to Support a Decision Process at Stanford.

    Science.gov (United States)

    Chaffee, Ellen Earle

    1982-01-01

    When a rational decision process is desired, information specialists can contribute information and also contribute to the process in which that information is used, thereby promoting rational decision-making. The contribution of Stanford's information specialists to rational decision-making is described. (MLW)

  6. Complexity characterization in a probabilistic approach to dynamical systems through information geometry and inductive inference

    International Nuclear Information System (INIS)

    Ali, S A; Kim, D-H; Cafaro, C; Giffin, A

    2012-01-01

    Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this paper, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by the use of statistical inductive inference and information geometry. We review the maximum relative entropy formalism and the theoretical structure of the information geometrodynamical approach to chaos on statistical manifolds M_S. Special focus is devoted to a description of the roles played by the sectional curvature K_{M_S}, the Jacobi field intensity J_{M_S} and the information geometrodynamical entropy S_{M_S}. These quantities serve as powerful information-geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on M_S. Finally, the application of such information-geometric techniques to several theoretical models is presented.
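
    For orientation, the underlying geometry is standard: the statistical manifold carries the Fisher-Rao metric, from which curvature and Jacobi fields follow as ordinary Riemannian objects. The sketch below gives these textbook definitions only; it is not the paper's specific computation of K_{M_S}, J_{M_S} or S_{M_S}.

    \[
      g_{ij}(\theta) = \int p(x \mid \theta)\,
          \frac{\partial \ln p(x \mid \theta)}{\partial \theta^{i}}\,
          \frac{\partial \ln p(x \mid \theta)}{\partial \theta^{j}}\, dx ,
    \]
    \[
      \frac{D^{2} J^{a}}{d\tau^{2}}
        + R^{a}{}_{bcd}\, \frac{d\theta^{b}}{d\tau}\, J^{c}\, \frac{d\theta^{d}}{d\tau} = 0 ,
    \qquad
      K(u, v) = \frac{\langle R(u, v)v,\, u \rangle}
                     {\lVert u \rVert^{2} \lVert v \rVert^{2} - \langle u, v \rangle^{2}} .
    \]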

  7. Theory and research in audiology education: understanding and representing complexity through informed methodological decisions.

    Science.gov (United States)

    Ng, Stella L

    2013-05-01

    The discipline of audiology has the opportunity to embark on research in education from an informed perspective, learning from professions that began this journey decades ago. The goal of this article is to position our discipline as a new member in the academic field of health professional education (HPE), with much to learn and contribute. In this article, I discuss the need for theory in informing HPE research. I also stress the importance of balancing our research goals by selecting appropriate methodologies for relevant research questions, to ensure that we respect the complexity of social processes inherent in HPE. Examples of relevant research questions are used to illustrate the need to consider alternative methodologies and to rethink the traditional hierarchy of evidence. I also provide an example of the thought processes and decisions that informed the design of an educational research study using a constructivist grounded theory methodology. As audiology enters the scholarly field of HPE, we need to arm ourselves with some of the knowledge and perspective that informs the field. Thus, we need to broaden our conceptions of what we consider to be appropriate styles of academic writing, relevant research questions, and valid evidence. Also, if we are to embark on qualitative inquiry into audiology education (or other audiology topics), we need to ensure that we conduct this research with an adequate understanding of the theories and methodologies informing such approaches. We must strive to conduct high quality, rigorous qualitative research more often than uninformed, generic qualitative research. These goals are imperative to the advancement of the theoretical landscape of audiology education and evolving the place of audiology in the field of HPE.

  8. A report on the medieval mining and ore processing complex: Zilan valley, Van, Turkey.

    Science.gov (United States)

    Ateş, Yusuf; Kılıç, Sinan

    The literature records the use of obsidian, showing that a knowledge base on raw material resources around Lake Van existed from very ancient times. Against this background, very little information can be obtained from the literature about the exact locations of historical mining activities in the region today. An ancient mining and processing complex, located northwest of the city of Van (Turkey), was discovered by chance in 2007. The purpose of this article is to describe this historical mining area. The site contains mining structures such as shafts and galleries, and heaps of stone chips indicating that some ore enrichment activities took place there. The XRD and chemical analyses show that the samples taken from the ore vein are rich in manganese (Mn) and barium (Ba), and it is concluded that the Zilan Valley Mining and Processing Complex was worked for pyrolusite (MnO2), barium, or both. The site is described for the first time in the literature and offers an opportunity to fill the gap in the literature regarding mining history. The discovery and description of the site also have implications for a wide multidisciplinary scientific community, including metallurgy, archeology, and world heritage.

  9. Application of complex inoculants in improving the process-ability of grey cast iron for cylinder blocks

    Directory of Open Access Journals (Sweden)

    LIU Wei-ming

    2006-05-01

    Full Text Available The effects of several complex inoculants on the mechanical properties, process-ability and sensibility of grey cast iron used in cylinder blocks were investigated. The experimental results showed that grey cast iron treated with 60%FeSi75+40%RE complex inoculant consistently attains a tensile strength of about 295 MPa, along with good hardness and improved metallurgical quality, while grey cast iron inoculated with 20%FeSi75+80%Sr compound inoculant has the best process-ability, the lowest cross-section sensibility and the smallest microhardness difference. The amount of drill wear increases correspondingly with the increase of the microhardness difference of the matrix structure, indicating the great effect of the homogeneity of the matrix structure in the grey cast iron on its machinability.

  10. Photonic Architecture for Scalable Quantum Information Processing in Diamond

    Directory of Open Access Journals (Sweden)

    Kae Nemoto

    2014-08-01

    Full Text Available Physics and information are intimately connected, and the ultimate information processing devices will be those that harness the principles of quantum mechanics. Many physical systems have been identified as candidates for quantum information processing, but none of them are immune from errors. The challenge remains to find a path from the experiments of today to a reliable and scalable quantum computer. Here, we develop an architecture based on a simple module comprising an optical cavity containing a single negatively charged nitrogen vacancy center in diamond. Modules are connected by photons propagating in a fiber-optical network and collectively used to generate a topological cluster state, a robust substrate for quantum information processing. In principle, all processes in the architecture can be deterministic, but current limitations lead to processes that are probabilistic but heralded. We find that the architecture enables large-scale quantum information processing with existing technology.

  11. High-Dimensional Quantum Information Processing with Linear Optics

    Science.gov (United States)

    Fitzpatrick, Casey A.

    Quantum information processing (QIP) is an interdisciplinary field concerned with the development of computers and information processing systems that utilize quantum mechanical properties of nature to carry out their function. QIP systems have become vastly more practical since the turn of the century. Today, QIP applications span imaging, cryptographic security, computation, and simulation (quantum systems that mimic other quantum systems). Many important strategies improve quantum versions of classical information system hardware, such as single photon detectors and quantum repeaters. Another more abstract strategy engineers high-dimensional quantum state spaces, so that each successful event carries more information than traditional two-level systems allow. Photonic states in particular bring the added advantages of weak environmental coupling and data transmission near the speed of light, allowing for simpler control and lower system design complexity. In this dissertation, numerous novel, scalable designs for practical high-dimensional linear-optical QIP systems are presented. First, a correlated photon imaging scheme using orbital angular momentum (OAM) states to detect rotational symmetries in objects using measurements, as well as building images out of those interactions is reported. Then, a statistical detection method using chains of OAM superpositions distributed according to the Fibonacci sequence is established and expanded upon. It is shown that the approach gives rise to schemes for sorting, detecting, and generating the recursively defined high-dimensional states on which some quantum cryptographic protocols depend. Finally, an ongoing study based on a generalization of the standard optical multiport for applications in quantum computation and simulation is reported upon. The architecture allows photons to reverse momentum inside the device. This in turn enables realistic implementation of controllable linear-optical scattering vertices for

  12. Information processing theory in the early design stages

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2014-01-01

    suggestions for improvements and support. One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare whether the new knowledge they have gained covers the previous knowledge gap. The new knowledge is then shared within the design team to reduce ambiguity with regard to its meaning and to build a shared understanding, reducing perceived uncertainty. Thus, we propose that Information Processing Theory is suitable to describe designer activity in the early design stages. In engineering design, uncertainty plays a key role, particularly in the early design stages, which has been

  13. Complex diffusion process for noise reduction

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Barari, A.

    2014-01-01

    equations (PDEs) in image restoration and de-noising prompted many researchers to search for an improvement in the technique. In this paper, a new method is presented for signal de-noising, based on PDEs and Schrodinger equations, named as complex diffusion process (CDP). This method assumes that variations...... for signal de-noising. To evaluate the performance of the proposed method, a number of experiments have been performed using Sinusoid, multi-component and FM signals cluttered with noise. The results indicate that the proposed method outperforms the approaches for signal de-noising known in prior art....
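
    As background for the PDE idea, the sketch below de-noises a 1D signal with plain linear diffusion (the heat equation, integrated by explicit finite differences); the paper's CDP replaces this with a complex-valued, Schrodinger-type diffusion, which is not reproduced here.

    import numpy as np

    t = np.linspace(0, 1, 500)
    clean = np.sin(2 * np.pi * 5 * t)
    signal = clean + 0.3 * np.random.default_rng(3).normal(size=500)

    u, dt = signal.copy(), 0.2         # dt <= 0.5 keeps the explicit scheme stable
    for _ in range(50):                # u_t = u_xx, forward Euler in time
        u[1:-1] += dt * (u[2:] - 2 * u[1:-1] + u[:-2])

    print("noise variance before/after:",
          round(np.var(signal - clean), 4), round(np.var(u - clean), 4))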

  14. WORK ALLOCATION IN COMPLEX PRODUCTION PROCESSES: A METHODOLOGY FOR DECISION SUPPORT

    OpenAIRE

    de Mello, Adriana Marotti; School of Economics, Business and Accounting at the University of São Paulo; Marx, Roberto; Polytechnic School, University of São Paulo; Zilbovicius, Mauro; Polytechnic School – University of São Paulo

    2013-01-01

    This article presents the development of a Methodology of Decision Support for Work Allocation in complex production processes. It is known that this decision is frequently taken empirically and that the methodologies available to support it are few and restricted in terms of their conceptual basis. The study of Times and Motion is one of these methodologies, but its applicability is restricted in cases of more complex production processes. The method presented here was developed as a result of...

  15. The minimal work cost of information processing

    Science.gov (United States)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
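
    The headline formula invites a toy calculation. The sketch below uses plain Shannon entropies (the paper's exact statement is in terms of smooth max-entropies) and a made-up joint distribution over discarded and output bits to evaluate the bound W >= k_B T ln 2 * H(discarded | output).

    import numpy as np

    k_B, T = 1.380649e-23, 300.0       # Boltzmann constant (J/K), room temperature

    # Hypothetical joint distribution: rows index the discarded two-bit register,
    # columns index the one-bit computation output.
    p_joint = np.array([[0.30, 0.10],
                        [0.25, 0.05],
                        [0.10, 0.10],
                        [0.05, 0.05]])

    p_out = p_joint.sum(axis=0)
    H_joint = -np.sum(p_joint * np.log2(p_joint))
    H_out = -np.sum(p_out * np.log2(p_out))
    H_cond = H_joint - H_out           # H(discarded | output) in bits
    print(f"minimal work ~ {k_B * T * np.log(2) * H_cond:.3e} J")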

  16. Entropy type complexity of quantum processes

    International Nuclear Information System (INIS)

    Watanabe, Noboru

    2014-01-01

    von Neumann entropy represents the amount of information in the quantum state, and this was extended by Ohya for general quantum systems [10]. Umegaki first defined the quantum relative entropy for σ-finite von Neumann algebras, which was extended by Araki and Uhlmann for general von Neumann algebras and *-algebras, respectively. In 1983 Ohya introduced the quantum mutual entropy by using compound states; this describes the amount of information correctly transmitted through the quantum channel, and it was also extended by Ohya for general quantum systems. In this paper, we briefly explain Ohya's S-mixing entropy and the quantum mutual entropy for general quantum systems. By using structure equivalent classes, we introduce entropy type functionals based on quantum information theory to improve the treatment of the Gaussian communication process. (paper)
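
    The basic quantities discussed, von Neumann entropy and mutual entropy, are easy to evaluate numerically for finite-dimensional states. A small sketch follows, with a placeholder two-qubit state; it illustrates the standard matrix definitions only, not Ohya's general *-algebraic construction.

    import numpy as np

    def vn_entropy(rho):
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]           # ignore numerically zero eigenvalues
        return float(-np.sum(evals * np.log2(evals)))

    # Placeholder state: a Bell state mixed with white noise.
    bell = np.zeros((4, 4))
    bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
    rho_ab = 0.8 * bell + 0.2 * np.eye(4) / 4

    # Partial traces over subsystem B and A (indices: a, b, a', b').
    rho_a = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=1, axis2=3)
    rho_b = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=0, axis2=2)

    mutual = vn_entropy(rho_a) + vn_entropy(rho_b) - vn_entropy(rho_ab)
    print(f"I(A:B) = {mutual:.3f} bits")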

  17. Clues as information, the semiotic gap, and inferential investigative processes, or making a (very small) contribution to the new discipline, Forensic Semiotics

    DEFF Research Database (Denmark)

    Sørensen, Bent; Thellefsen, Torkild Leo; Thellefsen, Martin Muderspach

    2017-01-01

    In this article, we try to contribute to the new discipline Forensic Semiotics – a discipline introduced by the Canadian polymath Marcel Danesi. We focus on clues as information and criminal investigative processes as inferential. These inferential (and Peircean) processes have a certain complexity...

  18. Information processing and routing in wireless sensor networks

    CERN Document Server

    Yu, Yang; Krishnamachari, Bhaskar

    2006-01-01

    This book presents state-of-the-art cross-layer optimization techniques for energy-efficient information processing and routing in wireless sensor networks. Besides providing a survey on this important research area, three specific topics are discussed in detail - information processing in a collocated cluster, information transport over a tree substrate, and information routing for computationally intensive applications. The book covers several important system knobs for cross-layer optimization, including voltage scaling, rate adaptation, and tunable compression. By exploring tradeoffs of en

  19. Process Description and Operating History for the CPP-601/-640/-627 Fuel Reprocessing Complex at the Idaho National Engineering and Environmental Laboratory

    International Nuclear Information System (INIS)

    Wagner, E.P.

    1999-01-01

    The Fuel Reprocessing Complex (FRC) at the Idaho Nuclear Technology and Engineering Center at the Idaho National Engineering and Environmental Laboratory was used for reprocessing spent nuclear fuel from the early 1950's until 1992. The reprocessing facilities are now scheduled to be deactivated. As part of the deactivation process, three Resource Conservation and Recovery Act (RCRA) interim status units located in the complex must be closed. This document gathers the historical information necessary to provide a rational basis for the preparation of a comprehensive closure plan. Included are descriptions of process operations and the operating history of the FRC. A set of detailed tables record the service history and present status of the process vessels and transfer lines

  20. Knowledge acquisition process as an issue in information sciences

    Directory of Open Access Journals (Sweden)

    Boris Bosančić

    2016-07-01

    Full Text Available The paper presents an overview of some problems of information science which are explicitly portrayed in the literature. It covers the following issues: information explosion, information flood and data deluge; information retrieval and the relevance of information; and, finally, the problem of scientific communication. The purpose of this paper is to explain why knowledge acquisition can be considered an issue in the information sciences. The existing theoretical foundation within the information sciences, i.e. the DIKW hierarchy and its key concepts - data, information, knowledge and wisdom - is recognized as a symbolic representation as well as the theoretical foundation of the knowledge acquisition process. Moreover, it seems that the relationship between the DIKW hierarchy and the knowledge acquisition process is essential for a stronger foundation of the information sciences in the 'body' of overall human knowledge. In addition, the history of both human and machine knowledge acquisition is considered, along with a proposal that the DIKW hierarchy serve as a symbol of the general knowledge acquisition process, which could relate equally to human and machine knowledge acquisition. To achieve this goal, it is necessary to modify the existing concept of the DIKW hierarchy. An appropriate modification of the DIKW hierarchy (one of which is presented in this paper) could result in a much more solid theoretical foundation of the knowledge acquisition process and of the information sciences as a whole. The theoretical assumptions on which the knowledge acquisition process may be established as a problem of information science are presented at the end of the paper. The knowledge acquisition process does not necessarily have to be the subject of epistemology. It may establish a stronger link between the concepts of data and knowledge; furthermore, it can be used in the context of scientific research, but on a more primitive level than conducting

  1. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and either action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments mediated by action processes such as

  2. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Directory of Open Access Journals (Sweden)

    Vera Hagemann

    2017-09-01

    Full Text Available Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and either action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments mediated by action processes

  3. Ethnographic methods for process evaluations of complex health behaviour interventions.

    Science.gov (United States)

    Morgan-Trimmer, Sarah; Wood, Fiona

    2016-05-04

    This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.

  4. An empirical test of social information processing theory and emotions in violent situations

    Directory of Open Access Journals (Sweden)

    Kendra N. Bowen

    2017-03-01

    Full Text Available Objective to study the decisionmaking process in highriskforviolence situations. Methods formallegal sociological method of hierarchical generalized linear modeling. Results criminological research has favored the rational choice perspective in studying offender decision making. However this theoretical approach does not take into account the complex interplay of situational cognitive emotional and person factors that likely influence criminal decision making. To that end the current study examines decision making in highriskforviolence situations focusing on social information processing and emotional state variables. The current study utilizes a sample of 236 newly incarcerated jailed inmates who provide personal level data and situational reports of violent and avoided violence situations n 466. Scientific novelty the findings for the first time show that several situational social information processing and emotion variables such as intent interpretation goal and response generation are significant predictors of the escalation of violence hence increasing the probability of committing a crime. Practical significance the main provisions and conclusions of the article can be used in scientific and lawenforcement activities when considering the issues of identifying and eliminating the reasons and conditions of crime committing as well as with influencing the individuals in order to prevent crimes or antisocial behavior.

  5. A Mashup Application to Support Complex Decision Making for Retail Consumers

    OpenAIRE

    Steven Walczak; Deborah L. Kellogg; Dawn G. Gregg

    2010-01-01

    Purchase processes often require complex decision making and consumers frequently use Web information sources to support these decisions. However, increasing amounts of information can make finding appropriate information problematic. This information overload, coupled with decision complexity, can increase time required to make a decision and reduce decision quality. This creates a need for tools that support these decision-making processes. Online tools that bring together data and partial ...

  6. Classicality of quantum information processing

    International Nuclear Information System (INIS)

    Poulin, David

    2002-01-01

    The ultimate goal of the classicality program is to quantify the amount of quantumness of certain processes. Here, classicality is studied for a restricted type of process: quantum information processing (QIP). Under special conditions, one can force some qubits of a quantum computer into a classical state without affecting the outcome of the computation. The minimal set of conditions is described and its structure is studied. Some implications of this formalism are the increase of noise robustness, a proof of the quantumness of mixed state quantum computing, and a step forward in understanding the very foundation of QIP

  7. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    International Nuclear Information System (INIS)

    Lan, Ganhui; Tu, Yuhai

    2016-01-01

    Living systems have to constantly sense their external environment and adjust their internal state in order to survive and reproduce. Biological systems, from one as complex as the brain to a single E. coli cell, have to process these data in order to make appropriate decisions. How do biological systems sense external signals? How do they process the information? How do they respond to signals? Through years of intense study by biologists, many key molecular players and their interactions have been identified in the different biological machineries that carry out these signaling functions. However, an integrated, quantitative understanding of the whole system is still lacking for most cellular signaling pathways, not to mention the more complicated neural circuits. To study signaling processes in biology, the key thing to measure is the input-output relationship. The input is the signal itself, such as chemical concentration, external temperature, light (intensity and frequency), and more complex signals such as the face of a cat. The output can be protein conformational changes and covalent modifications (phosphorylation, methylation, etc.), gene expression, cell growth and motility, as well as more complex outputs such as neuron firing patterns and the behaviors of higher animals. Due to the inherent noise in biological systems, the measured input-output dependence is often noisy. These noisy data can be analysed by using powerful tools and concepts from information theory such as mutual information, channel capacity, and the maximum entropy hypothesis. This information theory approach has been successfully used to reveal the underlying correlations between key components of biological networks, to set bounds for network performance, and to understand possible network architecture in generating observed correlations. Although the information theory approach provides a general tool in analysing noisy biological data and may be used to suggest possible network architectures in
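
    As an illustration of the information-theoretic toolkit the review describes, the sketch below estimates the mutual information of a noisy input-output relationship with a simple histogram (plug-in) estimator. The Hill-type dose-response curve, noise level, and bin count are assumptions chosen for the example, not values from any real pathway.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic dose-response: Hill-type activation with output noise
    s = rng.uniform(0.0, 10.0, 50_000)              # input (e.g., ligand concentration)
    mean_out = s**2 / (s**2 + 4.0**2)               # Hill function, n = 2, K = 4
    x = mean_out + rng.normal(0.0, 0.1, s.size)     # noisy output (e.g., kinase activity)

    def mutual_information(a, b, bins=32):
        """Histogram (plug-in) estimate of I(A;B) in bits."""
        joint, _, _ = np.histogram2d(a, b, bins=bins)
        p_ab = joint / joint.sum()
        p_a = p_ab.sum(axis=1, keepdims=True)       # marginal of A
        p_b = p_ab.sum(axis=0, keepdims=True)       # marginal of B
        nz = p_ab > 0                               # avoid log(0) on empty cells
        return float((p_ab[nz] * np.log2(p_ab[nz] / (p_a @ p_b)[nz])).sum())

    print(f"I(signal; response) ~= {mutual_information(s, x):.2f} bits")
    ```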

  10. Integrating technology into complex intervention trial processes: a case study.

    Science.gov (United States)

    Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica

    2016-11-17

    Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. A combination of commercially available hardware and a bespoke online database

  11. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR) and Information Entropy

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2015-04-01

    Full Text Available Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become the key to project management. This paper first analyzes the current state of research on software complexity systematically and points out problems in existing work. It then proposes a WSR framework of software complexity, which divides the complexity of software into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may have a better understanding of complexity. People are the main source of complexity, but current research focuses on WL complexity, and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors composing RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of the personnel organization hierarchy and of personnel communication information, and validates the soundness and rationality of this measurement method through a large number of cases.
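
    The entropy-based measurement itself is easy to demonstrate. The sketch below computes the Shannon entropy of how communication is distributed across channels between roles; the roles, message counts, and the reading of higher entropy as higher Renli complexity are illustrative assumptions in the spirit of the paper, not its actual formulas or data.

    ```python
    import numpy as np

    # Hypothetical message counts between project roles over one iteration
    # (illustrative data, not from the paper)
    channels = {
        ("dev", "qa"): 120, ("dev", "pm"): 45, ("qa", "pm"): 30,
        ("dev", "ops"): 15, ("pm", "ops"): 10,
    }

    counts = np.array(list(channels.values()), dtype=float)
    p = counts / counts.sum()                 # empirical channel-usage distribution
    H = -(p * np.log2(p)).sum()               # Shannon entropy in bits
    H_max = np.log2(len(p))                   # entropy of perfectly uniform usage
    print(f"communication entropy: {H:.2f} bits (max {H_max:.2f})")
    # Higher H (closer to H_max) means communication is spread evenly over
    # many channels, i.e., higher personnel (RL) complexity under this measure.
    ```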

  12. Career information processing strategies of secondary school ...

    African Journals Online (AJOL)

    This study examined the strategies commonly adopted by Osun state secondary school students in processing career information. It specifically examined the sources of career information available to the students, the uses to which the students put the information collected and how their career decision making skills can be ...

  13. Impact of background noise and sentence complexity on cognitive processing demands

    DEFF Research Database (Denmark)

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2015-01-01

    Speech comprehension in adverse listening conditions imposes cognitive processing demands. Processing demands can increase with acoustically degraded speech but also depend on linguistic aspects of the speech signal, such as syntactic complexity. In the present study, pupil dilations were recorded in 19 normal-hearing participants while processing sentences that were either syntactically simple or complex and presented in either high- or low-level background noise. Furthermore, the participants were asked to rate the subjectively perceived difficulty of sentence comprehension. The results showed...

  14. Impact of background noise and sentence complexity on cognitive processing effort

    DEFF Research Database (Denmark)

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2015-01-01

    Speech comprehension in adverse listening conditions imposes cognitive processing demands. Processing demands can increase with acoustically degraded speech but also depend on linguistic aspects of the speech signal, such as syntactic complexity. In the present study, pupil dilations were recorded in 19 normal-hearing participants while processing sentences that were either syntactically simple or complex and presented in either high- or low-level background noise. Furthermore, the participants were asked to rate the subjectively perceived difficulty of sentence comprehension. The results...

  15. FACET: A simulation software framework for modeling complex societal processes and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  16. Attachment in Middle Childhood: Associations with Information Processing

    Science.gov (United States)

    Zimmermann, Peter; Iwanski, Alexandra

    2015-01-01

    Attachment theory suggests that internal working models of self and significant others influence adjustment during development by controlling information processing and self-regulation. We provide a conceptual overview on possible mechanisms linking attachment and information processing and review the current literature in middle childhood.…

  17. MATHEMATICAL MODELLING OF SELECTING INFORMATIVE FEATURES FOR ANALYZING THE LIFE CYCLE PROCESSES OF RADIO-ELECTRONIC MEANS

    Directory of Open Access Journals (Sweden)

    Николай Григорьевич Стародубцев

    2017-09-01

    Full Text Available The subject of the study is methods and models for extracting information about the life cycle processes of radio-electronic means at the design, production and operation stages. The goal is to develop the fundamentals of a theory of holistic monitoring of the life cycle of radio-electronic means at the stages of their design, production and operation, in particular to develop information models for monitoring life cycle indicators in the production of radio-electronic means. This goal is attained by solving the following problems: developing a methodology for selecting informative features that characterize the state of the life cycle of radio-electronic means; choosing informative features that characterize the state of the life cycle processes of radio-electronic means; and identifying the state of the life cycle processes of radio-electronic means. To solve these problems, general scientific methods were used: the main provisions of functional analysis, nonequilibrium thermodynamics, estimation and prediction of random processes, optimization methods, and pattern recognition. The following results were obtained. Methods were developed for selecting informative features for monitoring the life cycle of radio-electronic means, by classifying the states of radio-electronic means and their life cycle processes in a feature space in which each feature carries a certain significance; this allowed a complex criterion to be found and the selection procedures to be formalized. When a priori data are insufficient for correct classification, heuristic selection methods are proposed based on criteria for the use of basic prototypes and information priorities. Conclusions. The solution of the problem of mathematically modeling the efficiency functions of the life cycle processes of radio-electronic means and the choice of informative features for

  18. Oxidation mechanism of diethyl ether: a complex process for a simple molecule.

    Science.gov (United States)

    Di Tommaso, Stefania; Rotureau, Patricia; Crescenzi, Orlando; Adamo, Carlo

    2011-08-28

    A large number of organic compounds, such as ethers, spontaneously form unstable peroxides through a self-propagating process of autoxidation (peroxidation). Although the hazards of organic peroxides are well known, the oxidation mechanisms reported in the literature for peroxidizable compounds like ethers are vague and often based on old experiments carried out under very different conditions (e.g. atmospheric, combustion). With the aim of (partially) filling this gap, in this paper we present an extensive Density Functional Theory (DFT) study of the autoxidation reaction of diethyl ether (DEE), a chemical that is widely used as a solvent in laboratories and that is considered responsible for various accidents. The aim of the work is to investigate the most probable reaction paths involved in the autoxidation process and to identify all potentially hazardous intermediates, such as peroxides. Beyond the determination of a complex oxidation mechanism for such a simple molecule, our results suggest that the two main reaction channels open in solution are the direct decomposition (β-scission) of the DEE radical formed in the initiation step and the isomerization of the peroxy radical formed upon oxygen attack (DEEOO˙). A simple kinetic evaluation of these two competing reaction channels suggests that radical isomerization may play an unexpectedly important role in the global DEE oxidation process. Finally, industrial hazards could be related to hydroperoxide formation and accumulation during the chain-propagation step. The resulting information may contribute to the understanding of the accidental risks associated with the use of diethyl ether.

  19. Process system of radiometric and magnetometric aerial information

    International Nuclear Information System (INIS)

    Bazua Rueda, L.F.

    1985-01-01

    The author worked, first at the National Institute of Nuclear Energy (Mexico) and then at URAMEX (Uranio Mexicano), from 1975 to 1983, attached to radiometric and magnetometric aerial prospecting projects on the computerized information processing side. During this period the author participated in the development of computing systems, in information processing, and in the definition of mathematical procedures for the geophysical reduction of calibration-equipment data. Drawing on this accumulated experience, this thesis presents aspects of the management and operation of computerized information-processing systems. Operation handbooks for the majority of the modules are presented. Program listings are not included. (Author)

  20. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the user's prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.
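
    The construction is concrete enough to sketch numerically. Assuming the independence-in-time base model, unit marginal variance, and n observations, the Kullback-Leibler divergence of a stationary AR(1) with coefficient rho from white noise reduces to -(n-1)/2 * ln(1 - rho^2), and the PC prior follows from an exponential distribution on the distance scale d = sqrt(2*KLD). The rate lambda and n below are illustrative choices, and the code is our sketch of the principle, not the authors' implementation.

    ```python
    import numpy as np

    # PC prior for the AR(1) coefficient rho, base model rho = 0 (independence),
    # unit marginal variance, n observations. Illustrative sketch only.
    n, lam = 100, 1.0

    rho = np.linspace(-0.999, 0.999, 4001)
    kld = -0.5 * (n - 1) * np.log1p(-rho**2)     # KLD(AR(1) || white noise)
    d = np.sqrt(2.0 * kld)                       # distance d(rho) = sqrt(2*KLD)
    jac = np.abs(np.gradient(d, rho))            # |dd/drho| by finite differences

    prior = 0.5 * lam * np.exp(-lam * d) * jac   # two-sided exponential on d
    dr = rho[1] - rho[0]
    prior /= prior.sum() * dr                    # renormalise numerically

    mass = prior[np.abs(rho) < 0.1].sum() * dr   # mass concentrated at the base model
    print(f"prior mass in |rho| < 0.1: {mass:.3f}")
    ```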

  2. Setting a new paradigm in cognitive science information: contributions to the process of knowing the information professional

    Directory of Open Access Journals (Sweden)

    Paula Regina Dal' Evedove

    2013-05-01

    Full Text Available Introduction: Studies of human cognition represent a relevant perspective in information science, considering the subjective actions of information professionals and the dialogic process that should permeate the activity of those dealing with the organization and representation of information. Objective: To explore the cognitive perspective in information science and its reshaping by contemporary information needs, in order to reflect on the knowing process of the information professional within the social reality that permeates information contexts. Methodology: Reflection on theoretical aspects of cognitive development, to discuss the implications of the cognitive approach in information science and its evolution in the scope of the representation and processing of information. Results: Research in information science must consider issues of a cognitive and social order that underlie information processing, and the knowing process of the information professional, since knowledge structures must be explained from the social context of knowing subjects. Conclusions: There is a need to investigate the knowing process of the information professional through a socio-cognitive approach, seeking new elements for understanding the relationship with information (cognitive manifestations) and its implications in the social dimension.

  3. Using life cycle information in process discovery

    NARCIS (Netherlands)

    Leemans, S.J.J.; Fahland, D.; Van Der Aalst, W.M.P.; Reichert, M.; Reijers, H.A.

    2016-01-01

    Understanding the performance of business processes is an important part of any business process intelligence project. From historical information recorded in event logs, performance can be measured and visualized on a discovered process model. Thereby the accuracy of the measured performance, e.g.,

  4. Influence Business Process On The Quality Of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Meiryani

    2015-01-01

    Full Text Available The purpose of this study was to determine the influence of business processes on the quality of the accounting information system. The study was theoretical research that considered the role of business processes in the quality of accounting information systems, using secondary data collection. The results showed that business processes have a significant effect on the quality of accounting information systems.

  5. Efficient Simulation Modeling of an Integrated High-Level-Waste Processing Complex

    International Nuclear Information System (INIS)

    Gregory, Michael V.; Paul, Pran K.

    2000-01-01

    An integrated computational tool named the Production Planning Model (ProdMod) has been developed to simulate the operation of the entire high-level-waste complex (HLW) at the Savannah River Site (SRS) over its full life cycle. ProdMod is used to guide SRS management in operating the waste complex in an economically efficient and environmentally sound manner. SRS HLW operations are modeled using coupled algebraic equations. The dynamic nature of plant processes is modeled in the form of a linear construct in which the time dependence is implicit. Batch processes are modeled in discrete event-space, while continuous processes are modeled in time-space. The ProdMod methodology maps between event-space and time-space such that the inherent mathematical discontinuities in batch process simulation are avoided without sacrificing any of the necessary detail in the batch recipe steps. Modeling the processes separately in event- and time-space using linear constructs, and then coupling the two spaces, has accelerated the speed of simulation compared to a typical dynamic simulation. The ProdMod simulator models have been validated against operating data and other computer codes. Case studies have demonstrated the usefulness of the ProdMod simulator in developing strategies that demonstrate significant cost savings in operating the SRS HLW complex and in verifying the feasibility of newly proposed processes
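
    The event-space/time-space coupling that ProdMod uses can be illustrated in a few lines. The sketch below advances a continuous accumulation in closed form between discrete batch events, in the spirit of the linear constructs described above; the tank, rates, and batch recipe are invented for illustration and are not actual SRS parameters.

    ```python
    import heapq

    # Minimal sketch of coupling event-space (batch steps) with time-space
    # (continuous accumulation). All values are illustrative assumptions.
    FEED_RATE = 2.0          # m3/day of waste feed into a buffer tank (continuous)
    BATCH_SIZE = 10.0        # m3 withdrawn per batch
    BATCH_DURATION = 3.0     # days per batch recipe cycle

    level, t, events = 0.0, 0.0, []
    heapq.heappush(events, (BATCH_SIZE / FEED_RATE, "start_batch"))

    while t < 30.0 and events:
        t_next, kind = heapq.heappop(events)
        level += FEED_RATE * (t_next - t)        # continuous update, closed form
        t = t_next
        if kind == "start_batch":
            level -= BATCH_SIZE                  # batch withdrawal (event-space)
            heapq.heappush(events, (t + BATCH_DURATION, "end_batch"))
        else:                                    # end of batch: schedule next start
            wait = max(0.0, (BATCH_SIZE - level) / FEED_RATE)
            heapq.heappush(events, (t + wait, "start_batch"))
        print(f"day {t:5.1f}  {kind:11s}  tank level {level:5.1f} m3")
    ```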

  6. Tracing the development of complex problems and the methods of its information support

    International Nuclear Information System (INIS)

    Belenki, A.; Ryjov, A.

    1999-01-01

    This article is dedicated to the development of a technology for the information monitoring of complex problems such as IAEA safeguards tasks. The main purpose of this technology is to create human-machine systems for monitoring problems with complex subject areas such as political science, social science, business, ecology, etc. (author)

  7. Informational Entropy and Bridge Scour Estimation under Complex Hydraulic Scenarios

    Science.gov (United States)

    Pizarro, Alonso; Link, Oscar; Fiorentino, Mauro; Samela, Caterina; Manfreda, Salvatore

    2017-04-01

    Bridges are important for society because they allow social, cultural and economic connectivity. Flood events can compromise the safety of bridge piers, up to the point of complete collapse. The bridge scour phenomenon has been described by empirical formulae deduced from hydraulic laboratory experiments. The range of applicability of such models is restricted by the specific hydraulic conditions or flume geometry used for their derivation (e.g., water depth, mean flow velocity, pier diameter and sediment properties). We seek to identify a general formulation able to capture the main dynamics of the process over a wide range of hydraulic and geometric configurations, allowing the analysis to be extended to different contexts. Therefore, exploiting the Principle of Maximum Entropy (POME) and applying it to the recently proposed dimensionless effective flow work, W*, we derived a simple model characterized by only one parameter. The proposed Bridge Scour Entropic (BRISENT) model shows good performance under complex hydraulic conditions as well as under steady-state flow, and it was able to capture the evolution of scour in several hydraulic configurations even though it contains only one parameter. Furthermore, the results show that the model parameter is controlled by the geometric configuration of the experiment, which offers a possible strategy for a priori calibration of the model parameter. The BRISENT model is thus a good candidate for estimating time-dependent scour depth under complex hydraulic scenarios, and the authors aim to apply the idea to describe scour behavior during real flood events. Keywords: Informational entropy, Sediment transport, Bridge pier scour, Effective flow work.

  8. Informing Hospital Change Processes through Visualization and Simulation: A Case Study at a Children's Emergency Clinic.

    Science.gov (United States)

    Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Johansson, Gerd

    2014-01-01

    To demonstrate the use of visualization and simulation tools in order to involve stakeholders and inform the process in hospital change processes, illustrated by an empirical study from a children's emergency clinic. Reorganization and redevelopment of a hospital is a complex activity that involves many stakeholders and demands. Visualization and simulation tools have proven useful for involving practitioners and eliciting relevant knowledge. More knowledge is desired about how these tools can be implemented in practice for hospital planning processes. A participatory planning process including practitioners and researchers was executed over a 3-year period to evaluate a combination of visualization and simulation tools to involve stakeholders in the planning process and to elicit knowledge about needs and requirements. The initial clinic proposal from the architect was discarded as a result of the empirical study. Much general knowledge about the needs of the organization was extracted by means of the adopted tools. Some of the tools proved to be more accessible than others for the practitioners participating in the study. The combination of tools added value to the process by presenting information in alternative ways and eliciting questions from different angles. Visualization and simulation tools inform a planning process (or other types of change processes) by providing the means to see beyond present demands and current work structures. Long-term involvement in combination with accessible tools is central for creating a participatory setting where the practitioners' knowledge guides the process. © 2014 Vendome Group, LLC.

  9. Advanced information processing system

    Science.gov (United States)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  10. When complex is easy on the mind: internal repetition of visual information in complex objects is a source of perceptual fluency

    NARCIS (Netherlands)

    Linda Steg; Roos Pals; Ayça Berfu Ünal; Yannick Joye

    2015-01-01

    Across 3 studies, we investigated whether visual complexity deriving from internally repeating visual information over many scale levels is a source of perceptual fluency. Such continuous repetition of visual information is formalized in fractal geometry and is a key-property of natural structures.

  11. The information exchange between moduluses in the system of module programming of the computation complexes

    International Nuclear Information System (INIS)

    Zinin, A.I.; Kolesov, V.E.; Nevinitsa, A.I.

    1975-01-01

    The report contains a description of a method for constructing complexes of computer programs for computational purposes on M-220 computers, using the ALGOL-60 language for programming. The complex is organized on the modular principle and can include a substantial number of program modules. Information exchange between separate modules is done by means of a special interpreting program, and the unit of information exchanged is a specially arranged file of data. For addressing the interpreting program within the ALGOL-60 framework, a small number of specially created procedure codes is used. The proposed method makes it possible to program the separate modules of the complex independently and to expand the complex if necessary. In this case, separate modules or groups of modules, depending on how the general problem solved by the complex is segmented, will be of independent interest and can be used outside the complex as traditional programs. (author)

  12. Understanding the Information Research Process of Experienced Online Information Researchers to Inform Development of a Scholars Portal

    Directory of Open Access Journals (Sweden)

    Martha Whitehead

    2009-06-01

    Full Text Available Objective - The main purpose of this study was to understand the information research process of experienced online information researchers in a variety of disciplines, gather their ideas for improvement and as part of this to validate a proposed research framework for use in future development of Ontario’s Scholars Portal.Methods - This was a qualitative research study in which sixty experienced online information researchers participated in face-to-face workshops that included a collaborative design component. The sessions were conducted and recorded by usability specialists who subsequently analyzed the data and identified patterns and themes.Results - Key themes included the similarities of the information research process across all disciplines, the impact of interdisciplinarity, the social aspect of research and opportunities for process improvement. There were many specific observations regarding current and ideal processes. Implications for portal development and further research included: supporting a common process while accommodating user-defined differences; supporting citation chaining practices with new opportunities for data linkage and granularity; enhancing keyword searching with various types of intervention; exploring trusted social networks; exploring new mental models for data manipulation while retaining traditional objects; improving citation and document management. Conclusion – The majority of researchers in the study had almost no routine in their information research processes, had developed few techniques to assist themselves and had very little awareness of the tools available to help them. There are many opportunities to aid researchers in the research process that can be explored when developing scholarly research portals. That development will be well guided by the framework ‘discover, gather, synthesize, create, share.’

  13. Information processing in the vertebrate habenula.

    Science.gov (United States)

    Fore, Stephanie; Palumbo, Fabrizio; Pelgrims, Robbrecht; Yaksi, Emre

    2018-06-01

    The habenula is a brain region that has gained increasing popularity over the recent years due to its role in processing value-related and experience-dependent information with a strong link to depression, addiction, sleep and social interactions. This small diencephalic nucleus is proposed to act as a multimodal hub or a switchboard, where inputs from different brain regions converge. These diverse inputs to the habenula carry information about the sensory world and the animal's internal state, such as reward expectation or mood. However, it is not clear how these diverse habenular inputs interact with each other and how such interactions contribute to the function of habenular circuits in regulating behavioral responses in various tasks and contexts. In this review, we aim to discuss how information processing in habenular circuits, can contribute to specific behavioral programs that are attributed to the habenula. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Complex plasmochemical processing of solid fuel

    Directory of Open Access Journals (Sweden)

    Vladimir Messerle

    2012-12-01

    Full Text Available A technology for the complex plasmochemical processing of solid fuel, namely Ekibastuz bituminous and Turgay brown coals, is presented. A thermodynamic and experimental study of the technology was carried out. The technology allows the production of synthesis gas from the organic mass of coal, and of valuable components (technical silicon, ferrosilicon, aluminum and silicon carbide) and trace amounts of rare metals (uranium, molybdenum, vanadium, etc.) from the mineral mass of coal. The high-calorific synthesis gas produced can be used for methanol synthesis, as a high-grade reducing gas instead of coke, and as an energy gas in thermal power plants.

  15. A Process Model for Goal-Based Information Retrieval

    Directory of Open Access Journals (Sweden)

    Harvey Hyman

    2014-12-01

    Full Text Available In this paper we examine the domain of information search and propose a "goal-based" approach to studying search strategy. We describe "goal-based information search" using a framework of Knowledge Discovery. We identify two Information Retrieval (IR) goals using the constructs of Knowledge Acquisition (KA) and Knowledge Explanation (KE). We classify these constructs into two specific information problems: an exploration-exploitation problem and an implicit-explicit problem. Our proposed framework is an extension of prior work in this domain, applying an IR process model originally developed for Legal-IR and adapted to Medical-IR. The approach in this paper is guided by the recent ACM-SIG Medical Information Retrieval (MedIR) Workshop definition: "methodologies and technologies that seek to improve access to medical information archives via a process of information retrieval."
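
    The exploration-exploitation framing has a standard algorithmic illustration in the multi-armed bandit. The sketch below uses epsilon-greedy selection over information sources; mapping exploration to KA and exploitation to KE is our gloss on the framework, and the source names and relevance probabilities are entirely synthetic.

    ```python
    import random

    # Epsilon-greedy bandit as a toy model of the exploration-exploitation
    # trade-off in goal-based search: explore new sources (KA) vs. exploit
    # the best-known source (KE). Reward probabilities are invented.
    random.seed(1)
    true_relevance = {"pubmed": 0.6, "trials_db": 0.4, "grey_lit": 0.2}  # hypothetical
    counts = {s: 0 for s in true_relevance}
    value = {s: 0.0 for s in true_relevance}
    epsilon = 0.1

    for step in range(2000):
        if random.random() < epsilon:                  # explore (KA)
            source = random.choice(list(true_relevance))
        else:                                          # exploit (KE)
            source = max(value, key=value.get)
        reward = 1.0 if random.random() < true_relevance[source] else 0.0
        counts[source] += 1
        value[source] += (reward - value[source]) / counts[source]  # running mean

    print({s: round(v, 2) for s, v in value.items()}, counts)
    ```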

  16. The Readability and Complexity of District-Provided School-Choice Information

    Science.gov (United States)

    Stein, Marc L.; Nagro, Sarah

    2015-01-01

    Public school choice has become a common feature in American school districts. Any potential benefits that could be derived from these policies depend heavily on the ability of parents and students to make informed and educated decisions about their school options. We examined the readability and complexity of school-choice guides across a sample…

  17. About the role of stylistic and syntactic devices of expansion in the informational complex of dicteme of a German advertising text

    Directory of Open Access Journals (Sweden)

    Артур Нарманович Мамедов

    2012-12-01

    Full Text Available The article highlights stylistic and syntactic devices of expansion, which act as compositional means: they vary the normative syntactic structure of an advertising text and contribute to sense formation, creating the conditions for realizing the advertiser's intent. By means of these language elements, which express an invariant tactical sense, the advertiser consciously expands and/or complicates the informational complex of the dicteme, the operative text unit, transmitting impressive information over and above the factual. The combination of factual and impressive items of information activates both the rational and the emotional perceptual channels of the prospective consumer and intensifies the positioning of the advertised article.

  18. The use of information theory in evolutionary biology.

    Science.gov (United States)

    Adami, Christoph

    2012-05-01

    Information is a key concept in evolutionary biology. Information stored in a biological organism's genome is used to generate the organism and to maintain and control it. Information is also that which evolves. When a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive in complex environments, and the capacity for information processing also evolves. Here, I review applications of information theory to the evolution of proteins and to the evolution of information processing in simulated agents that adapt to perform a complex task. © 2012 New York Academy of Sciences.

  19. Quantum information processing with trapped ions

    International Nuclear Information System (INIS)

    Haeffner, H.; Haensel, W.; Rapol, U.; Koerber, T.; Benhelm, J.; Riebe, M.; Chek-al-Kar, D.; Schmidt-Kaler, F.; Becher, C.; Roos, C.; Blatt, R.

    2005-01-01

    Single Ca+ ions and crystals of Ca+ ions are confined in a linear Paul trap and are investigated for quantum information processing. Here we report on recent experimental advances towards a quantum computer with such a system. Laser-cooled trapped ions are ideally suited systems for the investigation and implementation of quantum information processing, as one can gain almost complete control over their internal and external degrees of freedom. The combination of a Paul-type ion trap with laser cooling leads to unique properties of trapped cold ions, such as control of the motional state down to the zero point of the trapping potential, a high degree of isolation from the environment, and thus a very long time available for manipulations and interactions at the quantum level. The very same properties make single trapped atoms and ions well suited for storing quantum information in long-lived internal states, e.g. by encoding a quantum bit (qubit) of information within the coherent superposition of the S1/2 ground state and the metastable D5/2 excited state of Ca+. Recently we have achieved the implementation of simple algorithms with up to 3 qubits on an ion-trap quantum computer. We report on methods to implement single-qubit rotations, the realization of a two-qubit universal quantum gate (the Cirac-Zoller CNOT gate), the deterministic generation of multi-particle entangled states (GHZ and W states), their full tomographic reconstruction, the realization of deterministic quantum teleportation, its quantum process tomography, and the encoding of quantum information in decoherence-free subspaces with coherence times exceeding 20 seconds. (author)
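
    The elementary gate operations mentioned above (a single-qubit rotation followed by a CNOT, producing an entangled state) can be written down in a few lines of linear algebra. The sketch below is generic textbook state-vector arithmetic, not the ion-trap pulse sequences themselves; identifying |0> and |1> with the S1/2 and D5/2 levels is just labeling.

    ```python
    import numpy as np

    # Qubit basis: |0> = ground S-state, |1> = metastable D-state (labels only)
    ket0 = np.array([1, 0], dtype=complex)

    # Hadamard rotation and the CNOT two-qubit gate (control = first qubit)
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    # Prepare |00>, rotate the control qubit, then entangle with CNOT
    state = np.kron(ket0, ket0)
    state = np.kron(H, np.eye(2)) @ state
    state = CNOT @ state

    print(np.round(state, 3))   # (|00> + |11>)/sqrt(2): a Bell state
    ```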

  20. Can complex cellular processes be governed by simple linear rules?

    Science.gov (United States)

    Selvarajoo, Kumar; Tomita, Masaru; Tsuchiya, Masa

    2009-02-01

    Complex living systems have shown remarkably well-orchestrated, self-organized, robust, and stable behavior under a wide range of perturbations. However, despite the recent generation of high-throughput experimental datasets, basic cellular processes such as division, differentiation, and apoptosis still remain elusive. One of the key reasons is the lack of understanding of the governing principles of complex living systems. Here, we review the success of perturbation-response approaches in which, without requiring detailed in vivo physiological parameters, the analysis of temporal concentration or activation responses unravels biological network features such as causal relationships among reactant species, regulatory motifs, etc. Our review shows that simple linear rules govern the response behavior of biological networks in an ensemble of cells. It is puzzling that such simplicity holds in a complex heterogeneous environment. If the physical reasons for these phenomena can be explained, major advances in the understanding of basic cellular processes could be achieved.

  1. The Effects of Syntactic Complexity on Processing Sentences in Noise

    Science.gov (United States)

    Carroll, Rebecca; Ruigendijk, Esther

    2013-01-01

    This paper discusses the influence of stationary (non-fluctuating) noise on processing and understanding of sentences, which vary in their syntactic complexity (with the factors canonicity, embedding, ambiguity). It presents data from two RT-studies with 44 participants testing processing of German sentences in silence and in noise. Results show a…

  2. Valid knowledge for the professional design of large and complex design processes

    NARCIS (Netherlands)

    Aken, van J.E.

    2004-01-01

    The organization and planning of design processes, which we may regard as design process design, is an important issue. Especially for large and complex design processes, traditional approaches to process design may no longer suffice. The design literature offers quite a few design process models. As

  3. Horizontal information drives the behavioural signatures of face processing

    Directory of Open Access Journals (Sweden)

    Valerie Goffaux

    2010-09-01

    Full Text Available Recent psychophysical evidence indicates that the vertical arrangement of horizontal information is particularly important for encoding facial identity. In this paper we extend this notion to examine the role that information at different (particularly cardinal) orientations might play in a number of established phenomena, each a behavioural "signature" of face processing. In particular we consider (a) the face inversion effect (FIE), (b) the facial identity after-effect, (c) face-matching across viewpoint, and (d) interactive, so-called holistic, processing of face parts. We report that filtering faces to remove all but the horizontal information largely preserves these effects, but that, conversely, retaining only the vertical information generally diminishes or abolishes them. We conclude that preferential processing of horizontal information is a central feature of human face processing that supports many of the behavioural signatures of this critical visual operation.

  5. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-07-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
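
    The capacity measure is straightforward to reproduce in miniature. In the sketch below, a small echo-state-style network is driven by an i.i.d. input, and the capacity to reconstruct delayed copies of that input is scored as 1 - NMSE of the optimal linear readout, in the spirit of the paper's definition; the network size, spectral radius, and choice of delay-line targets are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Tiny echo-state-style reservoir driven by i.i.d. input u(t) in [-1, 1]
    N, T, washout = 50, 20_000, 200
    W = rng.normal(0, 1, (N, N))
    W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.8 (fading memory)
    w_in = rng.uniform(-1, 1, N)
    u = rng.uniform(-1, 1, T)

    x = np.zeros((T, N))
    for t in range(1, T):
        x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])

    def capacity(target):
        """Capacity for one target: 1 - NMSE of the best linear readout."""
        X, z = x[washout:], target[washout:]
        w, *_ = np.linalg.lstsq(X, z, rcond=None)
        resid = z - X @ w
        return 1.0 - resid.var() / z.var()

    # Linear memory targets z(t) = u(t - k); summed over all independent
    # targets, such capacities are bounded by N, the number of state variables.
    for k in (1, 5, 10, 20):
        print(f"delay {k:2d}: capacity {capacity(np.roll(u, k)):.2f}")
    ```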

  7. Attachment affects social information processing: Specific electrophysiological effects of maternal stimuli.

    Science.gov (United States)

    Wu, Lili; Gu, Ruolei; Zhang, Jianxin

    2016-01-01

    Attachment is critical to each individual. It affects the cognitive-affective processing of social information. The present study examines how attachment affects the processing of social information, specifically maternal information. We assessed the behavioral and electrophysiological responses to maternal information (compared to non-specific others) in a Go/No-go Association Task (GNAT) with 22 participants. The results illustrated that attachment affected maternal information processing during three sequential stages of information processing. First, attachment affected visual perception, reflected by enhanced P100 and N170 elicited by maternal information as compared to others information. Second, compared to others, mother obtained more attentional resources, reflected by faster behavioral response to maternal information and larger P200 and P300. Finally, mother was evaluated positively, reflected by shorter P300 latency in a mother + good condition as compared to a mother + bad condition. These findings indicated that the processing of attachment-relevant information is neurologically differentiated from other types of social information from an early stage of perceptual processing to late high-level processing.

  8. Using measures of information content and complexity of time series as hydrologic metrics

    Science.gov (United States)

    Information theory has previously been used to develop metrics that allow temporal patterns in soil moisture dynamics to be characterized, and the performance of soil water flow models to be evaluated and compared. The objective of this study was to apply information and complexity measures to characte...
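
    Two of the simplest such measures are the Shannon entropy and the Lempel-Ziv complexity of a binarized series. The sketch below applies both to a synthetic "soil moisture" record (seasonal cycle plus red noise); the series, the median-threshold binarization, and the simple phrase-counting variant of Lempel-Ziv are illustrative assumptions rather than the study's actual metrics.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "soil moisture": seasonal cycle plus AR(1) red noise
    t = np.arange(2000)
    theta = 0.3 + 0.1 * np.sin(2 * np.pi * t / 365)
    noise = np.zeros(t.size)
    for i in range(1, t.size):
        noise[i] = 0.9 * noise[i - 1] + rng.normal(0, 0.01)
    series = theta + noise

    # Binarize around the median, then compute two simple metrics
    s = (series > np.median(series)).astype(int)

    p1 = s.mean()
    H = -sum(p * np.log2(p) for p in (p1, 1 - p1) if p > 0)  # entropy, bits/symbol

    def lz_phrases(seq):
        """Simple phrase-counting variant of Lempel-Ziv complexity."""
        string, phrases, i = "".join(map(str, seq)), set(), 0
        while i < len(string):
            j = i + 1
            while string[i:j] in phrases and j <= len(string):
                j += 1                 # extend until the phrase is new
            phrases.add(string[i:j])
            i = j
        return len(phrases)

    print(f"entropy {H:.3f} bits/symbol, LZ phrase count {lz_phrases(s)}")
    ```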

  9. Principles of neural information processing

    CERN Document Server

    Seelen, Werner v

    2016-01-01

    In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of neural information processing, as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely to organize behavior so as to ensure survival, together with an understanding of the evolutionary genesis of the brain. The principles and strategies developed include the self-organization of neural systems, flexibility, the active interpretation of the world by means of construction and prediction, and the embedding of neural systems in the world, all of which form the framework of the presented description. Since, in brains, partial self-organization, lifelong adaptation and the use of various methods of processing incoming information are all interconnected, the authors have chosen not only neurobiology and evolution theory as a basis for the elaboration of such a framework, but also syst...

  10. Exploring the dynamics of formal and informal networks in complex multi-team development projects

    DEFF Research Database (Denmark)

    Kratzer, J.; Gemuenden, H. G.; Lettl, Christopher

    2007-01-01

    The increasing number of complex multi-team projects and the scarcity of knowledge about how to run them successfully create a need for systematic empirical studies. We attempt to lessen this empirical gap by examining the overlap and structure of formally ascribed design interfaces and informal communication networks between participating teams in two complex multi-team projects in the space industry. We study the two projects longitudinally throughout the design and integration phases of product development. There are three major findings. First, formally ascribed design interfaces and informal communication networks overlap only marginally. Second, the structure of informal communication remains largely stable in the transition from the design to the integration phase. The third and most intriguing finding is that the weak overlap between formally ascribed design interfaces and the informal...

  11. Living is information processing: from molecules to global systems

    OpenAIRE

    Farnsworth, Keith D.; Nelson, John; Gershenson, Carlos

    2012-01-01

    We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function - to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, info...

  12. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new possibilities to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytics, a networked distributed Markov decision processes model with predicting states is p...
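
    The abstract only sketches the decision model, so the following is a generic illustration rather than the authors' networked formulation: a tiny finite MDP in which predicted congestion states are mapped to signal actions by value iteration. All states, transition probabilities, and costs are invented.

    ```python
    import numpy as np

    # Toy MDP for proactive signal control: states = predicted congestion
    # levels, actions = signal plans. Transitions and costs are invented.
    states, actions = 3, 2          # 0 free, 1 busy, 2 jammed; 0 keep, 1 extend green
    P = np.array([                  # P[a, s, s'] transition probabilities
        [[0.8, 0.2, 0.0], [0.3, 0.5, 0.2], [0.0, 0.4, 0.6]],   # keep plan
        [[0.9, 0.1, 0.0], [0.5, 0.4, 0.1], [0.1, 0.5, 0.4]],   # extend green
    ])
    R = np.array([0.0, -1.0, -5.0])  # reward (negative cost) of the next state

    gamma, V = 0.95, np.zeros(states)
    for _ in range(500):             # value iteration to convergence
        Q = (P * (R + gamma * V)).sum(axis=2)   # Q[a, s] Bellman backup
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < 1e-8:
            break
        V = V_new

    policy = Q.argmax(axis=0)        # best action for each predicted state
    print("V:", np.round(V, 2), "policy (per state):", policy)
    ```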

  13. Methodology for Simulation and Analysis of Complex Adaptive Supply Network Structure and Dynamics Using Information Theory

    Directory of Open Access Journals (Sweden)

    Joshua Rodewald

    2016-10-01

    Full Text Available Supply networks existing today in many industries can behave as complex adaptive systems, making them more difficult to analyze and assess. Being able to fully understand both the complex static and dynamic structures of a complex adaptive supply network (CASN) is key to making more informed management decisions and prioritizing resources and production throughout the network. Previous efforts to model and analyze CASNs have been impeded by the complex, dynamic nature of the systems. However, drawing from other complex adaptive systems sciences, information theory provides a model-free methodology removing many of those barriers, especially concerning complex network structure and dynamics. With minimal information about the network nodes, transfer entropy can be used to reverse engineer the network structure, while local transfer entropy can be used to analyze the network structure's dynamics. Both simulated and real-world networks were analyzed using this methodology. Applying the methodology to CASNs allows the practitioner to capitalize on observations from the highly multidisciplinary field of information theory, which provides insights into a CASN's self-organization, emergence, stability/instability, and distributed computation. This not only provides managers with a more thorough understanding of a system's structure and dynamics for management purposes, but also opens up research opportunities into eventual strategies to monitor and manage emergence and adaptation within the environment.
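
    To make the reverse-engineering step concrete, the following minimal Python sketch estimates pairwise transfer entropy for discrete time series by plug-in counting with history length 1. The two-node toy network and all names are illustrative assumptions, not the paper's code; a practical CASN analysis would use longer histories and significance testing.

    import numpy as np
    from collections import Counter

    def transfer_entropy(x, y):
        """Plug-in estimate of transfer entropy T(X -> Y) in bits, history length 1."""
        triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_prev, x_prev)
        pairs_yx = Counter(zip(y[:-1], x[:-1]))
        pairs_yy = Counter(zip(y[1:], y[:-1]))
        singles = Counter(y[:-1])
        n = len(y) - 1
        te = 0.0
        for (yn, yp, xp), c in triples.items():
            p_full = c / pairs_yx[(yp, xp)]             # p(y_next | y_prev, x_prev)
            p_self = pairs_yy[(yn, yp)] / singles[yp]   # p(y_next | y_prev)
            te += (c / n) * np.log2(p_full / p_self)
        return te

    # Toy two-node "supply network": node B copies node A with a one-step lag,
    # so information should flow A -> B but not B -> A.
    rng = np.random.default_rng(0)
    a = list(rng.integers(0, 2, 10000))
    b = a[-1:] + a[:-1]                                 # b_t = a_{t-1}
    print("T(A->B) =", round(transfer_entropy(a, b), 3))  # close to 1 bit
    print("T(B->A) =", round(transfer_entropy(b, a), 3))  # close to 0 bits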

  14. Expert (Peer) Reviews at the Waste Isolation Pilot Plant (WIPP): Making Complex Information and Decision Making Transparent

    International Nuclear Information System (INIS)

    Eriksson, Leif G.

    2001-01-01

    On the 18th of May 1998, based on the information provided by the United States Department of Energy (DOE) in support of the 1996 Waste Isolation Pilot Plant (WIPP) Compliance Certification Application, the U.S. Environmental Protection Agency certified that the proposed deep geological repository for disposal of long-lived, defense-generated, transuranic radioactive waste at the WIPP site in New Mexico, United States of America, was compliant with all applicable radioactive waste disposal regulations. Seven domestic and one joint international peer reviews commissioned by the DOE were instrumental in making complex scientific and engineering information, as well as the related WIPP decision-making process, both credible and transparent to the majority of affected and interested parties and, ultimately, to the regulator

  15. Expert (Peer) Reviews at the Waste Isolation Pilot Plant (WIPP): Making Complex Information and Decision Making Transparent

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, Leif G. [GRAM, Inc., Albuquerque, NM (United States)

    2001-07-01

    On the 18th of May 1998, based on the information provided by the United States Department of Energy (DOE) in support of the 1996 Waste Isolation Pilot Plant (WIPP) Compliance Certification Application, the U.S. Environmental Protection Agency certified that the proposed deep geological repository for disposal of long-lived, defense-generated, transuranic radioactive waste at the WIPP site in New Mexico, United States of America, was compliant with all applicable radioactive waste disposal regulations. Seven domestic and one joint international peer reviews commissioned by the DOE were instrumental in making complex scientific and engineering information, as well as the related WIPP decision-making process, both credible and transparent to the majority of affected and interested parties and, ultimately, to the regulator.

  16. Development of time sensitivity and information processing speed.

    Directory of Open Access Journals (Sweden)

    Sylvie Droit-Volet

    Full Text Available The aim of this study was to examine whether age-related changes in the speed of information processing are the best predictors of the increase in sensitivity to time throughout childhood. Children aged 5 and 8 years old, as well as adults, were given two temporal bisection tasks, one with short (0.5/1 s) and the other with longer (4/8 s) anchor durations. In addition, the participants' scores on different neuropsychological tests assessing both information processing speed and other dimensions of cognitive control (short-term memory, working memory, selective attention) were calculated. The results showed that the best predictor of individual variances in sensitivity to time was information processing speed, although working memory also accounted for some of the individual differences in time sensitivity, albeit to a lesser extent. In sum, the faster the information processing speed of the participants, the higher their sensitivity to time was. These results are discussed in the light of the idea that the development of temporal capacities has its roots in the maturation of the dynamic functioning of the brain.

  17. Splash, pop, sizzle: Information processing with phononic computing

    Directory of Open Access Journals (Sweden)

    Sophia R. Sklan

    2015-05-01

    Full Text Available Phonons, the quanta of mechanical vibration, are important to the transport of heat and sound in solid materials. Recent advances in the fundamental control of phonons (phononics have brought into prominence the potential role of phonons in information processing. In this review, the many directions of realizing phononic computing and information processing are examined. Given the relative similarity of vibrational transport at different length scales, the related fields of acoustic, phononic, and thermal information processing are all included, as are quantum and classical computer implementations. Connections are made between the fundamental questions in phonon transport and phononic control and the device level approach to diodes, transistors, memory, and logic.

  18. Theory of Neural Information Processing Systems

    International Nuclear Information System (INIS)

    Galla, Tobias

    2006-01-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10¹¹ neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  19. Information processing in the CNS: a supramolecular chemistry?

    Science.gov (United States)

    Tozzi, Arturo

    2015-10-01

    How does the central nervous system process information? Current theories are based on two tenets: (a) information is transmitted by action potentials, the language by which neurons communicate with each other; and (b) homogeneous neuronal assemblies of cortical circuits operate on these neuronal messages, where the operations are characterized by the intrinsic connectivity among neuronal populations. In this view, the size and time course of any spike is stereotypic and the information is restricted to the temporal sequence of the spikes; namely, the "neural code". However, an increasing amount of novel data points towards an alternative hypothesis: (a) the role of the neural code in information processing is overemphasized. Instead of simply passing messages, action potentials play a role in dynamic coordination at multiple spatial and temporal scales, establishing network interactions across several levels of a hierarchical modular architecture, modulating and regulating the propagation of neuronal messages. (b) Information is processed at all levels of neuronal infrastructure, from macromolecules to population dynamics. For example, intra-neuronal (changes in protein conformation, concentration and synthesis) and extra-neuronal factors (extracellular proteolysis, substrate patterning, myelin plasticity, microbes, metabolic status) can have a profound effect on neuronal computations. This means molecular message passing may have cognitive connotations. This essay introduces the concept of "supramolecular chemistry", involving the storage of information at the molecular level and its retrieval, transfer and processing at the supramolecular level, through transitory non-covalent molecular processes that are self-organized, self-assembled and dynamic. Finally, we note that the cortex comprises extremely heterogeneous cells, with distinct regional variations, macromolecular assembly, receptor repertoire and intrinsic microcircuitry. This suggests that every neuron (or group of

  20. Selective exposure to information: how different modes of decision making affect subsequent confirmatory information processing.

    Science.gov (United States)

    Fischer, Peter; Fischer, Julia; Weisweiler, Silke; Frey, Dieter

    2010-12-01

    We investigated whether different modes of decision making (deliberate, intuitive, distracted) affect subsequent confirmatory processing of decision-consistent and inconsistent information. Participants showed higher levels of confirmatory information processing when they made a deliberate or an intuitive decision versus a decision under distraction (Studies 1 and 2). As soon as participants have a cognitive (i.e., deliberate cognitive analysis) or affective (i.e., intuitive and gut feeling) reason for their decision, the subjective confidence in the validity of their decision increases, which results in increased levels of confirmatory information processing (Study 2). In contrast, when participants are distracted during decision making, they are less certain about the validity of their decision and thus are subsequently more balanced in the processing of decision-relevant information.

  1. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

    Earlier described algorithms for the automated processing of complex α-spectra of actinides, using the Ehlektronika D3-28 computer line connected to an ICA-070 multichannel amplitude pulse analyzer, were implemented. The developed program calculates peak intensities and relative isotope content, performs energy calibration of spectra, computes peak centers of gravity and energy resolution, and performs integral counting in a selected part of the spectrum. The error of the automated processing method depends on the degree of spectrum complexity and lies within the limits of 1-12%. 8 refs.; 4 figs.; 2 tabs
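
    A minimal sketch of the kinds of operations the program automates: peak center-of-gravity calculation, linear energy calibration against reference peaks, and integral counting over part of the spectrum. The synthetic spectrum, channel windows and reference energies are illustrative assumptions, not values from the article.

    import numpy as np

    # Illustrative 1024-channel alpha spectrum: two Gaussian peaks on a flat
    # background (positions, widths and heights are invented).
    rng = np.random.default_rng(1)
    ch = np.arange(1024)
    truth = (200 * np.exp(-0.5 * ((ch - 412) / 6.0) ** 2)
             + 120 * np.exp(-0.5 * ((ch - 700) / 6.0) ** 2) + 2)
    spectrum = rng.poisson(truth)

    def peak_centroid(counts, lo, hi):
        """Center of gravity of a peak over the channel window [lo, hi)."""
        w = counts[lo:hi].astype(float)
        return (np.arange(lo, hi) * w).sum() / w.sum()

    def integral_counts(counts, lo, hi):
        """Integral counting in a chosen part of the spectrum."""
        return int(counts[lo:hi].sum())

    # Linear channel-to-energy calibration from two reference peaks; the keV
    # values are placeholder alpha-line energies, not the article's isotopes.
    c1, c2 = peak_centroid(spectrum, 380, 445), peak_centroid(spectrum, 670, 730)
    slope, intercept = np.polyfit([c1, c2], [5156.6, 5485.6], 1)
    print(f"peak 1: centroid {c1:.1f} ch, {slope * c1 + intercept:.1f} keV, "
          f"area {integral_counts(spectrum, 380, 445)} counts")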

  2. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Science.gov (United States)

    2010-01-01

    10 CFR 1017.28 (Title 10, Energy), Unclassified Controlled Nuclear Information, Physical Protection Requirements, § 1017.28 Processing on Automated Information Systems (AIS): UCNI may be processed or produced on any AIS that complies with the guidance in OMB...

  3. ENERGETIC CHARGE OF AN INFORMATION PROCESS

    Directory of Open Access Journals (Sweden)

    Popova T.M.

    2009-12-01

    Full Text Available The main laws of technical thermodynamics are universal and can be applied to processes other than thermodynamic ones. The article presents the results of a comparison of the peculiarities of irreversible informational and thermodynamic processes and introduces a new term, "infopy". A more precise definition of infopy as an energetic charge is given in the article.

  4. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    Science.gov (United States)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and improve the operation of critical, complex national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.

  5. Barriers to data quality resulting from the process of coding health information to administrative data: a qualitative study.

    Science.gov (United States)

    Lucyk, Kelsey; Tang, Karen; Quan, Hude

    2017-11-22

    Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility for longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality, and that chart information is accurately coded to this end. Our objective is to explore the potential barriers that exist to high-quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high-quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high-quality coding in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.

  6. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  7. The Generalization of Mutual Information as the Information between a Set of Variables: The Information Correlation Function Hierarchy and the Information Structure of Multi-Agent Systems

    Science.gov (United States)

    Wolf, David R.

    2004-01-01

    The topic of this paper is a hierarchy of information-like functions, here named the information correlation functions, where each function of the hierarchy may be thought of as the information between the variables it depends upon. The information correlation functions are particularly suited to the description of the emergence of complex behaviors due to many-body or many-agent processes. They are particularly well suited to the quantification of the decomposition of the information carried among a set of variables or agents, and its subsets. In more graphical language, they provide the information theoretic basis for understanding the synergistic and non-synergistic components of a system, and as such should serve as a forceful toolkit for the analysis of the complexity structure of complex many agent systems. The information correlation functions are the natural generalization to an arbitrary number of sets of variables of the sequence starting with the entropy function (one set of variables) and the mutual information function (two sets). We start by describing the traditional measures of information (entropy) and mutual information.
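
    The construction can be stated compactly: each information correlation function is an inclusion-exclusion sum of subset entropies, reducing to entropy for one variable and to mutual information for two. A minimal sketch (our sign convention, with an XOR example of our choosing) shows how a negative value flags purely synergistic information among three variables.

    import numpy as np
    from itertools import combinations

    def entropy(p):
        """Shannon entropy in bits of a distribution stored as an array."""
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def info_correlation(joint):
        """n-variable co-information by inclusion-exclusion over subset
        entropies: I(X1;...;Xn) = sum over nonempty S of (-1)^(|S|+1) H(S)."""
        n = joint.ndim
        total = 0.0
        for k in range(1, n + 1):
            for s in combinations(range(n), k):
                drop = tuple(i for i in range(n) if i not in s)
                total += (-1) ** (k + 1) * entropy(joint.sum(axis=drop))
        return total

    # XOR triple: Z = X xor Y with X, Y fair coins. Every pairwise mutual
    # information is zero, yet the three variables jointly carry one bit;
    # the co-information of -1 flags that purely synergistic component.
    joint = np.zeros((2, 2, 2))
    for x in (0, 1):
        for y in (0, 1):
            joint[x, y, x ^ y] = 0.25
    print(info_correlation(joint))  # -1.0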

  8. Just-in-time information presentation and the acquisition of complex cognitive skills

    NARCIS (Netherlands)

    Kester, Liesbeth; Kirschner, Paul A.; Van Merriënboer, Jeroen; Bäumer, Anita

    2008-01-01

    Kester, L., Kirschner, P., van Merriënboer, J. J. G., & Bäumer, A. (2001). Just-in-time information presentation and the acquisition of complex cognitive skills. Computers in Human Behavior, 17, 373-391.

  9. On the fragmentation of process information : challenges, solutions, and outlook

    NARCIS (Netherlands)

    Aa, van der J.H.; Leopold, H.; Mannhardt, F.; Reijers, H.A.; Gaaloul, K.; Schmidt, R.; Nurcan, S.; Guerreiro, S.; Ma, Q.

    2015-01-01

    An organization’s knowledge on its business processes represents valuable corporate knowledge because it can be used to enhance the performance of these processes. In many organizations, documentation of process knowledge is scattered around various process information sources. Such information

  10. Supporting risk-informed decisions during business process execution

    NARCIS (Netherlands)

    Conforti, R.; Leoni, de M.; La Rosa, M.; Aalst, van der W.M.P.; Salinesi, C.; Norrie, M.C.; Pastor, O.

    2013-01-01

    This paper proposes a technique that supports process participants in making risk-informed decisions, with the aim to reduce the process risks. Risk reduction involves decreasing the likelihood and severity of a process fault from occurring. Given a process exposed to risks, e.g. a financial process

  11. BRAND program complex

    International Nuclear Information System (INIS)

    Androsenko, A.A.; Androsenko, P.A.

    1983-01-01

    A description is given of the structure, input procedure and recording rules of initial data for the BRAND program complex, intended for the Monte Carlo simulation of neutron physics experiments. The BRAND complex is based on non-analog simulation of the neutron and photon transport process (statistical weights are used; absorption and escape of particles from the considered region are taken into account; shifted readouts from the coordinate part of the transition kernel density, local estimations, etc. are applied). The preparation of initial data is described in detail for three sections: general information for the Monte Carlo calculation, source definition, and data describing the geometry of the system. The complex runs on the BESM-6 computer; the basic programming language is FORTRAN, and the program volume is more than 8000 statements

  12. Car monitoring information systems

    Directory of Open Access Journals (Sweden)

    Alica KALAŠOVÁ

    2008-01-01

    Full Text Available The objective of this contribution is to characterize alternative information systems used for managing, processing and evaluating information related to company vehicles. We focus especially on logging, transferring and processing information on on-road vehicle movement in inland and international transportation. This segment of the company information system has to monitor car movement, actively or passively, according to the company's needs, and after processing it has to provide the controller with a comprehensive overview of the status of all company vehicles.

  13. [Information processing speed and influential factors in multiple sclerosis].

    Science.gov (United States)

    Zhang, M L; Xu, E H; Dong, H Q; Zhang, J W

    2016-04-19

    To study information processing speed and its influential factors in multiple sclerosis (MS) patients. A total of 36 patients with relapsing-remitting MS (RRMS), 21 patients with secondary progressive MS (SPMS), and 50 healthy control subjects from Xuanwu Hospital of Capital Medical University between April 2010 and April 2012 were included in this cross-sectional study. Neuropsychological tests were conducted after the disease had been stable for 8 weeks, covering information processing speed, memory, executive functions, language and visual perception. Correlations between information processing speed and depression, fatigue and the Expanded Disability Status Scale (EDSS) were studied. (1) MS patient groups demonstrated cognitive deficits compared to healthy controls. Symbol Digit Modalities Test (SDMT) scores differed significantly across groups (control group 57±12; RRMS group 46±17; SPMS group 35±10). In sum, impairments of information processing speed, verbal memory and executive functioning are seen in MS patients, especially in the SPMS subtype, while visual-spatial function is relatively preserved. Age, white matter change scales, EDSS scores and depression are negatively associated with information processing speed.

  14. Mental Status Documentation: Information Quality and Data Processes.

    Science.gov (United States)

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk prevention, identification, communication and mitigation of harm.

  15. Kuhlthau’s Classic Research on the Information Search Process (ISP) Provides Evidence for Information Seeking as a Constructivist Process. A review of: Kuhlthau, Carol C. “Inside the Search Process: Information Seeking from the User's Perspective.” Journal of the American Society for Information Science 42.5 (1991): 361-71.

    Directory of Open Access Journals (Sweden)

    Shelagh K. Genuis

    2007-12-01

    general topic. A turning point occurs during focus formulation as constructs become clearer and uncertainty decreases. During information collection the user is able to articulate focused need and is able to interact effectively with intermediaries and systems. Relief is commonly experienced at presentation stage when findings are presented or used. Although stages are laid out sequentially, Kuhlthau notes that the ISP is an iterative process in which stages merge and overlap. Central to this model is the premise that uncertainty is not due merely to a lack of familiarity with sources and technologies, but is an integral and critical part of a process of learning that culminates in finding meaning through personal synthesis of topic or problem. Conclusion – Kuhlthau provides evidence for a view of information seeking as an evolving, iterative process and presents a model for purposeful information searching which, if understood by users, intermediaries and information system designers, provides a basis for productive interaction. While users will benefit from understanding the evolving nature of focus formulation and the affective dimensions of information seeking, intermediaries and systems are challenged to improve information provision in the early formative stages of a search. Although Kuhlthau identifies this research on the ISP as exploratory in nature, this article affords methodological insight into the use of mixed methods for exploring complex user-oriented issues, presents a model that effectively communicates an approximation of the common information-seeking process of users, and provides ongoing impetus for exploring the user's perspective on information seeking.

  16. Complex Ornament Machining Process on a CNC Router

    Directory of Open Access Journals (Sweden)

    Camelia COŞEREANU

    2014-03-01

    Full Text Available The paper investigates the CNC routing possibilities for three species of wood, namely ash (Fraxinus excelsior), lime wood (Tilia cordata) and fir wood (Abies alba), in order to obtain high-quality surfaces for Art Nouveau sculptured ornaments. Given the complexity of the CNC tool path for producing the wavy shapes of Art Nouveau decorations, the choice of processing parameters for each wood species requires laborious research work to correlate these parameters. Two Art Nouveau ornaments are proposed for the investigation. They are CNC routed using two types of cutting tools. The processing parameters, namely spindle speed, feed speed and depth of cut, were the three variables of the machining process for the three wood species, combined so as to provide good surface finish as a quality attribute. In total, forty-six combinations of the processing parameters were applied in CNC routing of the samples made of the three wood species. Finally, an optimum combination of the processing parameters is recommended for each wood species.

  17. Information processing capacity in psychopathy: Effects of anomalous attention.

    Science.gov (United States)

    Hamilton, Rachel K B; Newman, Joseph P

    2018-03-01

    Hamilton and colleagues (2015) recently proposed that an integrative deficit in psychopathy restricts simultaneous processing, thereby leaving fewer resources available for information encoding, narrowing the scope of attention, and undermining associative processing. The current study evaluated this parallel processing deficit proposal using the Simultaneous-Sequential paradigm. This investigation marks the first a priori test of Hamilton et al.'s theoretical framework. We predicted that psychopathy would be associated with inferior performance (as indexed by lower accuracy and longer response time) on trials requiring simultaneous processing of visual information relative to trials necessitating sequential processing. Results were consistent with these predictions, supporting the proposal that psychopathy is characterized by a reduced capacity to process multicomponent perceptual information concurrently. We discuss the potential implications of impaired simultaneous processing for the conceptualization of the psychopathic deficit. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Weather Information Processing

    Science.gov (United States)

    1991-01-01

    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  19. Value Assessment Frameworks for HTA Agencies: The Organization of Evidence-Informed Deliberative Processes.

    Science.gov (United States)

    Baltussen, Rob; Jansen, Maarten Paul Maria; Bijlmakers, Leon; Grutters, Janneke; Kluytmans, Anouck; Reuzel, Rob P; Tummers, Marcia; der Wilt, Gert Jan van

    2017-02-01

    Priority setting in health care has long been recognized as an intrinsically complex and value-laden process. Yet, health technology assessment (HTA) agencies presently employ value assessment frameworks that are ill fitted to capture the range and diversity of stakeholder values and thereby risk compromising the legitimacy of their recommendations. We propose "evidence-informed deliberative processes" as an alternative framework with the aim to enhance this legitimacy. This framework integrates two increasingly popular and complementary frameworks for priority setting: multicriteria decision analysis and accountability for reasonableness. Evidence-informed deliberative processes are, on one hand, based on early, continued stakeholder deliberation to learn about the importance of relevant social values. On the other hand, they are based on rational decision-making through evidence-informed evaluation of the identified values. The framework has important implications for how HTA agencies should ideally organize their processes. First, HTA agencies should take the responsibility of organizing stakeholder involvement. Second, agencies are advised to integrate their assessment and appraisal phases, allowing for the timely collection of evidence on values that are considered relevant. Third, HTA agencies should subject their decision-making criteria to public scrutiny. Fourth, agencies are advised to use a checklist of potentially relevant criteria and to provide argumentation for how each criterion affected the recommendation. Fifth, HTA agencies must publish their argumentation and install options for appeal. The framework should not be considered a blueprint for HTA agencies but rather an aspirational goal; agencies can take incremental steps toward achieving it. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
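
    The evaluation half of the framework rests on multicriteria decision analysis (MCDA). The sketch below is a purely hypothetical weighted-sum scoring step with invented criteria, weights and scores; in the framework the weights would emerge from the stakeholder deliberation described above, and the deliberative half has no code analogue.

    # Hypothetical MCDA value measurement for two health technologies.
    criteria = ["effectiveness", "cost_effectiveness", "disease_severity", "equity"]
    weights = {"effectiveness": 0.4, "cost_effectiveness": 0.3,
               "disease_severity": 0.2, "equity": 0.1}   # elicited; sum to 1
    scores = {  # each criterion scored 0-10 from the assessment evidence
        "technology_A": {"effectiveness": 8, "cost_effectiveness": 5,
                         "disease_severity": 9, "equity": 4},
        "technology_B": {"effectiveness": 6, "cost_effectiveness": 8,
                         "disease_severity": 5, "equity": 7},
    }
    for tech, s in scores.items():
        value = sum(weights[c] * s[c] for c in criteria)
        print(f"{tech}: weighted value = {value:.2f}")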

  20. Scalable Networked Information Processing Environment (SNIPE)

    Energy Technology Data Exchange (ETDEWEB)

    Fagg, G.E.; Moore, K. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]|[Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Geist, A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  1. Expectancy-Violation and Information-Theoretic Models of Melodic Complexity

    Directory of Open Access Journals (Sweden)

    Tuomas Eerola

    2016-07-01

    Full Text Available The present study assesses two types of models for melodic complexity: one based on expectancy violations and the other related to an information-theoretic account of redundancy in music. Seven different datasets spanning artificial sequences, folk and pop songs were used to refine and assess the models. The refinement eliminated unnecessary components from both types of models. The final analysis pitted three variants of the two model types against each other and could explain 46-74% of the variance in the ratings across the datasets. The most parsimonious models were identified with an information-theoretic criterion. This suggested that the simplified expectancy-violation models were the most efficient for these sets of data. However, the differences between all optimized models were subtle in terms of both performance and simplicity.
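
    As a toy illustration of the redundancy idea behind the information-theoretic model class, the sketch below scores a melody by the Shannon entropy of its pitch-interval distribution: a repetitive melody is more redundant and scores lower. This simplification is ours for illustration only and is not one of the paper's optimized models.

    import numpy as np
    from collections import Counter

    def interval_entropy(midi_pitches):
        """Toy redundancy-based complexity: Shannon entropy (bits) of the
        melody's pitch-interval distribution; higher means less redundant."""
        counts = np.array(list(Counter(np.diff(midi_pitches)).values()), float)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    print(interval_entropy([60, 62, 64, 62, 60, 62, 64, 62]))  # repetitive: ~1.0
    print(interval_entropy([60, 67, 63, 70, 58, 69, 61, 72]))  # jumpy: ~2.2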

  2. Complex processing of antimony-mercury gold concentrates of Dzhizhikrut Deposit

    International Nuclear Information System (INIS)

    Abdusalyamova, M.N.; Gadoev, S.A.; Dreisinger, D.; Solozhenkin, P.M.

    2013-01-01

    This article is devoted to the complex processing of antimony-mercury gold concentrates from the Dzhizhikrut deposit. The purpose of the research was to obtain metallic mercury and antimony, with subsequent extraction of gold and thallium.

  3. A method for work modeling at complex systems: towards applying information systems in family health care units.

    Science.gov (United States)

    Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques

    2012-01-01

    Work in organizations requires a minimum level of consensus on the understanding of the practices performed. To adopt technological devices to support activities in environments where work is complex, characterized by the interdependence among a large number of variables, understanding how work is done not only takes on even greater importance, but also becomes a more difficult task. This study therefore aims to present a method for modeling work in complex systems, which improves knowledge about the way activities are performed where they do not simply happen by following procedures. Uniting techniques of Cognitive Task Analysis with the concept of Work Process, this work seeks to provide a method capable of giving a detailed and accurate view of how people perform their tasks, in order to apply information systems for supporting work in organizations.

  4. COMPLEX SIMULATION MODEL OF TRAIN BREAKING-UP PROCESS AT THE HUMPS

    Directory of Open Access Journals (Sweden)

    E. B. Demchenko

    2015-11-01

    Full Text Available Purpose. One of the priorities in improving the functioning of a station's sorting complex is reducing the energy consumed by the breaking-up process, namely fuel consumption for train pushing and electric energy consumption for cut braking. An effective solution to the problem of reducing energy consumption in the breaking-up subsystem therefore requires comprehensive treatment of the train pushing and cut rolling processes together. At the same time, the analysis showed that the tasks of improving the pushing process and increasing the effectiveness of cut rolling are currently solved separately. To solve this problem it is necessary to develop a complex simulation model of the train breaking-up process at humps. Methodology. The pushing process was simulated on the basis of traction calculations adapted to shunting conditions, taking into account the features of shunting locomotive operation at humps. In order to realize the current pushing mode, a special algorithm for hump locomotive control was applied which, along with the requirements of safe shunting operation, takes into account behavioral factors associated with the engineer's control actions. This algorithm provides smooth train acceleration and further movement at a speed close to the set one. Hump locomotive fuel consumption was determined from the mechanical work performed by locomotive traction. Findings. A simulation model of the train pushing process was developed and combined with the existing cut rolling model. The cut initial velocity is determined during simulation and is then used for modeling the rolling process. In addition, the modeling resulted in a sufficiently accurate determination of the fuel consumed in train breaking-up. Originality. The simulation model of the train breaking-up process at humps, in contrast to existing models, allows all the elements of this process to be reproduced in detail

  5. 1st International Conference on Cognitive Systems and Information Processing

    CERN Document Server

    Hu, Dewen; Liu, Huaping

    2014-01-01

    "Foundations and Practical Applications of Cognitive Systems and Information Processing" presents selected papers from the First International Conference on Cognitive Systems and Information Processing, held in Beijing, China on December 15-17, 2012 (CSIP2012). The aim of this conference is to bring together experts from different fields of expertise to discuss the state-of-the-art in artificial cognitive systems and advanced information processing, and to present new findings and perspectives on future development. This book introduces multidisciplinary perspectives on the subject areas of Cognitive Systems and Information Processing, including cognitive sciences and technology, autonomous vehicles, cognitive psychology, cognitive metrics, information fusion, image/video understanding, brain-computer interfaces, visual cognitive processing, neural computation, bioinformatics, etc. The book will be beneficial for both researchers and practitioners in the fields of Cognitive Science, Computer Science and Cogni...

  6. Complexity Level Analysis Revisited: What Can 30 Years of Hindsight Tell Us about How the Brain Might Represent Visual Information?

    Directory of Open Access Journals (Sweden)

    John K. Tsotsos

    2017-08-01

    Full Text Available Much has been written about how the biological brain might represent and process visual information, and how this might inspire and inform machine vision systems. Indeed, tremendous progress has been made, especially during the last decade in the latter area. However, a key question seems all too often, if not mostly, to be ignored. This question is simply: do proposed solutions scale with the reality of the brain's resources? This scaling question applies equally to brain and to machine solutions. A number of papers have examined the inherent computational difficulty of visual information processing using theoretical and empirical methods. The main goal of this activity had three components: to understand the deep nature of the computational problem of visual information processing; to discover how well the computational difficulty of vision matches the fixed resources of biological seeing systems; and, to abstract from the matching exercise the key principles that lead to the observed characteristics of biological visual performance. This set of components was termed complexity level analysis in Tsotsos (1987) and was proposed as an important complement to Marr's three levels of analysis. This paper revisits that work with the advantage that decades of hindsight can provide.

  7. Working memory capacity and redundant information processing efficiency.

    Science.gov (United States)

    Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R

    2015-01-01

    Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.
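
    For readers unfamiliar with the Linear Ballistic Accumulator, a minimal forward simulation (with illustrative parameter values, not the study's fitted estimates) shows how a drift-rate difference between accumulators translates into the accuracy and response-time patterns that the authors read as processing efficiency.

    import numpy as np

    def lba_trial(drifts, b=1.0, A=0.5, s=0.25, t0=0.2, rng=None):
        """One LBA trial: each accumulator starts at k ~ U(0, A) and rises
        linearly at rate d ~ N(drift, s) toward threshold b; the first to
        reach b gives the response, at time (b - k) / d + t0."""
        rng = rng or np.random.default_rng()
        k = rng.uniform(0, A, size=len(drifts))
        d = rng.normal(drifts, s)
        d = np.where(d > 0, d, 1e-6)       # crude guard against negative rates
        t = (b - k) / d + t0
        i = int(np.argmin(t))
        return i, float(t[i])

    # The correct response has the higher mean drift, so it wins more often
    # and faster; lowering its drift mimics less efficient memory search.
    rng = np.random.default_rng(42)
    trials = [lba_trial([1.2, 0.8], rng=rng) for _ in range(10000)]
    print("accuracy:", np.mean([c == 0 for c, _ in trials]))
    print("mean RT :", np.mean([t for _, t in trials]))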

  8. Design and analysis of information model hotel complex

    Directory of Open Access Journals (Sweden)

    Garyaev Nikolai

    2016-01-01

    Full Text Available The article analyzes innovations in 3D modeling and the development of process design approaches based on information visualization technology and computer-aided design systems, as well as the problems arising in modern design and approaches to addressing them.

  9. Study of relaxation processes in 'metal-ligand' complexes after electron capture decay using the 111In isotope as an example

    International Nuclear Information System (INIS)

    Shpinkova, L.G.; Golubeva, A.S.; Ryasny, O.K.; Nikitin, S.M.; Sorokin, A.A.; Uzbyakova, A.S.

    2003-01-01

    Full text: Complexes of metals with organic ligands are widely used in different scientific and industrial applications, such as analytical chemistry, oil production and refinery, power engineering, water treatment, agriculture, etc. Several hundred complexes are commercially available, and intensive work continues on synthesizing new complexones in order to obtain complexes with required properties. In this connection, it is important to investigate the molecular characteristics and structures of complexes and to correlate them with their properties. One method which has proved to provide useful information about the behaviour of metal-ligand complexes is time differential perturbed angular γγ-correlation (TDPAC). Numerous works have been devoted to studies of different complexes by this technique. At the Institute of Nuclear Physics, Lomonosov Moscow State University, Moscow, this method was applied to studies of electron capture (EC) after-effects and their influence on indium-ligand complexes in aqueous solutions. The present work is devoted to studies of relaxation processes in daughter 111Cd-ligand complexes formed after 111In EC decay. EC leaves a hole in an inner electronic shell of the daughter atom, which is followed by an Auger process. This process leads to a highly excited state of the daughter atom, which either causes disintegration of the complex into small fragments or relaxation to a stable complex with the daughter atom. TDPAC measurements were performed for a number of 111In complexes with acetic and phosphonic ligands. All measurements were performed on neutral aqueous solutions of complexes at room temperature. Three types of molecules containing radioactive daughter 111Cd atoms were observed after 111In decay for all studied complexes. One fraction corresponds to intact complexes, the second to fully disintegrated complexes. The third fraction was characterized by a fast relaxation parameter indicating high transient

  10. Information processing speed in obstructive sleep apnea syndrome: a review.

    Science.gov (United States)

    Kilpinen, R; Saunamäki, T; Jehkonen, M

    2014-04-01

    To provide a comprehensive review of studies on information processing speed in patients with obstructive sleep apnea syndrome (OSAS) as compared to healthy controls and normative data, and to determine whether continuous positive airway pressure (CPAP) treatment improves information processing speed. A systematic review was performed on studies drawn from Medline and PsycINFO (January 1990-December 2011) and identified from lists of references in these studies. After applying inclusion criteria, 159 articles were left for abstract review, and after applying exclusion criteria 44 articles were fully reviewed. The number of patients in the studies reviewed ranged from 10 to 157, and the study samples consisted mainly of men. Half of the studies reported that patients with OSAS showed reduced information processing speed when compared to healthy controls. Reduced information processing speed was seen more often (75%) when compared to norm-referenced data. Psychomotor speed seemed to be particularly liable to change. CPAP treatment improved processing speed, but the improvement was marginal when compared to placebo or conservative treatment. Patients with OSAS are affected by reduced information processing speed, which may persist despite CPAP treatment. Information processing is usually assessed as part of other cognitive functioning, not as a cognitive domain per se. However, it is important to take account of information processing speed when assessing other aspects of cognitive functioning. This will make it possible to determine whether cognitive decline in patients with OSAS is based on lower-level or higher-level cognitive processes or both. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Reinforcing Visual Grouping Cues to Communicate Complex Informational Structure.

    Science.gov (United States)

    Bae, Juhee; Watson, Benjamin

    2014-12-01

    In his book Multimedia Learning [7], Richard Mayer asserts that viewers learn best from imagery that provides them with cues to help them organize new information into the correct knowledge structures. Designers have long been exploiting the Gestalt laws of visual grouping to deliver viewers those cues using visual hierarchy, often communicating structures much more complex than the simple organizations studied in psychological research. Unfortunately, designers are largely practical in their work, and have not paused to build a complex theory of structural communication. If we are to build a tool to help novices create effective and well structured visuals, we need a better understanding of how to create them. Our work takes a first step toward addressing this lack, studying how five of the many grouping cues (proximity, color similarity, common region, connectivity, and alignment) can be effectively combined to communicate structured text and imagery from real world examples. To measure the effectiveness of this structural communication, we applied a digital version of card sorting, a method widely used in anthropology and cognitive science to extract cognitive structures. We then used tree edit distance to measure the difference between perceived and communicated structures. Our most significant findings are: 1) with careful design, complex structure can be communicated clearly; 2) communicating complex structure is best done with multiple reinforcing grouping cues; 3) common region (use of containers such as boxes) is particularly effective at communicating structure; and 4) alignment is a weak structural communicator.
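
    The gap between perceived and communicated structures was measured with tree edit distance. A small sketch using the third-party zss package (an implementation of the Zhang-Shasha ordered-tree edit distance) illustrates the measurement; the trees and labels are invented stand-ins for card-sort results, not the study's stimuli.

    # pip install zss
    from zss import Node, simple_distance

    # Structure the designer intended to communicate...
    communicated = (Node("page")
                    .addkid(Node("header"))
                    .addkid(Node("products")
                            .addkid(Node("laptops"))
                            .addkid(Node("phones"))))
    # ...vs. the structure one viewer reconstructed in the card sort,
    # flattening the "products" group.
    perceived = (Node("page")
                 .addkid(Node("header"))
                 .addkid(Node("laptops"))
                 .addkid(Node("phones")))

    # 0 would mean the viewer recovered the intended structure exactly;
    # here a single node deletion separates the trees.
    print(simple_distance(communicated, perceived))  # 1.0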

  12. Strategic-Decision Quality in Public Organizations: An Information Processing Perspective.

    NARCIS (Netherlands)

    B.R.J. George (Bert); S. Desmidt (Sebastian)

    2016-01-01

    This study draws on information processing theory to investigate predictors of strategic-decision quality in public organizations. Information processing theory argues that (a) rational planning practices contribute to strategic-decision quality by injecting information into decision

  13. Membrane Processes Based on Complexation Reactions of Pollutants as Sustainable Wastewater Treatments

    Directory of Open Access Journals (Sweden)

    Teresa Poerio

    2009-11-01

    Full Text Available Water is today considered a vital and limited resource due to industrial development and population growth. Developing appropriate water treatment techniques to ensure sustainable management represents a key point in worldwide strategies. Techniques based on coupling membrane processes with appropriate complexing agents that bind pollutants, removing both organic and inorganic species, are very important alternatives to classical separation processes in water treatment. Supported Liquid Membrane (SLM) and Complexation Ultrafiltration (CP-UF) based processes meet the sustainability criteria because they require low amounts of energy compared to pressure-driven membrane processes and low amounts of complexing agents, and they allow recovery of water and some pollutants (e.g., metals). A more interesting process from the application point of view is the Stagnant Sandwich Liquid Membrane (SSwLM), introduced as an SLM implementation. It has been studied in the separation of the drug gemfibrozil (GEM) and of copper(II) as organic and inorganic pollutants in water. The results obtained showed in both cases the higher efficiency of SSwLM with respect to the SLM configuration: higher stability (335.5 vs. 23.5 hours for GEM; 182.7 vs. 49.2 for copper(II)) and higher fluxes (0.662 vs. 0.302 mmol·h-1·m-2 for GEM; 43.3 vs. 31.0 for copper(II)) were obtained using the SSwLM. Concerning the CP-UF process, its feasibility was studied in the separation of metals from waters (e.g., from soil washing), giving particular attention to process sustainability aspects such as water and polymer recycling and the recovery of free metal and water. The selectivity of the CP-UF process was also validated in the separate removal of copper(II) and nickel(II), both contained in synthetic and real aqueous effluents. Thus, complexation reactions involved in the SSwLM and CP-UF processes play a key role in meeting the sustainability criteria.

  14. Optimal Information Processing in Biochemical Networks

    Science.gov (United States)

    Wiggins, Chris

    2012-02-01

    A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network, for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the 'intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrative cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
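
    A classical tool for the numerical optimization mentioned here is the Blahut-Arimoto algorithm, which finds the input distribution maximizing information transmission through a fixed noisy channel. The generic sketch below is ours, not the speaker's code; it recovers the textbook capacity of a binary symmetric channel.

    import numpy as np

    def blahut_arimoto(P, iters=200):
        """Capacity max_r I(X;Y) of channel P[x, y] = p(y|x) via
        Blahut-Arimoto iteration; returns capacity (bits) and optimal r(x)."""
        nx, _ = P.shape
        r = np.full(nx, 1.0 / nx)
        for _ in range(iters):
            q = r[:, None] * P                    # proportional to r(x) p(y|x)
            q /= q.sum(axis=0, keepdims=True)     # posterior q(x|y)
            log_r = (P * np.log(q + 1e-300)).sum(axis=1)
            r = np.exp(log_r - log_r.max())
            r /= r.sum()
        m = r @ P                                 # output marginal
        ratio = np.where(P > 0, P / m, 1.0)
        return float((r[:, None] * P * np.log2(ratio)).sum()), r

    # Binary symmetric channel, crossover 0.1: capacity = 1 - H(0.1) ~ 0.531.
    C, r_opt = blahut_arimoto(np.array([[0.9, 0.1], [0.1, 0.9]]))
    print(f"capacity {C:.3f} bits at input distribution {r_opt}")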

  15. Information Design for Synchronization and Co-ordination of Modern, Complex, Multi-National Operations

    Science.gov (United States)

    2011-06-01

    16th ICCRTS ("Collective C2 in..."), June 2011: Information design for synchronization and co-ordination of modern, complex, multi-national operations. ...at 11th ICCRTS) who emphasise that information needs to be designed, not merely found or catalogued, to achieve synchronizations and co-ordinations

  16. Effects of spectral complexity and sound duration on automatic complex-sound pitch processing in humans - a mismatch negativity study.

    Science.gov (United States)

    Tervaniemi, M; Schröger, E; Saher, M; Näätänen, R

    2000-08-18

    The pitch of a spectrally rich sound is known to be more easily perceived than that of a sinusoidal tone. The present study compared the importance of spectral complexity and sound duration in facilitated pitch discrimination. The mismatch negativity (MMN), which reflects automatic neural discrimination, was recorded to a 2. 5% pitch change in pure tones with only one sinusoidal frequency component (500 Hz) and in spectrally rich tones with three (500-1500 Hz) and five (500-2500 Hz) harmonic partials. During the recordings, subjects concentrated on watching a silent movie. In separate blocks, stimuli were of 100 and 250 ms in duration. The MMN amplitude was enhanced with both spectrally rich sounds when compared with pure tones. The prolonged sound duration did not significantly enhance the MMN. This suggests that increased spectral rather than temporal information facilitates pitch processing of spectrally rich sounds.

  17. LanguageNet: A Novel Framework for Processing Unstructured Text Information

    DEFF Research Database (Denmark)

    Qureshi, Pir Abdul Rasool; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    In this paper we present LanguageNet, a novel framework for processing unstructured text information from human generated content. State-of-the-art information processing frameworks have some shortcomings: they are modeled in generalized form, trained on fixed (limited) data sets, and leaving...... the specialization necessary for information consolidation to the end users. The proposed framework is the first major attempt to address these shortcomings. LanguageNet provides extended support for graphical methods, contributing added value to the capabilities of information processing. We discuss the benefits...... of the framework and compare it with the available state of the art. We also describe how the framework improves the information gathering process and contributes towards building systems with better performance in the domain of Open Source Intelligence....

  18. Information technology, knowledge processes, and innovation success

    NARCIS (Netherlands)

    Song, X.M.; Zang, F.; Bij, van der J.D.; Weggeman, M.C.D.P.

    2001-01-01

    Despite the obvious linkage between information technologies (IT) and knowledge processes and the apparent strategic importance of both, little research has been done to explicitly examine how, if at all, IT and knowledge processes affect firm outcomes. The purpose of this study is to bridge this

  19. Utility-based early modulation of processing distracting stimulus information.

    Science.gov (United States)

    Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas

    2014-12-10

    Humans are selective information processors who efficiently shield their actions from the influence of goal-inappropriate stimulus information. Nonetheless, stimuli which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors") frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases we applied EEG recording in a stimulus identification task, involving successive distractor-target presentation, and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information. Copyright © 2014 the authors.

  20. Methods of Complex Data Processing from Technical Means of Monitoring

    Directory of Open Access Journals (Sweden)

    Serhii Tymchuk

    2017-03-01

    Full Text Available The problem of processing information from different types of monitoring equipment was examined. As a possible solution, the authors propose generalized information processing methods based on clustering of combined territorial information sources for monitoring, together with a frame model of the knowledge base for identifying monitored objects. The clustering methods were formed on the basis of the Lance-Williams hierarchical agglomerative procedure using the Ward metric. The frame model of the knowledge base was built using the tools of object-oriented modeling.
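
    The clustering step named in the abstract (Lance-Williams agglomerative procedure with the Ward criterion) is available off the shelf; a minimal sketch with hypothetical monitoring data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical feature vectors for territorial monitoring sources,
# e.g., three sensor readings per station (all values invented).
rng = np.random.default_rng(0)
readings = np.vstack([rng.normal(loc, 0.5, size=(10, 3))
                      for loc in (0.0, 3.0, 6.0)])

# Agglomerative clustering; SciPy's `linkage` implements the
# Lance-Williams update, here with the Ward criterion.
Z = linkage(readings, method="ward")

# Cut the dendrogram into three clusters of monitoring sources.
labels = fcluster(Z, t=3, criterion="maxclust")
print(labels)
```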

  1. INFORMATION SYSTEM OF AUTOMATION OF PREPARATION EDUCATIONAL PROCESS DOCUMENTS

    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko

    2016-01-01

    Full Text Available Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction in the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of institutional activities. The purpose of this paper is the development of a system that automates the preparation of accounting documents for the educational process. The article describes the problem of preparing educational process documents, and the design and implementation of an information system in the Microsoft Access environment. The result is four types of reports obtained using the developed system, which automates the process and reduces the effort required to prepare accounting documents. All reports were implemented in the Microsoft Excel software product and can be used for further analysis and processing.

  2. Role of Information Anxiety and Information Load on Processing of Prescription Drug Information Leaflets.

    Science.gov (United States)

    Bapat, Shweta S; Patel, Harshali K; Sansgiry, Sujit S

    2017-10-16

    In this study, we evaluate the role of information anxiety and information load on the intention to read information from prescription drug information leaflets (PILs). These PILs were developed based on the principles of information load and consumer information processing. This was an experimental prospective repeated measures study conducted in the United States where 360 (62% response rate) university students (>18 years old) participated. Participants were presented with a scenario followed by exposure to the three drug product information sources used to operationalize information load. The three sources were: (i) current practice; (ii) pre-existing one-page text only; and (iii) interventional one-page prototype PILs designed for the study. Information anxiety was measured as anxiety experienced by the individual when encountering information. The outcome variable of intention to read PILs was defined as the likelihood that the patient will read the information provided in the leaflets. A survey questionnaire was used to capture the data and the objectives were analyzed by performing a repeated measures MANOVA using SAS version 9.3. When compared to current practice and one-page text only leaflets, one-page PILs had significantly lower scores on information anxiety and information load. Information anxiety and information load significantly impacted intention to read (p < 0.001). The newly developed PILs increased patients' intention to read and can help in improving the counseling services provided by pharmacists.

  3. IMPROVING THE QUALITY OF MAINTENANCE PROCESSES USING INFORMATION TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-06-01

    Full Text Available In essence, the process of maintaining equipment is a support process, because it indirectly contributes to the operational ability of the production process necessary for the supply chain of new value. Taking into account increased levels of automation and quality, this process becomes more and more significant and, for some branches of industry, even crucial. Because the quality of the entire process depends increasingly on the maintenance process, these processes must be carefully designed and effectively implemented. Various techniques and approaches are at our disposal: technical, logistical, and the intensive application of information and communication technologies. This last approach is presented in this work. It begins with organizational goals, especially quality objectives. Then, maintenance processes and integrated information system structures are defined. Maintenance process quality and improvement processes are defined using a set of performance measures, with special emphasis placed on effectiveness and quality economics. At the end of the work, an information system for improving maintenance economics is structured. Besides theoretical analysis, the work also presents results the authors obtained analyzing the food industry, the metal processing industry and the building materials industry.

  4. Complexity in electronic negotiation support systems.

    Science.gov (United States)

    Griessmair, Michele; Strunk, Guido; Vetschera, Rudolf; Koeszegi, Sabine T

    2011-10-01

    It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on negotiation processes and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process among synchronous and asynchronous negotiations (IM vs. e-mail) as well as an electronic negotiation support system including a decision support system (DSS). For this purpose, transcripts of 145 negotiations were coded and analyzed with the Shannon entropy and the grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction in complexity increases the probability of reaching an agreement.
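
    The Shannon-entropy measure applied to the coded transcripts can be sketched in a few lines; the category codes below are invented placeholders, not the study's actual coding scheme:

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Shannon entropy (bits per symbol) of a coded transcript."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical utterance codes for two negotiation transcripts:
email_codes = ["offer", "offer", "accept", "offer", "offer", "accept"]
im_codes = ["offer", "joke", "threat", "accept", "question", "offer"]
print(shannon_entropy(email_codes))  # lower entropy: less complex process
print(shannon_entropy(im_codes))     # higher entropy: more complex process
```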

  5. ELECTRIC FACTORS INFLUENCING THE COMPLEX EROSION PROCESSING BY INTRODUCING THE ELECTROLYTE THROUGH THE TRANSFER OBJECT

    Directory of Open Access Journals (Sweden)

    Alin Nioata

    2014-05-01

    Full Text Available Complex electrical and electrochemical erosion processing is influenced by a great number of factors that act in tight interdependence and mutually influence one another, determining both the stability of the process and the final technological characteristics. The quantities that take part in the fundamental phenomena of the complex erosion mechanism and contribute to the definition of the technological characteristics are termed factors. The paper presents the potential difference U and the electric current I as determining factors of the complex erosion process, as well as other factors deriving from them: the current density and the power of the supply source.

  6. The Role of Dysfunctional Myths in a Decision-Making Process under Bounded Rationality: A Complex Dynamical Systems Perspective.

    Science.gov (United States)

    Stamovlasis, Dimitrios; Vaiopoulou, Julie

    2017-07-01

    The present study examines the factors influencing a decision-making process, with specific focus on the role of dysfunctional myths (DM). DM are thoughts or beliefs that are rather irrational, yet influential in people's decisions. In this paper a decision-making process regarding the career choice of university students majoring in natural sciences and education (N=496) is examined by analyzing survey data obtained via the Career Decision Making Difficulties Questionnaire (CDDQ). The difficulty of making the choice and the certainty about one's decision were the state variables, while the independent variables were factors related to the lack of information or knowledge needed, which actually reflect a bounded rationality. Cusp catastrophe analysis, based on both least squares and maximum likelihood procedures, showed that the nonlinear models predicting the two state variables were superior to linear alternatives. Factors related to lack of knowledge about the steps involved in the process of career decision-making, lack of information about the various occupations, lack of information about self, and lack of motivation acted as asymmetry factors, while dysfunctional myths acted as the bifurcation factor for both state variables. The catastrophe model, grounded in empirical data, revealed a unique role for DM and a better interpretation within the context of complexity and the notion of bounded rationality. The analysis opens the nonlinear dynamical systems (NDS) perspective in studying decision-making processes. Theoretical and practical implications are discussed.
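
    To make the cusp terminology concrete: in the canonical cusp model, the equilibria of the state variable x solve x^3 - b*x - a = 0, with a the asymmetry control and b the bifurcation control. A minimal sketch (canonical form only, with invented parameter values, not the paper's fitted model):

```python
import numpy as np

def cusp_equilibria(a, b):
    """Real equilibria of the canonical cusp: dV/dx = x**3 - b*x - a = 0."""
    roots = np.roots([1.0, 0.0, -b, -a])
    return sorted(r.real for r in roots if abs(r.imag) < 1e-9)

# Weak bifurcation factor (weak dysfunctional myths): one equilibrium,
# so the state variable changes smoothly with the asymmetry factors.
print(cusp_equilibria(a=0.5, b=0.5))
# Strong bifurcation factor (strong myths): three equilibria, i.e. two
# stable states separated by an unstable one, so sudden jumps are possible.
print(cusp_equilibria(a=0.5, b=3.0))
```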

  7. Markovian Processes for Quantitative Information Leakage

    DEFF Research Database (Denmark)

    Biondi, Fabrizio

    Quantification of information leakage is a successful approach for evaluating the security of a system. It models the system to be analyzed as a channel with the secret as the input and an output observable by the attacker, and applies information theory to quantify the amount of information transmitted through such a channel, thus effectively quantifying how many bits of the secret can be inferred by the attacker by analyzing the system's output. Channels are usually encoded as matrices of conditional probabilities, known as channel matrices; such matrices grow exponentially... We show how to model deterministic and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some...
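
    The core channel-matrix computation described here is ordinary mutual information. A minimal sketch (not the QUAIL tool itself, and with an invented toy channel) of the leakage of a secret through a probabilistic observable:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def leakage(prior, channel):
    """Mutual-information leakage I(secret; output) in bits.

    prior:   distribution over secret values, shape (n,)
    channel: channel matrix, channel[i, j] = P(output j | secret i)
    """
    joint = prior[:, None] * channel     # P(secret, output)
    h_out = entropy(joint.sum(axis=0))   # H(output)
    h_out_given_secret = float(np.sum(
        prior * np.array([entropy(row) for row in channel])))
    return h_out - h_out_given_secret    # I = H(Y) - H(Y|X)

# Hypothetical 2-bit secret observed through a noisy channel:
prior = np.full(4, 0.25)
channel = np.array([[0.7, 0.1, 0.1, 0.1],
                    [0.1, 0.7, 0.1, 0.1],
                    [0.1, 0.1, 0.7, 0.1],
                    [0.1, 0.1, 0.1, 0.7]])
print(f"leakage = {leakage(prior, channel):.3f} bits")  # less than 2 bits
```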

  8. Harvesting Social Signals to Inform Peace Processes Implementation and Monitoring.

    Science.gov (United States)

    Nigam, Aastha; Dambanemuya, Henry K; Joshi, Madhav; Chawla, Nitesh V

    2017-12-01

    Peace processes are complex, protracted, and contentious, involving significant bargaining and compromising among various societal and political stakeholders. In civil war terminations, it is pertinent to measure the pulse of the nation to ensure that the peace process is responsive to citizens' concerns. Social media yields tremendous power as a tool for dialogue, debate, organization, and mobilization, thereby adding more complexity to the peace process. Using Colombia's final peace agreement and national referendum as a case study, we investigate the influence of two important indicators: intergroup polarization and public sentiment toward the peace process. We present a detailed linguistic analysis to detect intergroup polarization and a predictive model that leverages Tweet structure, content, and user-based features to predict public sentiment toward the Colombian peace process. We demonstrate that had pro-accord stakeholders leveraged public opinion from social media, the outcome of the Colombian referendum could have been different.

  9. Synaptic plasticity, neural circuits, and the emerging role of altered short-term information processing in schizophrenia

    Science.gov (United States)

    Crabtree, Gregg W.; Gogos, Joseph A.

    2014-01-01

    Synaptic plasticity alters the strength of information flow between presynaptic and postsynaptic neurons and thus modifies the likelihood that action potentials in a presynaptic neuron will lead to an action potential in a postsynaptic neuron. As such, synaptic plasticity and pathological changes in synaptic plasticity impact the synaptic computation which controls the information flow through the neural microcircuits responsible for the complex information processing necessary to drive adaptive behaviors. As current theories of neuropsychiatric disease suggest that distinct dysfunctions in neural circuit performance may critically underlie the unique symptoms of these diseases, pathological alterations in synaptic plasticity mechanisms may be fundamental to the disease process. Here we consider mechanisms of both short-term and long-term plasticity of synaptic transmission and their possible roles in information processing by neural microcircuits in both health and disease. As paradigms of neuropsychiatric diseases with strongly implicated risk genes, we discuss the findings in schizophrenia and autism and consider the alterations in synaptic plasticity and network function observed in both human studies and genetic mouse models of these diseases. Together these studies have begun to point toward a likely dominant role of short-term synaptic plasticity alterations in schizophrenia while dysfunction in autism spectrum disorders (ASDs) may be due to a combination of both short-term and long-term synaptic plasticity alterations. PMID:25505409

  10. The complex nature of informal care in home-based heart failure management.

    Science.gov (United States)

    Clark, Alexander M; Reid, Margaret E; Morrison, Caroline E; Capewell, Simon; Murdoch, David L; McMurray, John J

    2008-02-01

    This paper is a report of a study examining the complexities of informal caregiving for people with chronic heart failure. Little is known of the activities involved in and underlying informal care. Heart failure is a common and burdensome condition in which carers play an important management role. Semi-structured interviews were carried out with 30 informal carers nominated by patients with mild-to-moderate heart failure (24 spouses, four children, one sibling and one neighbour). Interviews examined knowledge of heart failure, its effects, reported management practices and concerns, decision making and support. The data were collected in 2001. The management of heart failure was a shared and ongoing responsibility between the carer and patient. Carers' clinical knowledge of the condition and its management was often limited, but they developed extensive knowledge of its personal effects on the patient. Invisible care activities included monitoring signs of symptom exacerbation and energy boundaries against perceived current and future demands and priorities. Visible care activities included medication management, dressing, bathing and help-seeking. Carers responded to patients' capacities, and adopted philosophies that sought to foster independence while facilitating as normal a life for the patient as was possible and safe. Interventions for informal carers around effective chronic heart failure management should address both visible and invisible informal caring. Future research is needed to develop interventions with carers to improve quality of care, reduce costs and improve patient quality of life. More research is needed to explore the complexities of lay caregiving and to explore the invisible dimensions of informal care further.

  11. An experimental-differential investigation of cognitive complexity

    Directory of Open Access Journals (Sweden)

    2009-12-01

    Full Text Available Cognitive complexity as defined by the differential and experimental traditions was explored to investigate the theoretical advantage and utility of relational complexity (RC) theory as a common framework for studying fluid cognitive functions. RC theory provides a domain-general account of processing demand as a function of task complexity. In total, 142 participants completed two tasks in which RC was manipulated, and two tasks entailing manipulations of complexity derived from the differential psychology literature. A series of analyses indicated that, as expected, task manipulations influenced item difficulty. However, comparable changes in a psychometric index of complexity were not consistently observed. Active maintenance of information across multiple steps of the problem-solving process, which entails a strategic coordination of storage and processing that cannot be modelled under the RC framework, was found to be an important component of cognitive complexity.

  12. Evaluation of EMG processing techniques using Information Theory.

    Science.gov (United States)

    Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J

    2010-11-12

    Electromyographic signals can be used in the biomedical engineering and rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were recorded for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, they can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
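
    The four amplitude estimators compared in the study are one-liners; a sketch over sliding windows (synthetic signal, with the 200 ms window following the static-contraction segmentation reported above and an assumed 1 kHz sampling rate):

```python
import numpy as np

def emg_features(window):
    """AMV, RMS, VAR and DAMV of one EMG window."""
    amv = np.mean(np.abs(window))            # absolute mean value
    rms = np.sqrt(np.mean(window ** 2))      # root mean square
    var = np.var(window)                     # variance
    damv = np.mean(np.abs(np.diff(window)))  # difference absolute mean value
    return amv, rms, var, damv

# Synthetic stand-in for an EMG trace (values invented).
fs, win_ms = 1000, 200
signal = np.random.default_rng(0).normal(0.0, 0.1, size=2 * fs)
step = fs * win_ms // 1000
for start in range(0, len(signal) - step + 1, step):
    print(emg_features(signal[start:start + step]))
```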

  13. 49 CFR 564.5 - Information filing; agency processing of filings.

    Science.gov (United States)

    2010-10-01

    Title 49 (Transportation), CFR edition of 2010-10-01: National Highway Traffic Safety Administration, Department of Transportation, Replaceable Light Source Information (effective until 12-01-12). § 564.5 Information filing; agency processing of filings. (a) Each manufacturer...

  14. Affect and Persuasion: Effects on Motivation for Information Processing.

    Science.gov (United States)

    Leach, Mark M; Stoltenberg, Cal D.

    The relationship between mood and information processing, particularly when reviewing the Elaboration Likelihood Model of persuasion, lacks conclusive evidence. This study was designed to investigate the hypothesis that information processing would be greater for mood-topic congruence than for non-mood-topic congruence. Undergraduate students (N=216)…

  15. Increase in Complexity and Information through Molecular Evolution

    Directory of Open Access Journals (Sweden)

    Peter Schuster

    2016-11-01

    Full Text Available Biological evolution progresses by essentially three different mechanisms: (I) optimization of properties through natural selection in a population of competitors; (II) development of new capabilities through cooperation of competitors caused by catalyzed reproduction; and (III) variation of genetic information through mutation or recombination. Simplified evolutionary processes combine two out of the three mechanisms: Darwinian evolution combines competition (I) and variation (III) and is represented by the quasispecies model, major transitions involve cooperation (II) of competitors (I), and the third combination, cooperation (II) and variation (III), provides new insights into the role of mutations in evolution. A minimal kinetic model based on simple molecular mechanisms for reproduction, catalyzed reproduction and mutation is introduced, cast into ordinary differential equations (ODEs), and analyzed mathematically in form of its implementation in a flow reactor. Stochastic aspects are investigated through computer simulation of trajectories of the corresponding chemical master equations. The competition-cooperation model, mechanisms (I) and (II), gives rise to selection at low levels of resources and leads to symbiontic cooperation in case the material required is abundant. Accordingly, it provides a kind of minimal system that can undergo a (major) transition. Stochastic effects leading to extinction of the population through self-enhancing oscillations destabilize symbioses of four or more partners. Mutations (III) are not only the basis of change in phenotypic properties but can also prevent extinction provided the mutation rates are sufficiently large. Threshold phenomena are observed for all three combinations: the quasispecies model leads to an error threshold, the competition-cooperation model allows for an identification of a resource-triggered bifurcation with the transition, and for the cooperation-mutation model a kind of stochastic threshold for...
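
    A minimal sketch of the kind of kinetic model described, cast as ODEs in a flow reactor: a resource a feeds two replicators that reproduce both directly (competition, I) and catalytically (cooperation, II). The rate constants are invented, and this is a simplified stand-in under stated assumptions, not the paper's exact equations:

```python
import numpy as np
from scipy.integrate import solve_ivp

f = np.array([1.0, 1.1])        # uncatalyzed replication rates (assumed)
k = np.array([[0.0, 0.4],
              [0.4, 0.0]])      # catalyzed (cooperative) replication rates
r, a0 = 0.5, 2.0                # flow rate and inflow resource level

def rhs(t, y):
    a, x = y[0], y[1:]
    growth = x * a * (f + k @ x)      # reproduction consumes the resource
    da = r * (a0 - a) - growth.sum()  # inflow/outflow of the resource
    dx = growth - r * x               # dilution by outflow
    return np.concatenate(([da], dx))

sol = solve_ivp(rhs, (0.0, 50.0), [a0, 0.01, 0.01])
print(sol.y[:, -1])  # long-time resource level and replicator populations
```

    Varying the inflow level a0 in such a model is the natural probe for the resource-triggered transition between selection and symbiotic cooperation mentioned in the abstract.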

  16. Social dimension and complexity differentially influence brain responses during feedback processing.

    Science.gov (United States)

    Pfabigan, Daniela M; Gittenberger, Marianne; Lamm, Claus

    2017-10-30

    Recent research emphasizes the importance of social factors during performance monitoring. Thus, the current study investigated the impact of social stimuli, such as communicative gestures, on feedback processing. Moreover, it addressed a shortcoming of previous studies, which failed to consider stimulus complexity as a potential confounding factor. Twenty-four volunteers performed a time estimation task while their electroencephalogram was recorded. Either social complex, social non-complex, non-social complex, or non-social non-complex stimuli were used to provide performance feedback. No effects of social dimension or complexity were found for task performance. In contrast, Feedback-Related Negativity (FRN) and P300 amplitudes were sensitive to both factors, with larger FRN and P300 amplitudes after social compared to non-social stimuli, and larger FRN amplitudes after complex positive than non-complex positive stimuli. P2 amplitudes were solely sensitive to feedback valence and social dimension. Subjectively, social complex stimuli were rated as more motivating than non-social complex ones. Independently of each other, social dimension and visual complexity influenced amplitude variation during performance monitoring. Social stimuli seem to be perceived as more salient, which is corroborated by the P2, FRN and P300 results, as well as by the subjective ratings. This could be explained by their relevance during everyday social interactions.

  17. [Providing information to patient's families on the end of life process in the intensive care unit. Nursing evaluation].

    Science.gov (United States)

    Pascual-Fernández, M Cristina

    2014-01-01

    Informing is a process that includes many aspects, and when it involves a family member at the end of life it becomes a complicated matter, not only because of the giving of the information itself, but also because of the mood of the family members. The information should thus be adapted to the language and education of the patient and family, and it must be proper and suited to the moment. To describe the aspects of information offered to relatives of patients in the end of life process in Intensive Care Units (ICU), to determine the nursing evaluation of this process, and to evaluate the professionals' attitude on this subject. An observational study was conducted on nurses in the pediatric and adult ICUs of large public hospital complexes in the city of Madrid. The data were collected using a questionnaire on the evaluation of care of children who died in the pediatric ICU. The majority of the nurses, 71% (159), said that the information was given in a private place, alone with the doctor. More than half (52.4%, 118) considered that the information was sufficient or insufficient depending on the day. Significant differences were found as regards the behavior of the staff at the time of a death (P<.01), with pediatric ICU professionals being more empathetic. ICU nurses believe that the information is appropriate for the prognosis and adapted to the patient situation. They also consider the place where the information is given and the attitude of the professionals in the end of life process to be adequate. Copyright © 2013 Elsevier España, S.L. All rights reserved.

  18. Monkeys preferentially process body information while viewing affective displays.

    Science.gov (United States)

    Bliss-Moreau, Eliza; Moadab, Gilda; Machado, Christopher J

    2017-08-01

    Despite evolutionary claims about the function of facial behaviors across phylogeny, those hypotheses are rarely tested in a comparative context, that is, by evaluating how nonhuman animals process such behaviors. Further, while increasing evidence indicates that humans make meaning of faces by integrating contextual information, including that from the body, the extent to which nonhuman animals process contextual information during affective displays is unknown. In the present study, we evaluated the extent to which rhesus macaques (Macaca mulatta) process dynamic affective displays of conspecifics that included both facial and body behaviors. Contrary to hypotheses that they would preferentially attend to faces during affective displays, monkeys looked longest, most frequently, and first at conspecifics' bodies rather than their heads. These findings indicate that macaques, like humans, attend to available contextual information during the processing of affective displays, and that the body may also be providing unique information about affective states. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. [Postdonation information: the French fourth hemovigilance sub-process].

    Science.gov (United States)

    Py, J-Y; Sandid, I; Jbilou, S; Dupuis, M; Adda, R; Narbey, D; Djoudi, R

    2014-11-01

    Postdonation information is knowledge about the donor or his donation, arising after the donation, which challenges the quality or safety of the blood products stemming from this or other donations. The classical hemovigilance sub-processes concerning donor or recipient adverse events do not cover this topic. France is about to make it official as a fourth sub-process. A less formal management of postdonation information has already been in place for more than ten years. French data for the year 2013 are presented, including the regional notification level and the national reporting level. A significant level of heterogeneity is observed, as for other hemovigilance sub-processes. It is mainly due to subjective rather than objective differences in risk appreciation. A real consensual work is expected on this in the future. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  20. Real-time information and processing system for radiation protection

    International Nuclear Information System (INIS)

    Oprea, I.; Oprea, M.; Stoica, M.; Badea, E.; Guta, V.

    1999-01-01

    The real-time information and processing system has as its main task to record, collect, process and transmit radiation level and weather data; it is proposed for radiation protection, environmental monitoring around nuclear facilities, and civil defence. Such a system can offer information for mapping, data bases, modelling and communication, and can help assess the consequences of nuclear accidents. The system incorporates a number of stationary or mobile radiation monitoring units, a weather parameter measuring station, a GIS-based information processing center and the communication network, all running on a real-time operating system. It provides automatic on-line and off-line data collection, remote diagnostics, and advanced presentation techniques, including a graphically oriented executive support, which has the ability to respond to an emergency by geographical representation of the hazard zones on the map. The system can be integrated into national or international environmental monitoring systems, being based on local intelligent measuring and transmission units, simultaneous processing and data presentation using a real-time operating system for PC, and a geographical information system (GIS). Such an integrated system is composed of independent applications operating on the same computer, which can improve the protection of the population and support decision makers' efforts, updating the remote GIS data base. All information can be managed directly from the map through multilevel data retrieval and presentation, using the on-line dynamic evolution of events, environment information, evacuation optimization, and image and voice processing.

  1. PREFACE: Complex Networks: from Biology to Information Technology

    Science.gov (United States)

    Barrat, A.; Boccaletti, S.; Caldarelli, G.; Chessa, A.; Latora, V.; Motter, A. E.

    2008-06-01

    for counting large directed loops. This work proposes a belief-propagation algorithm for counting long loops in directed networks, which is then applied to networks of different sizes and loop structure. In The anatomy of a large query graph, Baeza-Yates and Tiberi show that scale invariance is present also in the structure of a graph derived from query logs. This graph is determined not only by the queries but also by the subsequent actions of the users. The graph analysed in this study is generated by more than twenty million queries and is less sparse than suggested by previous studies. A different class of networks is considered by Travençolo and da F Costa in Hierarchical spatial organisation of geographical networks. This work proposes a hierarchical extension of the polygonality index as a means to characterise geographical planar networks and, in particular, to obtain more complete information about the spatial order of the network at progressive spatial scales. The paper Border trees of complex networks by Villas Boas et al focuses instead on the statistical properties of the boundary of graphs, constituted by the vertices of degree one (the leaves of border trees). The authors study the local properties, the depth, and the number of leaves of these border trees, finding that in some real networks more than half of the nodes belong to the border trees. The last contribution to the first section is The generation of random directed networks with prescribed 1-node and 2-node degree correlations by Zamora-López et al. This study deals with the generation of random directed networks and shows that often a large number of links cannot be 'randomised' without altering the degree correlations. This permits fast generation of ensembles of maximally random networks. In the section Methods: The Dynamics, significant attention is given to the study of synchronisation processes on networks: Díaz-Guilera's contribution Dynamics towards synchronisation in hierarchical

  2. Some considerations on Bible translation as complex process | Van ...

    African Journals Online (AJOL)

    It is argued that translation is a complex process: meaning is "created" by decoding the source text on several levels (for instance, grammatical; structural; literary; and socio-cultural levels). This "meaning" must then be encoded into the target language by means of the linguistic, literary, and cultural conventions of the target ...

  3. An experimental approach to estimation of human information processing capacity for diagnosis tasks in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Tae

    2006-02-15

    The objectives of this research are 1) to determine the human's information processing capacity and 2) to describe the relationship between the information processing capacity and human factors. This research centers on the relationship, as experimentally determined, between an operator's mental workload and information flow during accident diagnosis tasks at nuclear power plants (NPPs). The relationship between the information flow rate and the operator's mental workload is investigated experimentally. From this relationship, the operator's information processing capacity can be established. Once the information processing capacity of a main control room (MCR) operator in a NPP is known, it is possible to apply it 1) to predict the operator's performance, 2) to design diagnosis tasks, and 3) to design the human-machine interface. In an advanced MCR, an operator's mental activity is more important than his or her physical activity. The mental workload is the portion of the operator's limited capacity that is actually required to perform a particular task. A high mental workload may cause an operator to make a mistake and consequently affect the safe operation of NPPs. Thus, predicting an operator's performance is very important for nuclear safety. The information processing capacity is the amount of information, in bits per second, that an operator can manage when diagnosing tasks or accidents. We can estimate the information processing capacity using the relationship between the information flow rate and human performance. That is, if the operator's performance decreases rapidly as the information flow rate (bit/sec) is increased, it is possible to determine the operator's information processing capacity. A diagnosis task is one of the most complex and mentally demanding tasks as well as a crucial part of maintaining the safe operation of NPPs. Diagnosis tasks refer to the overall tasks of finding the...

  4. An experimental approach to estimation of human information processing capacity for diagnosis tasks in NPPs

    International Nuclear Information System (INIS)

    Kim, Ji Tae

    2006-02-01

    The objectives of this research are 1) to determine the human's information processing capacity and 2) to describe the relationship between the information processing capacity and human factors. This research centers on the relationship, as experimentally determined, between an operator's mental workload and information flow during accident diagnosis tasks at nuclear power plants (NPPs). The relationship between the information flow rate and the operator's mental workload is investigated experimentally. From this relationship, the operator's information processing capacity can be established. Once the information processing capacity of a main control room (MCR) operator in a NPP is known, it is possible to apply it 1) to predict the operator's performance, 2) to design diagnosis tasks, and 3) to design the human-machine interface. In an advanced MCR, an operator's mental activity is more important than his or her physical activity. The mental workload is the portion of the operator's limited capacity that is actually required to perform a particular task. A high mental workload may cause an operator to make a mistake and consequently affect the safe operation of NPPs. Thus, predicting an operator's performance is very important for nuclear safety. The information processing capacity is the amount of information, in bits per second, that an operator can manage when diagnosing tasks or accidents. We can estimate the information processing capacity using the relationship between the information flow rate and human performance. That is, if the operator's performance decreases rapidly as the information flow rate (bit/sec) is increased, it is possible to determine the operator's information processing capacity. A diagnosis task is one of the most complex and mentally demanding tasks as well as a crucial part of maintaining the safe operation of NPPs. Diagnosis tasks refer to the overall tasks of finding the root cause of the faults or accidents. In this

  5. Effects of trial complexity on decision making.

    Science.gov (United States)

    Horowitz, I A; ForsterLee, L; Brolly, I

    1996-12-01

    The ability of a civil jury to render fair and rational decisions in complex trials has been questioned. However, the nature, dimensions, and effects of trial complexity on decision making have rarely been addressed. In this research, jury-eligible adults saw a videotape of a complex civil trial that varied in information load and complexity of the language of the witnesses. Information load and complexity differentially affected liability and compensatory decisions. An increase in the number of plaintiffs decreased blameworthiness assigned to the defendant despite contrary evidence and amount of probative evidence processed. Complex language did not affect memory but did affect jurors' ability to appropriately compensate differentially worthy plaintiffs. Jurors assigned compensatory awards commensurate with the plaintiffs' injuries only under low-load and less complex language conditions.

  6. Iterative Neighbour-Information Gathering for Ranking Nodes in Complex Networks

    Science.gov (United States)

    Xu, Shuang; Wang, Pei; Lü, Jinhu

    2017-01-01

    Designing node influence ranking algorithms can provide insights into network dynamics, functions and structures. Increasing evidence reveals that a node's spreading ability largely depends on its neighbours. We introduce an iterative neighbour-information gathering (Ing) process with three parameters: a transformation matrix, a priori information and an iteration time. The Ing process iteratively combines priori information from neighbours via the transformation matrix, and iteratively assigns an Ing score to each node to evaluate its influence. The algorithm is applicable to any type of network, and includes some traditional centralities as special cases, such as degree, semi-local, and LeaderRank. The Ing process converges in strongly connected networks, with speed relying on the first two largest eigenvalues of the transformation matrix. Interestingly, the eigenvector centrality corresponds to a limit case of the algorithm. By comparing with eight renowned centralities, simulations of the susceptible-infected-removed (SIR) model on real-world networks reveal that the Ing can offer more exact rankings, even without a priori information. We also observe that an optimal iteration time always exists that best characterizes node influence. The proposed algorithms bridge the gaps among some existing measures, and may have potential applications in infectious disease control and the design of optimal information spreading strategies.
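
    A minimal sketch of an Ing-style iteration with the three ingredients named above: a transformation of the adjacency matrix, a priori information, and an iteration time. The specific transformation and normalization used here are assumptions for illustration, not the paper's definition:

```python
import numpy as np

def ing_scores(adjacency, prior, steps):
    """Iteratively gather neighbour information to score node influence."""
    s = prior.astype(float)
    for _ in range(steps):
        s = adjacency.T @ s + prior   # combine neighbours' current scores
        s = s / np.linalg.norm(s)     # keep the scores bounded
    return s

# Toy directed network with a degree-based prior (hypothetical choices):
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]])
prior = A.sum(axis=1).astype(float)
print(ing_scores(A, prior, steps=10))  # higher score = more influential
```

    With the transposed adjacency matrix in the update, a node's score grows with the scores of the nodes pointing to it, in the spirit of the eigenvector-style centralities the abstract relates to the limit case.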

  7. A Study on Improving Information Processing Abilities Based on PBL

    Science.gov (United States)

    Kim, Du Gyu; Lee, JaeMu

    2014-01-01

    This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…

  8. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    Science.gov (United States)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.

  9. Information Propagation in Complex Networks : Structures and Dynamics

    NARCIS (Netherlands)

    Märtens, M.

    2018-01-01

    This thesis is a contribution to a deeper understanding of how information propagates and what this process entails. At its very core is the concept of the network: a collection of nodes and links, which describes the structure of the systems under investigation. The network is a mathematical model

  10. Information processing. [in human performance

    Science.gov (United States)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).

  11. Towards a Tool for Assessing Supply-Chain Information Performance During Implementation of New Information Technologies

    NARCIS (Netherlands)

    Denolf, J.M.; Wognum, P.M.; Trienekens, J.H.; Vorst, van der J.G.A.J.; Omta, S.W.F.

    2012-01-01

    Based on improved information performance, agro-food companies and supply chains want to enhance their production processes. This creates the necessity to implement additional information technologies. The implementation of information technologies is, however, a complex task because of the

  12. Integrating Human Factors Engineering and Information Processing Approaches to Facilitate Evaluations in Criminal Justice Technology Research.

    Science.gov (United States)

    Salvemini, Anthony V; Piza, Eric L; Carter, Jeremy G; Grommon, Eric L; Merritt, Nancy

    2015-06-01

    Evaluations are routinely conducted by government agencies and research organizations to assess the effectiveness of technology in criminal justice. Interdisciplinary research methods are salient to this effort. Technology evaluations are faced with a number of challenges including (1) the need to facilitate effective communication between social science researchers, technology specialists, and practitioners, (2) the need to better understand procedural and contextual aspects of a given technology, and (3) the need to generate findings that can be readily used for decision making and policy recommendations. Process and outcome evaluations of technology can be enhanced by integrating concepts from human factors engineering and information processing. This systemic approach, which focuses on the interaction between humans, technology, and information, enables researchers to better assess how a given technology is used in practice. Examples are drawn from complex technologies currently deployed within the criminal justice system where traditional evaluations have primarily focused on outcome metrics. Although this evidence-based approach has significant value, it may fail to fully account for the human and structural complexities that compose technology operations. Guiding principles for technology evaluations are described for identifying and defining key study metrics, facilitating communication within an interdisciplinary research team, and for understanding the interaction between users, technology, and information. The approach posited here can also enable researchers to better assess factors that may facilitate or degrade the operational impact of the technology and answer fundamental questions concerning whether the technology works as intended, at what level, and at what cost. © The Author(s) 2015.

  13. Increasing complexity with quantum physics.

    Science.gov (United States)

    Anders, Janet; Wiesner, Karoline

    2011-09-01

    We argue that complex systems science and the rules of quantum physics are intricately related. We discuss a range of quantum phenomena, such as cryptography, computation and quantum phases, and the rules responsible for their complexity. We identify correlations as a central concept connecting quantum information and complex systems science. We present two examples of the power of correlations: using quantum resources to simulate the correlations of a stochastic process and to implement a classically impossible computational task.

  14. AGING, CAFFEINE, AND INFORMATION-PROCESSING - AN EVENT-RELATED POTENTIAL ANALYSIS

    NARCIS (Netherlands)

    LORIST, MM; SNEL, J; MULDER, G; KOK, A

    Structural and energetic processes in information processing were studied in young and elderly subjects. A visually focussed selective search task was used, in which subjects had to select relevant information, followed by controlled memory search processes to locate a target item. Caffeine was used

  15. ASPIE: A Framework for Active Sensing and Processing of Complex Events in the Internet of Manufacturing Things

    Directory of Open Access Journals (Sweden)

    Shaobo Li

    2018-03-01

    Full Text Available Rapid perception and processing of critical monitoring events are essential to ensure healthy operation of Internet of Manufacturing Things (IoMT)-based manufacturing processes. In this paper, we propose a framework, the active sensing and processing architecture (ASPIE), for active sensing and processing of critical events in IoMT-based manufacturing, based on the characteristics of the IoMT architecture as well as its perception model. A relation model of complex events in manufacturing processes, together with related operators and unified XML-based semantic definitions, is developed to effectively process the complex event big data. A template-based processing method for complex events is further introduced to conduct complex event matching using the Apriori frequent item mining algorithm. To evaluate the proposed models and methods, we developed a software platform based on ASPIE for a local chili sauce manufacturing company, which demonstrated the feasibility and effectiveness of the proposed methods for active perception and processing of complex events in IoMT-based manufacturing.
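
    For the matching step, the Apriori algorithm mines event combinations that co-occur frequently in the monitoring stream. A minimal, self-contained sketch (invented event labels, not the platform's implementation):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return every event itemset whose support >= min_support."""
    def support(itemset):
        return sum(itemset <= t for t in transactions) / len(transactions)

    items = sorted({i for t in transactions for i in t})
    frequent = {}
    current = [frozenset([i]) for i in items]
    while current:
        current = [c for c in current if support(c) >= min_support]
        frequent.update({c: support(c) for c in current})
        # Join k-itemsets into (k+1)-item candidates for the next pass.
        current = list({a | b for a, b in combinations(current, 2)
                        if len(a | b) == len(a) + 1})
    return frequent

# Hypothetical complex-event transactions from a production line:
events = [{"temp_high", "pressure_drop"},
          {"temp_high", "valve_open"},
          {"temp_high", "pressure_drop", "valve_open"},
          {"pressure_drop"}]
print(apriori(events, min_support=0.5))
```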

  16. A Reaction Database for Small Molecule Pharmaceutical Processes Integrated with Process Information

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Anantpinijwatna, Amata; Woodley, John

    2017-01-01

    This article describes the development of a reaction database with the objective to collect data for multiphase reactions involved in small molecule pharmaceutical processes, with a search engine to retrieve necessary data in investigations of reaction-separation schemes, such as the role of organic solvents in reaction performance improvement. The database is structured in terms of reaction classification: reaction types; compounds participating in the reaction; use of organic solvents and their function; information for single step and multistep reactions; target products; reaction conditions and reaction data. Information for reactor scale-up, together with information for the separation and other relevant information for each reaction and references, is also available in the database.

  17. Perceived price complexity of dynamic energy tariffs: An investigation of antecedents and consequences

    International Nuclear Information System (INIS)

    Layer, Patrick; Feurer, Sven; Jochem, Patrick

    2017-01-01

    Dynamic tariffs have the potential to contribute to a successful shift from conventional to renewable energies, but tapping this potential in Europe ultimately depends on residential consumers selecting them. This study proposes and finds that consumer reactions to dynamic tariffs depend on the level of perceived price complexity that represents the cognitive effort consumers must engage in to compute the overall bill amount. An online experiment conducted with a representative sample of 664 German residential energy consumers examines how salient characteristics of dynamic tariffs contribute to perceived price complexity. Subsequently, a structural equation model (SEM) reveals that the depth of information processing is central to understand how price complexity relates to consumers’ behavioral intentions. The results suggest that it will be challenging to convince European consumers to select complex dynamic tariffs under the current legal framework. Policymakers will need to find ways to make these tariffs more attractive. - Highlights: • Little is known about the processes by which consumers evaluate dynamic tariffs. • In this evaluation process perceived price complexity plays a central role. • Tariff type, price endings, and discount presentation format drive price complexity. • Perceived price complexity decreases the depth of information processing. • A decreased depth of information processing ultimately leads to lower behavioral intentions.

  18. A flexible object-based software framework for modeling complex systems with interacting natural and societal processes.

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-15

    The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations. The DIAS infrastructure makes it feasible to build and manipulate complex simulation scenarios in which many thousands of objects can interact via dozens to hundreds of concurrent dynamic processes. The flexibility and extensibility of the DIAS software infrastructure stem mainly from (1) the abstraction of object behaviors, (2) the encapsulation and formalization of model functionality, and (3) the mutability of domain object contents. DIAS simulation objects are inherently capable of highly flexible and heterogeneous spatial realizations. Geospatial graphical representation of DIAS simulation objects is addressed via the GeoViewer, an object-based GIS toolkit application developed at ANL. DIAS simulation capabilities have been extended by inclusion of societal process models generated by the Framework for Addressing Cooperative Extended Transactions (FACET), another object-based framework developed at Argonne National Laboratory. By using FACET models to implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations, it has become possible to conveniently address a broad range of issues involving interaction and feedback among natural and societal processes. Example DIAS application areas discussed in this paper include a dynamic virtual oceanic environment, detailed simulation of clinical, physiological, and logistical aspects of health care delivery, and studies of agricultural sustainability of urban centers under environmental stress in ancient Mesopotamia.

  19. Quantum information processing with atoms and photons

    International Nuclear Information System (INIS)

    Monroe, C.

    2003-01-01

    Quantum information processors exploit the quantum features of superposition and entanglement for applications not possible in classical devices, offering the potential for significant improvements in the communication and processing of information. Experimental realization of large-scale quantum information processors remains a long term vision, as the required nearly pure quantum behaviour is observed only in exotic hardware such as individual laser-cooled atoms and isolated photons. But recent theoretical and experimental advances suggest that cold atoms and individual photons may lead the way towards bigger and better quantum information processors, effectively building mesoscopic versions of 'Schroedinger's cat' from the bottom up. (author)

  20. Physics Colloquium: The optical route to quantum information processing

    CERN Multimedia

    Université de Genève

    2011-01-01

    Geneva University Physics Department, 24 Quai Ernest Ansermet, CH-1211 Geneva 4. Monday 11 April 2011, 17h00, Ecole de Physique, Auditoire Stückelberg. The optical route to quantum information processing. Prof. Terry Rudolph, Imperial College, London. Photons are attractive as carriers of quantum information both because they travel, and can thus transmit information, and because of their good coherence properties and ease in undergoing single-qubit manipulations. The main obstacle to their use in information processing is inducing an effective interaction between them in order to produce entanglement. The most promising approach in photon-based information processing architectures is so-called measurement-based quantum computing. This relies on creating upfront a multi-qubit highly entangled state (the cluster state) which has the remarkable property that, once prepared, it can be used to perform quantum computation by making only single-qubit measurements. In this talk I will discuss generically the...

  1. Virtual HRD and National Culture: An Information Processing Perspective

    Science.gov (United States)

    Chung, Chih-Hung; Angnakoon, Putthachat; Li, Jessica; Allen, Jeff

    2016-01-01

    Purpose: The purpose of this study is to provide researchers with a better understanding of the cultural impact on information processing in virtual learning environment. Design/methodology/approach: This study uses a causal loop diagram to depict the cultural impact on information processing in the virtual human resource development (VHRD)…

  2. Strategic-decision quality in public organizations : an information processing perspective

    OpenAIRE

    George, Bert; Desmidt, Sebastian

    2018-01-01

    This study draws on information processing theory to investigate predictors of strategic-decision quality in public organizations. Information processing theory argues that (a) rational planning practices contribute to strategic-decision quality by injecting information into decision making and (b) decision makers contribute to strategic-decision quality by exchanging information during decision making. These assumptions are tested upon 55 Flemish pupil guidance centers. Rational ...

  3. A Reaction Database for Small Molecule Pharmaceutical Processes Integrated with Process Information

    Directory of Open Access Journals (Sweden)

    Emmanouil Papadakis

    2017-10-01

    Full Text Available This article describes the development of a reaction database with the objective of collecting data for multiphase reactions involved in small molecule pharmaceutical processes, together with a search engine to retrieve the data needed in investigations of reaction-separation schemes, such as the role of organic solvents in improving reaction performance. The focus of this reaction database is to provide a data-rich environment with process information available to assist the early-stage synthesis of pharmaceutical products. The database is structured in terms of: classification of reaction types; compounds participating in the reaction; use of organic solvents and their function; information for single-step and multistep reactions; target products; reaction conditions; and reaction data. Information on reactor scale-up, together with information on the separation, other relevant details, and the reference for each reaction, is also available in the database. Additionally, the information retrieved from the database can be evaluated in terms of sustainability using well-known "green" metrics published in the scientific literature. The application of the database is illustrated through the synthesis of ibuprofen, for which data on different reaction pathways have been retrieved from the database and compared using "green" chemistry metrics.
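
    A rough sketch of the kind of structured record and query the abstract describes (the schema fields and the sample entry below are illustrative assumptions, not the database's actual schema):

        # Hypothetical record structure and search for a reaction database.
        from dataclasses import dataclass

        @dataclass
        class ReactionRecord:
            reaction_type: str      # classification, e.g. Friedel-Crafts acylation
            compounds: list         # participating species
            solvents: dict          # solvent -> function in the reaction
            conditions: dict        # e.g. {"T_K": 333.0, "time_h": 2.0}
            target_product: str
            reference: str

        def find_reactions(db, reaction_type=None, product=None, solvent=None):
            """Search-engine stand-in: filter records by optional criteria."""
            hits = db
            if reaction_type:
                hits = [r for r in hits if r.reaction_type == reaction_type]
            if product:
                hits = [r for r in hits if r.target_product == product]
            if solvent:
                hits = [r for r in hits if solvent in r.solvents]
            return hits

        db = [ReactionRecord("Friedel-Crafts acylation",
                             ["isobutylbenzene", "acetic anhydride"],
                             {"HF": "catalyst/solvent"},          # assumed entry
                             {"T_K": 333.0, "time_h": 2.0},       # assumed conditions
                             "4-isobutylacetophenone",
                             "hypothetical entry")]
        print(find_reactions(db, product="4-isobutylacetophenone"))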

  4. Disjunctive Information Flow for Communicating Processes

    DEFF Research Database (Denmark)

    Li, Ximeng; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    The security validation of practical computer systems calls for the ability to specify and verify information flow policies that are dependent on data content. Such policies play an important role in concurrent, communicating systems: consider a scenario where messages are sent to different...... processes according to their tagging. We devise a security type system that enforces content-dependent information flow policies in the presence of communication and concurrency. The type system soundly guarantees a compositional noninterference property. All theoretical results have been formally proved...
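
    A runtime analogue of such a content-dependent policy, with tags, levels, and process names invented for illustration (the paper itself enforces this statically through a type system):

        # Message security level depends on its tag; a send is allowed only if
        # the receiving process is cleared for that level. Illustrative only.
        LEVELS = {"public": 0, "secret": 1}
        POLICY = {"billing": "secret", "newsletter": "public"}   # tag -> level
        CLEARANCE = {"archiver": "secret", "mailer": "public"}   # process -> clearance

        def may_send(tag, receiver):
            return LEVELS[POLICY[tag]] <= LEVELS[CLEARANCE[receiver]]

        assert may_send("billing", "archiver")      # secret -> secret: allowed
        assert not may_send("billing", "mailer")    # secret -> public: flow violation
        assert may_send("newsletter", "mailer")     # public -> public: allowed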

  5. Toward a Model of Human Information Processing for Decision-Making and Skill Acquisition in Laparoscopic Colorectal Surgery.

    Science.gov (United States)

    White, Eoin J; McMahon, Muireann; Walsh, Michael T; Coffey, J Calvin; O Sullivan, Leonard

    To create a human information-processing model for laparoscopic surgery based on already established literature and primary research to enhance laparoscopic surgical education in this context. We reviewed the literature for information-processing models most relevant to laparoscopic surgery. Our review highlighted the necessity for a model that accounts for dynamic environments, perception, allocation of attention resources between the actions of both hands of an operator, and skill acquisition and retention. The results of the literature review were augmented through intraoperative observations of 7 colorectal surgical procedures, supported by laparoscopic video analysis of 12 colorectal procedures. The Wickens human information-processing model was selected as the most relevant theoretical model, to which we make adaptations for this specific application. We expanded the perception subsystem of the model to involve all aspects of perception during laparoscopic surgery. We extended the decision-making system to include dynamic decision-making to account for case/patient-specific and surgeon-specific deviations. The response subsystem now includes dual-task performance and nontechnical skills, such as intraoperative communication. The memory subsystem is expanded to include skill acquisition and retention. Surgical decision-making during laparoscopic surgery is the result of a highly complex series of processes influenced not only by the operator's knowledge, but also by patient anatomy and interaction with the surgical team. Newer developments in simulation-based education must focus on the theoretically supported elements and events that underpin skill acquisition and affect the cognitive abilities of novice surgeons. The proposed human information-processing model builds on established literature regarding information processing, accounting for the dynamic environment of laparoscopic surgery. This revised model may be used as a foundation for a model describing robotic

  6. Springfield Processing Plant (SPP) Facility Information

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice; Torres, Teresa M.

    2012-10-01

    The Springfield Processing Plant is a hypothetical facility. It has been constructed for use in training workshops. Information is provided about the facility and its surroundings, particularly security-related aspects such as target identification, threat data, entry control, and response force data.

  7. Usage of information safety requirements in improving tube bending process

    Science.gov (United States)

    Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.

    2018-05-01

    This article is devoted to improving the analysis of technological processes through the implementation of information security requirements. The aim of this research is to analyse how implementing information technology can increase the competitiveness of aircraft industry enterprises, using the tube bending process as an example. The article analyzes types of tube bending and current techniques. In addition, an analysis of potential risks in the tube bending process is carried out in terms of information security.

  8. Evaluation of EMG processing techniques using Information Theory

    Directory of Open Access Journals (Sweden)

    Felice Carmelo J

    2010-11-01

    Full Text Available Abstract Background Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. Methods These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movement of the arm in the scapular plane were registered, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements, and inter-electrode distance were also analyzed. Results Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively), the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Conclusions Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
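
    The four techniques are straightforward to compute over signal windows. A minimal sketch (NumPy; the surrogate signal and sampling rate are assumptions, and the 200 ms window follows the optimal static-contraction segmentation reported above):

        import numpy as np

        def emg_features(x, win):
            """AMV, RMS, VAR and DAMV over non-overlapping windows of length win."""
            n = len(x) // win
            seg = x[: n * win].reshape(n, win)
            return {
                "AMV": np.mean(np.abs(seg), axis=1),        # absolute mean value
                "RMS": np.sqrt(np.mean(seg ** 2, axis=1)),  # root mean square
                "VAR": np.var(seg, axis=1, ddof=1),         # variance
                # difference absolute mean value: mean |x[k+1] - x[k]| per window
                "DAMV": np.mean(np.abs(np.diff(seg, axis=1)), axis=1),
            }

        fs = 1000                                           # sampling rate (Hz), assumed
        x = np.random.default_rng(0).normal(size=5 * fs)    # surrogate EMG signal
        feats = emg_features(x, win=int(0.2 * fs))          # 200 ms windows
        print({k: v[:3].round(3) for k, v in feats.items()})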

  9. Conditioning from an information processing perspective.

    Science.gov (United States)

    Gallistel, C R.

    2003-04-28

    The framework provided by Claude Shannon's [Bell Syst. Technol. J. 27 (1948) 623] theory of information leads to a quantitatively oriented reconceptualization of the processes that mediate conditioning. The focus shifts from processes set in motion by individual events to processes sensitive to the information carried by the flow of events. The conception of what properties of the conditioned and unconditioned stimuli are important shifts from the tangible properties to the intangible properties of number, duration, frequency and contingency. In this view, a stimulus becomes a CS if its onset substantially reduces the subject's uncertainty about the time of occurrence of the next US. One way to represent the subject's knowledge of that time of occurrence is by the cumulative probability function, which has two limiting forms: (1) The state of maximal uncertainty (minimal knowledge) is represented by the inverse exponential function for the random rate condition, in which the US is equally likely at any moment. (2) The limit to the subject's attainable certainty is represented by the cumulative normal function, whose momentary expectation is the CS-US latency minus the time elapsed since CS onset. Its standard deviation is the Weber fraction times the CS-US latency.
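
    In symbols (notation introduced here, not Gallistel's: lambda is the random US rate, T the CS-US latency, t the time elapsed since CS onset, w the Weber fraction, and Phi the standard normal CDF), the two limiting forms of the cumulative probability of the next US read

        F_min(t) = 1 - exp(-lambda t)          (maximal uncertainty: random US rate)

        F_max(t) = Phi((t - T) / (w T))        (limit of attainable certainty)

    so that under F_max the momentary expectation of the remaining wait is T - t, with standard deviation w T.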

  10. Influence Processes for Information Technology Acceptance

    DEFF Research Database (Denmark)

    Bhattacherjee, Anol; Sanford, Clive Carlton

    2006-01-01

    This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects are persistent over time. Drawing on the elaboration-likelihood model (ELM), we compared two...... alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were respectively operationalized using the argument quality and source credibility constructs, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We...... further examined how these influence processes were moderated by users' IT expertise and perceived job relevance and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern...

  11. Crystallization process of a three-dimensional complex plasma

    Science.gov (United States)

    Steinmüller, Benjamin; Dietz, Christopher; Kretschmer, Michael; Thoma, Markus H.

    2018-05-01

    Characteristic timescales and length scales for phase transitions of real materials are in ranges where direct visualization is unfeasible. Therefore, model systems can be useful. Here, the crystallization process of a three-dimensional complex plasma under gravity conditions is considered, where the system extends to a large degree into the bulk plasma. Time-resolved measurements exhibit the process down to the single-particle level. Primary clusters, consisting of particles in the solid state, grow vertically and, secondarily, horizontally. The box-counting method shows a fractal dimension of df≈2.72 for the clusters. This value suggests that the formation process is a combination of local epitaxial and diffusion-limited growth. The particle density and the interparticle distance to the nearest neighbor remain constant within the clusters during crystallization. All results are in good agreement with former observations of a single-particle layer.
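
    The box-counting estimate itself is compact: count occupied boxes N(eps) at several box sizes eps and fit the slope of log N against log(1/eps). A sketch (NumPy; synthetic uniform points stand in for the measured particle positions, so the expected estimate here is close to 3 rather than the clusters' 2.72):

        import numpy as np

        def box_counting_dimension(points, sizes):
            pts = points - points.min(axis=0)        # shift into positive octant
            counts = []
            for eps in sizes:
                boxes = np.unique(np.floor(pts / eps).astype(int), axis=0)
                counts.append(len(boxes))
            # slope of log N(eps) vs log(1/eps) estimates the fractal dimension
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        cloud = rng.random((20000, 3))               # space-filling point cloud
        print(box_counting_dimension(cloud, sizes=[0.05, 0.1, 0.2, 0.4]))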

  12. Organizational restructuring in response to changes in information-processing technology

    OpenAIRE

    Andrzej Baniak; Jacek Cukrowski

    1999-01-01

    This paper examines the effects of changes in information-processing technology on the efficient organizational forms of data-processing in decision-making systems. Data-processing is modelled in the framework of the dynamic parallel processing model of associative computation with an endogenous set-up costs of the processors. In such a model, the conditions for efficient organization of information-processing are defined and the architecture of the efficient structures is considered. It is s...

  13. Improving biological understanding and complex trait prediction by integrating prior information in genomic feature models

    DEFF Research Database (Denmark)

    Edwards, Stefan McKinnon

    externally founded information, such as KEGG pathways, Gene Ontology gene sets, or genomic features, and estimate the joint contribution of the genetic variants within these sets to complex trait phenotypes. The analysis of complex trait phenotypes is hampered by the myriad of genes that control the trait...

  14. Mutually exclusive aspects of information carried by physical systems: Complementarity between local and nonlocal information

    International Nuclear Information System (INIS)

    Oppenheim, Jonathan; Horodecki, Karol; Horodecki, Michal; Horodecki, Ryszard; Horodecki, Pawel

    2003-01-01

    Complex physical systems contain information which, under some well-defined processes, can be differentiated into local and nonlocal information. Both these fundamental aspects of information are defined operationally. Local information is locally accessible and allows one to perform processes, such as physical work, while nonlocal information allows one to perform processes such as teleportation. It is shown that these two kinds of information are complementary in the sense that two parties can either gain access to the nonlocal information or to the local information, but not both. This complementarity has a form similar to that expressed by entropic uncertainty relations. For pure states, the entanglement plays the role of Planck's constant. We also find another class of complementarity relations which applies to operators and is induced when two parties can only perform local operations and communicate classically (LOCC). In particular, observables such as the parity and phase of two qubits commute, but under LOCC they are complementary observables. It is also found that this complementarity is pure in the sense that it can be "decoupled" from the uncertainty principle. It is suggested that these complementarities represent an essential extension of Bohr's complementarity to complex (distributed) systems which are entangled.

  15. Informative gene selection using Adaptive Analytic Hierarchy Process (A2HP)

    Directory of Open Access Journals (Sweden)

    Abhishek Bhola

    2017-12-01

    Full Text Available Gene expression datasets derived from microarray experiments are marked by a large number of genes, containing the gene expression values at different sample conditions/time-points. Selection of informative genes from these large datasets is an issue of major concern for various researchers and biologists. In this study, we propose a gene selection and dimensionality reduction method called Adaptive Analytic Hierarchy Process (A2HP). The traditional analytic hierarchy process is a multiple-criteria based decision analysis method whose result depends upon expert knowledge or decision makers. It is mainly used to solve decision problems in different fields. A2HP, on the other hand, is a fused method that combines the outcomes of five individual gene selection ranking methods: t-test, chi-square variance test, z-test, Wilcoxon test and signal-to-noise ratio (SNR). First, the gene expression dataset is preprocessed, and the reduced set of genes obtained is then fed as input to A2HP. A2HP utilizes both quantitative and qualitative factors to select the informative genes. Results demonstrate that A2HP selects a more efficient number of genes than the individual gene selection methods. The percentage reduction in the number of genes and the time complexity are taken as the performance measures for the proposed method, and it is shown that A2HP outperforms the individual gene selection methods.
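
    A sketch of the fusion idea only (SciPy/NumPy): score every gene with the five filters listed above, then combine the resulting rankings. Equal weights stand in for the pairwise-comparison weights an actual AHP step would derive, and a plain variance score stands in for the chi-square variance test:

        import numpy as np
        from scipy import stats

        def gene_scores(X1, X2):
            """X1, X2: samples x genes expression matrices for two classes."""
            m1, m2 = X1.mean(0), X2.mean(0)
            s1, s2 = X1.std(0, ddof=1), X2.std(0, ddof=1)
            n1, n2 = len(X1), len(X2)
            t = np.abs(stats.ttest_ind(X1, X2).statistic)
            z = np.abs(m1 - m2) / np.sqrt(s1**2 / n1 + s2**2 / n2)
            snr = np.abs(m1 - m2) / (s1 + s2)
            var = np.concatenate([X1, X2]).var(0, ddof=1)   # variance-test stand-in
            wilcox = np.abs(np.array([stats.ranksums(X1[:, j], X2[:, j]).statistic
                                      for j in range(X1.shape[1])]))
            return np.vstack([t, z, snr, var, wilcox])

        def fuse_ranks(scores, weights=None):
            ranks = np.argsort(np.argsort(-scores, axis=1), axis=1)  # 0 = best
            w = np.full(len(scores), 1 / len(scores)) if weights is None else weights
            return w @ ranks                    # lower fused rank = more informative

        rng = np.random.default_rng(0)
        X1 = rng.normal(0, 1, (20, 100))
        X2 = rng.normal(0, 1, (25, 100))
        X2[:, :5] += 2.0                        # plant 5 informative genes
        fused = fuse_ranks(gene_scores(X1, X2))
        print(np.argsort(fused)[:5])            # should recover genes 0..4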

  16. Influence of information on behavioral effects in decision processes

    OpenAIRE

    Angelarosa Longo; Viviana Ventre

    2015-01-01

    Rational models of decision processes are marked by many anomalies caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single- or multi-agent decision process, each mental model is influenced by the presence or absence of information, or by false information, about the problem or about other members of the decision-making group. The difficulty in modeling these effects increases because behavioral biases also influence the m...

  17. Integration of Individual Processes and Information Demand Patterns: A Conceptual Analysis

    Directory of Open Access Journals (Sweden)

    Michael Leyer

    2017-12-01

    Full Text Available Individuals need a variety of information when performing their personal processes. However, companies typically know little about the underlying individual demand patterns in these processes. Conceptualizing the information demand patterns of individuals is expected to allow for using these as a foundation to extend the traditional internal information logistics perspective of companies. Digital options could then be used to align individual and organizational information, leading not only to new product and service offerings, but also to new work structures in organizations. Thus, we extend prior literature on business process management and information logistics by highlighting how information demand patterns (IDP) have to be adapted to individual processes. Our exploratory approach is to demonstrate conceptually the conditions and implications of individual IDPs.

  18. Information processing and dynamics in minimally cognitive agents.

    Science.gov (United States)

    Beer, Randall D; Williams, Paul L

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.
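
    On the information-theoretic side, such an analysis typically reduces to mutual-information estimates between task variables and internal agent states. A minimal binned estimator (the surrogate signals below are invented; the evolved agent itself is not modeled):

        import numpy as np

        def mutual_information(x, y, bins=8):
            """I(X;Y) in bits from a binned joint histogram."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

        rng = np.random.default_rng(0)
        stimulus = rng.normal(size=10000)
        neuron = stimulus + 0.5 * rng.normal(size=10000)  # carries stimulus information
        print(mutual_information(stimulus, neuron))       # clearly > 0 bits
        print(mutual_information(stimulus, rng.normal(size=10000)))  # ~ 0 bits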

  19. Development and Validation of the Social Information Processing Application: A Web-Based Measure of Social Information Processing Patterns in Elementary School-Age Boys

    Science.gov (United States)

    Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.

    2011-01-01

    The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and…

  20. Information Integration; The process of integration, evolution and versioning

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time-consuming job. Automating this process would assist the user in this task. Integration