WorldWideScience

Sample records for information processing model

  1. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

The conceptual issues of information processing are examined. Human information processing is defined as an active cognitive process, analogous to a system: the flow and transformation of information within a human. The human is viewed as an active information seeker who constantly receives, processes, and acts upon surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Such models are useful for representing different theoretical positions and for attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their application to systems design leads to a better human factors approach.

  2. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  3. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

This paper proposes a quantitative approach to modeling the information processing of nuclear power plant (NPP) operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on (i) developing a model of the information processing of NPP operators and (ii) quantifying the model. To resolve the problems of previous information-theoretic approaches, i.e. their single-channel assumption, we first develop an information processing model with multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, a kind of information theory.
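The abstract does not give the quantification explicitly. As a hedged illustration of the kind of information-theoretic bookkeeping a multi-stage model involves (the symbol streams and names below are hypothetical, not taken from the paper), Shannon entropy and the mutual information between successive stages can be computed like this:

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy H(X), in bits, of a sequence of discrete symbols."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): information one stage passes to the next."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Hypothetical two-stage trace: alarms seen by the operator (input stage)
# and the diagnoses produced (output stage).
alarms    = ["high", "high", "low", "low", "high", "low", "high", "low"]
diagnoses = ["leak", "leak", "ok",  "ok",  "leak", "ok",  "leak", "ok"]

print(entropy(alarms))                        # 1.0 bit for a balanced binary stream
print(mutual_information(alarms, diagnoses))  # 1.0 bit: output fully determined by input
```

Conant's partition-law analysis decomposes such entropies further across internal stages; the sketch only shows the basic quantities it builds on.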

  4. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial-time retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is consistent with memory phenomena from the information processing viewpoint.

  5. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial-time retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is consistent with memory phenomena from the information processing viewpoint. PMID:27876847

  6. Information in general medical practices: the information processing model.

    Science.gov (United States)

    Crowe, Sarah; Tully, Mary P; Cantrill, Judith A

    2010-04-01

The need for effective communication and handling of secondary care information in general practices is paramount. This study explored practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of both clinical and administrative practice staff. It was a qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach, using NVivo software, and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.

  7. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

Process-oriented information systems (POIS) are a promising realization form of computerized business information systems, aiming to increase the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations, and thereby to ensure customer satisfaction and high quality of products and services. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development, with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as the basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionality for POIS development.

  8. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, a kind of information theory. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload.

  9. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation and biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the systems biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool.
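As a hedged sketch of one target formalism named in the abstract, Gillespie's direct method for a single transformation A → B looks like the following (the rate constant and molecule counts are illustrative and have nothing to do with Narrator's own notation):

```python
import random
from math import log

def gillespie_direct(k, a0, b0, t_end, seed=42):
    """Gillespie's direct method for the single reaction A -> B with propensity k*A."""
    rng = random.Random(seed)
    t, a, b = 0.0, a0, b0
    trajectory = [(t, a, b)]
    while t < t_end and a > 0:
        propensity = k * a                     # total propensity; one channel only
        u = 1.0 - rng.random()                 # uniform in (0, 1]
        t += -log(u) / propensity              # exponentially distributed waiting time
        a, b = a - 1, b + 1                    # fire the (only) reaction
        trajectory.append((t, a, b))
    return trajectory

traj = gillespie_direct(k=1.0, a0=50, b0=0, t_end=100.0)
t_last, a_last, b_last = traj[-1]
assert a_last + b_last == 50                   # A -> B conserves total molecule count
```

With several reaction channels, the direct method additionally draws which channel fires in proportion to its share of the total propensity; the one-channel case above omits that draw.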

  10. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

Full Text Available Abstract Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation and biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the systems biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems biology tool.

  11. A Process Model for Goal-Based Information Retrieval

    Directory of Open Access Journals (Sweden)

    Harvey Hyman

    2014-12-01

Full Text Available In this paper we examine the domain of information search and propose a "goal-based" approach to study search strategy. We describe "goal-based information search" using a framework of Knowledge Discovery. We identify two Information Retrieval (IR) goals using the constructs of Knowledge Acquisition (KA) and Knowledge Explanation (KE). We classify these constructs into two specific information problems: an exploration-exploitation problem and an implicit-explicit problem. Our proposed framework is an extension of prior work in this domain, applying an IR Process Model originally developed for Legal-IR and adapted to Medical-IR. The approach in this paper is guided by the recent ACM-SIG Medical Information Retrieval (MedIR) Workshop definition: "methodologies and technologies that seek to improve access to medical information archives via a process of information retrieval."
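One conventional way to operationalize the exploration-exploitation problem named above is an epsilon-greedy strategy. The sketch below is an illustration under that assumption only; the source names and relevance rates are hypothetical, not part of the authors' framework:

```python
import random

def epsilon_greedy_search(relevance, epsilon=0.2, steps=500, seed=0):
    """Epsilon-greedy searcher choosing among information sources:
    with probability epsilon it explores (knowledge acquisition),
    otherwise it exploits the source currently believed best."""
    rng = random.Random(seed)
    estimates = {src: 0.0 for src in relevance}
    counts = {src: 0 for src in relevance}
    for _ in range(steps):
        if rng.random() < epsilon:
            src = rng.choice(sorted(relevance))       # explore a source at random
        else:
            src = max(estimates, key=estimates.get)   # exploit current knowledge
        reward = 1.0 if rng.random() < relevance[src] else 0.0
        counts[src] += 1
        estimates[src] += (reward - estimates[src]) / counts[src]  # running mean
    return estimates

# Two hypothetical archives whose true relevance rates are unknown to the searcher.
est = epsilon_greedy_search({"archive_a": 0.7, "archive_b": 0.3})
```

After enough steps the estimates approach the true relevance rates, so the searcher settles on the richer archive while still occasionally sampling the other.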

  12. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    Science.gov (United States)

    Tute, Erik; Steiner, Jochen

    2018-01-01

Literature describes a big potential for the reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support the management and maintenance of the processes extracting, transforming and loading (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were conducted to identify requirements and existing modeling techniques; an ETL modeling technique was then developed by extending existing techniques, and evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, in which six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.
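A minimal, metadata-driven ETL step can illustrate the idea of a modeled ETL process: the mapping is declared as data, so the same specification both documents the process and drives it. All field names and codes below are hypothetical and are not drawn from the 3LGM2/openEHR approach:

```python
# The mapping specification: target field -> (source field, type conversion).
# Declaring it as data is what makes the ETL step inspectable by other tools.
MAPPING = {
    "patient_id": ("PID", str),
    "weight_kg":  ("WEIGHT", float),
    "admitted":   ("ADM_DATE", str),
}

def extract(source_rows):
    """Extract: read rows from a (here: in-memory) source system."""
    yield from source_rows

def transform(row, mapping=MAPPING):
    """Transform: rename and type-convert fields according to the mapping."""
    return {target: cast(row[src]) for target, (src, cast) in mapping.items()}

def load(rows, warehouse):
    """Load: upsert the transformed rows into the warehouse, keyed by patient."""
    for row in rows:
        warehouse[row["patient_id"]] = row
    return warehouse

source = [{"PID": "p1", "WEIGHT": "72.5", "ADM_DATE": "2018-01-01"}]
cdwh = load((transform(r) for r in extract(source)), {})
print(cdwh["p1"]["weight_kg"])   # -> 72.5
```

A modeling technique like the one in the record additionally links such mappings to the information models of the source and target systems, which a toy dictionary cannot show.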

  13. Advanced modeling of management processes in information technology

    CERN Document Server

    Kowalczuk, Zdzislaw

    2014-01-01

This book deals with modelling the management processes of information technology and IT projects. Its core is the model of information technology management and its component (contextual, local) models describing initial processing and the maturity capsule, as well as a decision-making system represented by a multi-level sequential model of IT technology selection, which acquires a fuzzy rule-based implementation in this work. In terms of applicability, this work may also be useful for diagnosing the applicability of IT standards in the evaluation of IT organizations. The results of this diagnosis might prove valid for those preparing new standards so that, apart from their own visions, they could take into account to an even greater extent the capabilities and needs of the leaders of project and manufacturing teams. The book is intended for IT professionals using the ITIL, COBIT and TOGAF standards in their work, and for students of computer science and management who are interested in the issue of IT...

  14. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Оксана Николаевна Ромашкова

    2014-12-01

Full Text Available This work concerns an information model of an educational complex comprising several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is considered, and a matrix management structure is suggested. The basic information processes for managing the educational complex are conceptualized.

  15. Process and building information modelling in the construction industry by using information delivery manuals and model view definitions

    DEFF Research Database (Denmark)

    Karlshøj, Jan

    2012-01-01

The construction industry is gradually increasing its use of structured information and building information modelling. To date, the industry has suffered from the disadvantages of a project-based organizational structure and ad hoc solutions. Furthermore, it is not used to formalizing the flow of information and specifying exactly which objects and properties are needed for each process and which information is produced by the processes. The present study is based on reviewing the existing methodology of Information Delivery Manuals (IDM) from buildingSMART, which is also ISO standard 29481 Part 1, and the Model View Definition (MVD) methodology developed by buildingSMART and BLIS. The research also includes a review of concrete IDM development projects that have been developed over the last five years. Although the study has identified interest in the IDM methodology in a number...

  16. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

Traditionally, natural language processing techniques for information retrieval have been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.
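The standard instance of this model family is the query-likelihood language model, which ranks a document d by P(q|d), the probability that d's language model generates the query q. A minimal sketch with Jelinek-Mercer smoothing follows; the smoothing weight of 0.5 and the toy corpus are assumptions for illustration:

```python
from collections import Counter
from math import log

def score(query, doc, collection, lam=0.5):
    """Query likelihood with Jelinek-Mercer smoothing:
    log P(q|d) = sum over query terms t of
    log( lam * P(t|d) + (1 - lam) * P(t|C) ).
    Assumes every query term occurs somewhere in the collection."""
    d, c = Counter(doc), Counter(collection)
    return sum(
        log(lam * d[t] / len(doc) + (1 - lam) * c[t] / len(collection))
        for t in query
    )

docs = [["statistical", "language", "models"],
        ["natural", "language", "processing"]]
collection = [t for d in docs for t in d]   # background model from the whole corpus
query = ["statistical", "models"]

ranked = sorted(docs, key=lambda d: score(query, d, collection), reverse=True)
print(ranked[0])   # -> ['statistical', 'language', 'models']
```

The background collection probability keeps unseen terms from zeroing out the score, which is the whole point of smoothing in this model.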

  17. Lecturing and Loving It: Applying the Information-Processing Model.

    Science.gov (United States)

    Parker, Jonathan K.

    1993-01-01

    Discusses the benefits of lecturing, when done properly, in high schools. Describes the positive attributes of effective lecturers. Provides a human information-processing model applicable to the task of lecturing to students. (HB)

  18. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    Science.gov (United States)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.

  19. Thermodynamics of information processing based on enzyme kinetics: An exactly solvable model of an information pump.

    Science.gov (United States)

    Cao, Yuansheng; Gong, Zongping; Quan, H T

    2015-06-01

Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012)] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013)], we propose a minimal model of the information pump and the information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information encoded in the bit stream, or (partially) erase the information initially encoded in the bit stream by consuming Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and the power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.

  20. Dynamic information processing states revealed through neurocognitive models of object semantics

    Science.gov (United States)

    Clarke, Alex

    2015-01-01

Recognising objects relies on highly dynamic, interactive brain networks to process multiple aspects of object information. To fully understand how different forms of information about objects are represented and processed in the brain requires a neurocognitive account of visual object recognition that combines a detailed cognitive model of semantic knowledge with a neurobiological model of visual object processing. Here we ask how specific cognitive factors are instantiated in our mental processes and how they dynamically evolve over time. We suggest that coarse semantic information, based on generic shared semantic knowledge, is rapidly extracted from visual inputs and is sufficient to drive rapid category decisions. Subsequent recurrent neural activity between the anterior temporal lobe and posterior fusiform supports the formation of object-specific semantic representations – a conjunctive process primarily driven by the perirhinal cortex. These object-specific representations require the integration of shared and distinguishing object properties and support the unique recognition of objects. We conclude that a valuable way of understanding the cognitive activity of the brain is through testing the relationship between specific cognitive measures and dynamic neural activity. This kind of approach allows us to move towards uncovering the information processing states of the brain and how they evolve over time. PMID:25745632

  1. Motivation within the Information Processing Model of Foreign Language Learning

    Science.gov (United States)

    Manolopoulou-Sergi, Eleni

    2004-01-01

The present article highlights the importance of the motivational construct for the foreign language learning (FLL) process. More specifically, it is argued that motivation is likely to play a significant role at all three stages of the FLL process as they are discussed within the information processing model of FLL, namely,…

  2. A neuromathematical model of human information processing and its application to science content acquisition

    Science.gov (United States)

    Anderson, O. Roger

    The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.

  3. Work and information processing in a solvable model of Maxwell's demon.

    Science.gov (United States)

    Mandal, Dibyendu; Jarzynski, Christopher

    2012-07-17

    We describe a minimal model of an autonomous Maxwell demon, a device that delivers work by rectifying thermal fluctuations while simultaneously writing information to a memory register. We solve exactly for the steady-state behavior of our model, and we construct its phase diagram. We find that our device can also act as a "Landauer eraser", using externally supplied work to remove information from the memory register. By exposing an explicit, transparent mechanism of operation, our model offers a simple paradigm for investigating the thermodynamics of information processing by small systems.
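The work-versus-information trade-off described in this record can be summarized by the generalized second law that models of this kind obey. In a hedged paraphrase of the cited paper's result (notation simplified; consult the original for the exact statement), with Δs the change in Shannon entropy, in bits per bit, of the memory register:

```latex
\[
  \langle W \rangle \;\le\; k_B T \ln 2 \,\cdot\, \Delta s .
\]
```

Positive work extraction, ⟨W⟩ > 0 (the engine regime), therefore requires writing entropy to the register (Δs > 0), while erasure (Δs < 0) requires net work input, consistent with Landauer's bound of k_B T ln 2 per fully erased bit.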

  4. Thermodynamics of information processing based on enzyme kinetics: An exactly solvable model of an information pump

    Science.gov (United States)

    Cao, Yuansheng; Gong, Zongping; Quan, H. T.

    2015-06-01

Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012), 10.1073/pnas.1204263109] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013), 10.1103/PhysRevLett.111.030602], we propose a minimal model of the information pump and the information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information encoded in the bit stream, or (partially) erase the information initially encoded in the bit stream by consuming Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and the power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.

  5. A Social Information Processing Model of Media Use in Organizations.

    Science.gov (United States)

    Fulk, Janet; And Others

    1987-01-01

    Presents a model to examine how social influence processes affect individuals' attitudes toward communication media and media use behavior, integrating two research areas: media use patterns as the outcome of objectively rational choices and social information processing theory. Asserts (in a synthesis) that media characteristics and attitudes are…

  6. A cascade model of information processing and encoding for retinal prosthesis.

    Science.gov (United States)

    Pei, Zhi-Jun; Gao, Guan-Xin; Hao, Bo; Qiao, Qing-Li; Ai, Hui-Jian

    2016-04-01

Retinal prosthesis offers a potential treatment for individuals suffering from photoreceptor degeneration diseases. Establishing biological retinal models and simulating how the biological retina converts incoming light signals into spike trains that can be properly decoded by the brain is a key issue. Several retinal models have been presented, ranging from structural models inspired by the layered architecture to functional models originating from a set of specific physiological phenomena. However, most of these focus on stimulus image compression, edge detection and reconstruction, and do not generate spike trains corresponding to a visual image. In this study, based on state-of-the-art retinal physiological mechanisms, including effective visual information extraction, the static nonlinear rectification of biological systems and neuronal Poisson coding, a cascade model of the retina is put forward, comprising the outer plexiform layer for information processing and the inner plexiform layer for information encoding, and integrating both the anatomic connections and the functional computations of the retina. Using MATLAB software, spike trains corresponding to a stimulus image were numerically computed in four steps: linear spatiotemporal filtering, static nonlinear rectification, radial sampling and then Poisson spike generation. The simulation results suggest that such a cascade model can recreate the visual information processing and encoding functionalities of the retina, which is helpful in developing an artificial retina for the retinally blind.
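The four steps enumerated in the record can be sketched end to end. The sketch below stands in a difference-of-Gaussians kernel for the linear spatial stage (the temporal part is omitted) and half-wave rectification for the static nonlinearity; all geometry, gain and kernel parameters are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(1)

def dog_kernel(size=9, sigma_c=1.0, sigma_s=3.0):
    """Center-surround (difference-of-Gaussians) receptive field."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = lambda s: np.exp(-(xx**2 + yy**2) / (2 * s**2)) / (2 * np.pi * s**2)
    return g(sigma_c) - g(sigma_s)

def cascade(image, centers, gain=400.0, dt=0.001, steps=300):
    """image -> firing rates -> Poisson spike trains, one per sampled cell."""
    k = dog_kernel()
    r = k.shape[0] // 2
    spikes = []
    for (y, x) in centers:                        # 3. sampling at cell centers
        patch = image[y - r:y + r + 1, x - r:x + r + 1]
        drive = float((k * patch).sum())          # 1. linear spatial filtering
        rate = gain * max(drive, 0.0)             # 2. half-wave rectification (Hz)
        train = rng.random(steps) < rate * dt     # 4. Poisson spike generation
        spikes.append(train)
    return np.array(spikes)

# Bright spot on a dark background; one cell centered on the spot, one off it.
img = np.zeros((32, 32))
img[16, 16] = 1.0
trains = cascade(img, centers=[(16, 16), (8, 8)])
```

The on-spot cell receives positive center drive and fires stochastically, while the off-spot cell receives zero drive and stays silent, mirroring the intended encoding behaviour.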

  7. Processing of angular motion and gravity information through an internal model.

    Science.gov (United States)

    Laurens, Jean; Straumann, Dominik; Hess, Bernhard J M

    2010-09-01

The vestibular organs in the base of the skull provide important information about head orientation and motion in space. Previous studies have suggested that both angular velocity information from the semicircular canals and information about head orientation and translation from the otolith organs are centrally processed in an internal model of head motion, using the principles of optimal estimation. This concept has been successfully applied to model behavioral responses to classical vestibular motion paradigms. This study measured the dynamics of the vestibulo-ocular reflex (VOR) during postrotatory tilt, tilt during optokinetic afternystagmus, and off-vertical axis rotation. The influence of the otolith signal on the VOR was systematically varied by using a series of tilt angles. We found that the time constants of the responses varied almost identically as a function of gravity in these paradigms. We show that Bayesian modeling could predict the experimental results in an accurate and consistent manner. In contrast to other approaches, the Bayesian model also provides a plausible explanation of why these vestibulo-ocular motor responses occur as a consequence of an internal process of optimal motion estimation.

  8. Approaching the Affective Factors of Information Seeking: The Viewpoint of the Information Search Process Model

    Science.gov (United States)

    Savolainen, Reijo

    2015-01-01

    Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows-up a previous interview study…

  9. The experiential health information processing model: supporting collaborative web-based patient education.

    Science.gov (United States)

    O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine

    2008-12-16

First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes.

  10. The experiential health information processing model: supporting collaborative web-based patient education

    Science.gov (United States)

    O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine

    2008-01-01

    Background First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. Results In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. Conclusion An experiential health information processing model is proposed that can be used as a research framework. Future research directions are provided, including investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes. PMID:19087353

  11. The experiential health information processing model: supporting collaborative web-based patient education

    Directory of Open Access Journals (Sweden)

    Wathen C Nadine

    2008-12-01

    Abstract Background First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. Results In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. Conclusion An experiential health information processing model is proposed that can be used as a research framework. Future research directions are provided, including investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes.

  12. System model of the processing of heterogeneous sensory information in a robotized complex

    Science.gov (United States)

    Nikolaev, V.; Titov, V.; Syryamkin, V.

    2018-05-01

    The scope and types of robotic systems consisting of subsystems of the form "heterogeneous sensor data processing subsystem" are analyzed. On the basis of queuing theory, a model is developed that takes into account the unevenness of the intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is obtained for assessing the relationship between subsystem performance and the uneven flows. The solution is investigated over the range of parameter values of practical interest.
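
    The queuing model itself is not reproduced in the record; as an illustrative sketch only, a single "sensor data processing subsystem" fed by an uneven (bursty) arrival flow can be simulated as below. All rates, the service time, and the function names are hypothetical, not taken from the paper.

```python
import random

def simulate_queue(rate_fn, service_time, horizon, seed=1):
    """Single-server FIFO queue fed by a time-varying Poisson arrival flow.

    rate_fn(t) is the instantaneous arrival intensity at time t; arrivals
    are generated by thinning against the peak rate. Returns the mean
    waiting time (queueing delay before service) of the processed items.
    """
    rng = random.Random(seed)
    # crude peak-rate estimate over a sampling grid of the horizon
    peak = max(rate_fn(k * horizon / 100) for k in range(101))
    t, arrivals = 0.0, []
    while True:                              # Poisson thinning
        t += rng.expovariate(peak)
        if t >= horizon:
            break
        if rng.random() < rate_fn(t) / peak:
            arrivals.append(t)
    server_free, waits = 0.0, []
    for a in arrivals:                       # FIFO service discipline
        start = max(a, server_free)
        waits.append(start - a)
        server_free = start + service_time
    return sum(waits) / len(waits)

# Two flows with the same mean rate (1 arrival per time unit): a flat one,
# and an uneven one that delivers everything in short bursts.
flat = simulate_queue(lambda t: 1.0, service_time=0.5, horizon=1000)
bursty = simulate_queue(lambda t: 5.0 if int(t) % 5 == 0 else 0.0,
                        service_time=0.5, horizon=1000)
```

    Under equal mean load, the bursty flow produces a markedly longer mean wait; this is the kind of performance/unevenness relationship an analytical solution of the model would quantify.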

  13. Animal models for information processing during sleep

    NARCIS (Netherlands)

    Coenen, A.M.L.; Drinkenburg, W.H.I.M.

    2002-01-01

    Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced during sleep compared to wakefulness. The process controlling this reduction is called `sensory' gating and evidence exists that the underlying neurophysiological processes take

  14. Temporal Information Processing and Stability Analysis of the MHSN Neuron Model in DDF

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-12-01

    Implementation of a neuron-like information processing structure at the hardware level is a pressing research problem. In this article, we analyze the modified hybrid spiking neuron model (the MHSN model) in the distributed delay framework (DDF) from the point of view of hardware-level implementation. We investigate its temporal information processing capability in terms of the inter-spike-interval (ISI) distribution. We also perform a stability analysis of the MHSN model, in which we compute the nullclines, the steady-state solution, and the eigenvalues of the MHSN model. During phase plane analysis, we notice that the MHSN model generates limit cycle oscillations, an important phenomenon in many biological processes. The qualitative behavior of these limit cycles does not change with variation in the applied input stimulus; however, delay affects the spiking activity and alters the duration of the cycle.
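
    The MHSN equations are not given in the record; to show concretely what characterizing "temporal information processing capability in terms of the ISI distribution" involves, here is a generic leaky integrate-and-fire sketch (not the MHSN model; all parameters are invented) that collects an ISI distribution.

```python
import random

def lif_isis(i_mean=1.6, i_noise=0.6, tau=10.0, v_th=1.0, v_reset=0.0,
             dt=0.1, steps=200_000, seed=7):
    """Leaky integrate-and-fire neuron driven by a noisy input current.

    Euler-integrates tau * dV/dt = -V + I(t); whenever V crosses v_th the
    neuron spikes and resets, and the inter-spike interval is recorded.
    (The noise is a crude per-step perturbation, not a properly scaled
    diffusion -- good enough for a qualitative ISI histogram.)
    """
    rng = random.Random(seed)
    v, last_spike, isis = v_reset, None, []
    for step in range(steps):
        i_t = i_mean + i_noise * rng.gauss(0.0, 1.0)
        v += dt / tau * (-v + i_t)
        if v >= v_th:                        # spike: log the ISI and reset
            t = step * dt
            if last_spike is not None:
                isis.append(t - last_spike)
            last_spike, v = t, v_reset
    return isis

isis = lif_isis()
mean_isi = sum(isis) / len(isis)
var = sum((x - mean_isi) ** 2 for x in isis) / len(isis)
cv = var ** 0.5 / mean_isi                   # coefficient of variation
```

    A suprathreshold drive like this gives a narrow, regular ISI distribution (CV well below 1); a subthreshold, noise-driven regime broadens it. Comparing such statistics across input conditions is how a model's temporal coding is characterized.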

  15. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    Directory of Open Access Journals (Sweden)

    O. V. Fedorova

    2018-01-01

    Full Text Available The article is devoted to application of SADT and ARIS methodologies for modeling and management of business processes of information systems. The relevance of this article is beyond doubt, because the design of the architecture of information systems, based on a thorough system analysis of the subject area, is of paramount importance for the development of information systems in general. The authors conducted a serious work on the analysis of the application of SADT and ARIS methodologies for modeling and managing business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and applying the CASE-tool, and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system. The second point of view is the interaction of the business analyst and the customer. The basis of many modern methodologies for modeling business processes is the SADT methodology. Using the methodology of the IDEF family, it is possible to efficiently display and analyze the activity models of a wide range of complex information systems in various aspects. CASE-tool ARIS is a complex of tools for analysis and modeling of the organization's activities. The methodical basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling business processes of information systems. In addition, the article has an oriented character when working on the constituent elements of curricula for students specializing in information specialties and management, provides an update of the content and structure of disciplines on modeling the architecture of information systems and organization management, using models.

  16. Parametric models to relate spike train and LFP dynamics with neural information processing.

    Science.gov (United States)

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial
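
    The unified models combine a stimulus-locked, time-varying drive with ongoing background activity. As a toy illustration only (not the authors' parametric model; rates and the decoder are invented), an inhomogeneous Poisson spike train with those two components, plus a crude latency decoder, looks like this:

```python
import math
import random

def simulate_trial(onset, background=5.0, gain=40.0, width=0.05,
                   duration=1.0, seed=0):
    """Spike times from an inhomogeneous Poisson process whose rate is
    ongoing background activity plus a Gaussian bump locked to `onset`."""
    rng = random.Random(seed)
    rate = lambda t: background + gain * math.exp(-0.5 * ((t - onset) / width) ** 2)
    peak = background + gain
    spikes, t = [], 0.0
    while True:                              # thinning algorithm
        t += rng.expovariate(peak)
        if t > duration:
            return spikes
        if rng.random() < rate(t) / peak:
            spikes.append(t)

def crude_onset_estimate(spikes, duration=1.0, bins=20):
    """Centre of the highest-count bin, as a toy latency decoder."""
    counts = [0] * bins
    for s in spikes:
        counts[min(int(s / duration * bins), bins - 1)] += 1
    best = max(range(bins), key=lambda i: counts[i])
    return (best + 0.5) * duration / bins

spikes = []
for s in range(10):                          # pool ten simulated trials
    spikes += simulate_trial(onset=0.4, seed=s)
est = crude_onset_estimate(spikes)
```

    Pooling trials makes the stimulus-locked bump stand out against the background, mirroring the paper's point that background activity must be modeled explicitly but need not swamp latency estimates.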

  17. Clinical, information and business process modeling to promote development of safe and flexible software.

    Science.gov (United States)

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  18. Information Processing and Risk Perception: An Adaptation of the Heuristic-Systematic Model.

    Science.gov (United States)

    Trumbo, Craig W.

    2002-01-01

    Describes heuristic-systematic information-processing model and risk perception--the two major conceptual areas of the analysis. Discusses the proposed model, describing the context of the data collections (public health communication involving cancer epidemiology) and providing the results of a set of three replications using the proposed model.…

  19. Motivated information processing and group decision-making : Effects of process accountability on information processing and decision quality

    NARCIS (Netherlands)

    Scholten, Lotte; van Knippenberg, Daan; Nijstad, Bernard A.; De Dreu, Carsten K. W.

    Integrating dual-process models [Chaiken, S., & Trope, Y. (Eds.). (1999). Dual-process theories in social psychology. NewYork: Guilford Press] with work on information sharing and group decision-making [Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: biased

  20. A model of designing as the intersection between uncertainty perception, information processing, and coevolution

    DEFF Research Database (Denmark)

    Lasso, Sarah Venturim; Cash, Philip; Daalhuizen, Jaap

    2016-01-01

    …the designer's perceived uncertainty is the motivation to start a process of collecting, exchanging, and integrating knowledge. This has been formalised in Information-Processing Theory and more generally described by authors such as Aurisicchio et al. (2013), who describe design as an information… takes the first steps towards linking these disparate perspectives in a model of designing that synthesises coevolution and information processing. How designers act has been shown to play an important role in the process of New Product Development (NPD) (see e.g. Badke-Schaub and Frankenberger, 2012… transformation process. Here the aim of the activity is to reduce the perceived uncertainty through identifying and integrating external information and knowledge within the design team. For example, when perceiving uncertainty the designer might seek new information online, process this information, and share…

  1. Moral Judgment as Information Processing: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Steve eGuglielmo

    2015-10-01

    This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.

  2. Automatic Generation of Object Models for Process Planning and Control Purposes using an International standard for Information Exchange

    Directory of Open Access Journals (Sweden)

    Petter Falkman

    2003-10-01

    In this paper a formal mapping between static information models and dynamic models is presented. The static information models are given according to an international standard for product, process and resource information exchange (ISO 10303-214). The dynamic models are described as Discrete Event Systems. The product, process and resource information is automatically converted into product routes and used for simulation, controller synthesis and verification. A high-level language, combining Petri nets and process algebra, is presented and used for specification of desired routes. A main implication of the presented method is that it enables the reuse of process information when creating dynamic models for process control. This method also enables simulation and verification to be conducted early in the development chain.
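
    The paper's specification language combines Petri nets and process algebra; that machinery is not shown in the abstract, but the underlying Petri-net execution of a product route can be sketched. The places, transitions, and route below are a hypothetical machining example, not drawn from ISO 10303-214.

```python
# Transitions of a tiny marked Petri net: name -> (consumed, produced),
# where each side maps place names to token counts.
NET = {
    "fetch":   ({"raw": 1},                  {"staged": 1}),
    "machine": ({"staged": 1, "cnc_free": 1}, {"machined": 1}),
    "release": ({"machined": 1},             {"done": 1, "cnc_free": 1}),
}

def enabled(marking, transition):
    """A transition is enabled when every input place holds enough tokens."""
    pre, _ = NET[transition]
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    """Return the marking after firing one enabled transition."""
    assert enabled(marking, transition), transition
    pre, post = NET[transition]
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# A product route for two raw parts sharing one CNC machine (resource token).
m = {"raw": 2, "cnc_free": 1}
for t in ["fetch", "machine", "release", "fetch", "machine", "release"]:
    m = fire(m, t)
# both parts have traversed the route and the resource token is back
```

    Encoding routes this way is what makes simulation and verification possible before control code exists: any firing sequence that violates resource availability simply is not enabled.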

  3. [DESCRIPTION AND PRESENTATION OF THE RESULTS OF ELECTROENCEPHALOGRAM PROCESSING USING AN INFORMATION MODEL].

    Science.gov (United States)

    Myznikov, I L; Nabokov, N L; Rogovanov, D Yu; Khankevich, Yu R

    2016-01-01

    The paper proposes applying the informational modeling of the correlation matrix, developed by I.L. Myznikov in the early 1990s, to neurophysiological investigations such as electroencephalogram recording and analysis and the coherence description of signals from electrodes on the head surface. The authors demonstrate information models built using data from studies of inert gas inhalation by healthy human subjects. In the opinion of the authors, information models provide an opportunity to describe physiological processes with a high level of generalization. The procedure for presenting EEG results holds great promise for broad application.
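
    The informational model of the correlation matrix is not detailed in the abstract; its usual starting point, a channel-by-channel correlation matrix over multichannel recordings, can be sketched on synthetic data. The channel names and the shared 10 Hz rhythm below are illustrative only.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def correlation_matrix(channels):
    names = list(channels)
    return {a: {b: pearson(channels[a], channels[b]) for b in names}
            for a in names}

# Synthetic 3-channel "EEG": two occipital channels share a 10 Hz rhythm,
# a frontal channel is independent noise (4 s at 250 Hz).
rng = random.Random(0)
t = [i / 250 for i in range(1000)]
alpha = [math.sin(2 * math.pi * 10 * s) for s in t]
channels = {
    "O1":  [a + 0.3 * rng.gauss(0, 1) for a in alpha],
    "O2":  [a + 0.3 * rng.gauss(0, 1) for a in alpha],
    "Fp1": [rng.gauss(0, 1) for _ in t],
}
corr = correlation_matrix(channels)
```

    The matrix makes the shared rhythm visible as a strong O1-O2 entry while the unrelated channel stays near zero; a higher-level informational model would then summarize the structure of this matrix.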

  4. Moral judgment as information processing: an integrative review.

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  5. Moral judgment as information processing: an integrative review

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  6. Quantum-like model of processing of information in the brain based on classical electromagnetic field.

    Science.gov (United States)

    Khrennikov, Andrei

    2011-09-01

    We propose a model of quantum-like (QL) processing of mental information. This model is based on quantum information theory. However, in contrast to models of "quantum physical brain" reducing mental activity (at least at the highest level) to quantum physical phenomena in the brain, our model matches well with the basic neuronal paradigm of the cognitive science. QL information processing is based (surprisingly) on classical electromagnetic signals induced by joint activity of neurons. This novel approach to quantum information is based on representation of quantum mechanics as a version of classical signal theory which was recently elaborated by the author. The brain uses the QL representation (QLR) for working with abstract concepts; concrete images are described by classical information theory. Two processes, classical and QL, are performed in parallel. Moreover, information is actively transmitted from one representation to another. A QL concept given in our model by a density operator can generate a variety of concrete images given by temporal realizations of the corresponding (Gaussian) random signal. This signal has the covariance operator coinciding with the density operator encoding the abstract concept under consideration. The presence of various temporal scales in the brain plays the crucial role in creation of QLR in the brain. Moreover, in our model electromagnetic noise produced by neurons is a source of superstrong QL correlations between processes in different spatial domains in the brain; the binding problem is solved on the QL level, but with the aid of the classical background fluctuations. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
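
    The key technical claim, that an abstract concept is encoded by a density operator which doubles as the covariance operator of a classical Gaussian random signal, can be illustrated in a finite-dimensional toy. The 2x2 matrix below is an invented "density operator" (symmetric, positive definite, unit trace); the sketch draws Gaussian realizations and checks that their sample covariance recovers it.

```python
import random

def cholesky(a):
    """Lower-triangular L with L * L^T == a (a symmetric positive definite)."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = ((a[i][i] - s) ** 0.5 if i == j
                       else (a[i][j] - s) / L[j][j])
    return L

def gaussian_realizations(rho, n_samples, seed=0):
    """Zero-mean Gaussian vectors whose covariance matrix is rho."""
    rng = random.Random(seed)
    L, n = cholesky(rho), len(rho)
    out = []
    for _ in range(n_samples):
        z = [rng.gauss(0.0, 1.0) for _ in range(n)]
        out.append([sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)])
    return out

# An invented 2x2 "density operator": symmetric, positive definite, trace 1.
rho = [[0.7, 0.3],
       [0.3, 0.3]]
xs = gaussian_realizations(rho, 20000)
n = len(xs)
cov00 = sum(v[0] * v[0] for v in xs) / n
cov01 = sum(v[0] * v[1] for v in xs) / n     # sample covariance entries
```

    Each realization plays the role of one "concrete image"; the ensemble statistics, not any single realization, carry the abstract concept.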

  7. Application of a model of social information processing to nursing theory: how nurses respond to patients.

    Science.gov (United States)

    Sheldon, Lisa Kennedy; Ellington, Lee

    2008-11-01

    This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of the process of how nurses respond to patients and help develop nursing theories further. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions with implications for nursing care and patient outcomes.

  8. ADHD performance reflects inefficient but not impulsive information processing: a diffusion model analysis.

    Science.gov (United States)

    Metin, Baris; Roeyers, Herbert; Wiersema, Jan R; van der Meere, Jaap J; Thompson, Margaret; Sonuga-Barke, Edmund

    2013-03-01

    Attention-deficit/hyperactivity disorder (ADHD) is associated with performance deficits across a broad range of tasks. Although individual tasks are designed to tap specific cognitive functions (e.g., memory, inhibition, planning, etc.), these deficits could also reflect general effects related to either inefficient or impulsive information processing or both. These two components cannot be isolated from each other on the basis of classical analysis in which mean reaction time (RT) and mean accuracy are handled separately. Seventy children with a diagnosis of combined type ADHD and 50 healthy controls (between 6 and 17 years) performed two tasks: a simple two-choice RT (2-CRT) task and a conflict control task (CCT) that required higher levels of executive control. RT and errors were analyzed using the Ratcliff diffusion model, which divides decisional time into separate estimates of information processing efficiency (called "drift rate") and speed-accuracy tradeoff (SATO, called "boundary"). The model also provides an estimate of general nondecisional time. Results were the same for both tasks independent of executive load. ADHD was associated with lower drift rate and less nondecisional time. The groups did not differ in terms of boundary parameter estimates. RT and accuracy performance in ADHD appears to reflect inefficient rather than impulsive information processing, an effect independent of executive function load. The results are consistent with models in which basic information processing deficits make an important contribution to the ADHD cognitive phenotype. PsycINFO Database Record (c) 2013 APA, all rights reserved.
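
    A minimal simulation shows how the diffusion model separates the two components the abstract describes: drift rate (processing efficiency) moves accuracy and speed together, while boundary separation implements the speed-accuracy tradeoff. This is a simplified random-walk sketch, not Ratcliff's full fitting procedure, and all parameter values are invented.

```python
import random

def ddm_trial(drift, boundary, rng, dt=0.001, sigma=1.0, ndt=0.3):
    """One two-boundary diffusion trial.

    Evidence starts midway between 0 and `boundary` and accumulates with
    drift plus Gaussian noise; returns (correct, reaction_time), where
    `ndt` is the nondecisional time added to the decision time.
    """
    x, t = boundary / 2.0, 0.0
    while 0.0 < x < boundary:
        x += drift * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return x >= boundary, t + ndt

def summarize(drift, boundary, n=1000, seed=42):
    rng = random.Random(seed)
    trials = [ddm_trial(drift, boundary, rng) for _ in range(n)]
    accuracy = sum(c for c, _ in trials) / n
    mean_rt = sum(t for _, t in trials) / n
    return accuracy, mean_rt

acc_hi, rt_hi = summarize(drift=2.0, boundary=1.0)      # efficient processing
acc_lo, rt_lo = summarize(drift=0.5, boundary=1.0)      # low drift rate
acc_wide, rt_wide = summarize(drift=0.5, boundary=2.0)  # cautious boundary
```

    Lowering drift hurts accuracy without buying speed (inefficiency); widening the boundary buys accuracy at the cost of time (the SATO). The study's finding is that the ADHD group differed in drift and nondecisional time, not in boundary.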

  9. Information management in process planning

    NARCIS (Netherlands)

    Lutters, Diederick; Wijnker, T.C.; Kals, H.J.J.

    1999-01-01

    A recently proposed reference model indicates the use of structured information as the basis for the control of design and manufacturing processes. The model is used as a basis to describe the integration of design and process planning. A differentiation is made between macro- and micro process

  10. Comprehensive process model of clinical information interaction in primary care: results of a "best-fit" framework synthesis.

    Science.gov (United States)

    Veinot, Tiffany C; Senteio, Charles R; Hanauer, David; Lowery, Julie C

    2018-06-01

    To describe a new, comprehensive process model of clinical information interaction in primary care (Clinical Information Interaction Model, or CIIM) based on a systematic synthesis of published research. We used the "best fit" framework synthesis approach. Searches were performed in PubMed, Embase, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, Library and Information Science Abstracts, Library, Information Science and Technology Abstracts, and Engineering Village. Two authors reviewed articles according to inclusion and exclusion criteria. Data abstraction and content analysis of 443 published papers were used to create a model in which every element was supported by empirical research. The CIIM documents how primary care clinicians interact with information as they make point-of-care clinical decisions. The model highlights 3 major process components: (1) context, (2) activity (usual and contingent), and (3) influence. Usual activities include information processing, source-user interaction, information evaluation, selection of information, information use, clinical reasoning, and clinical decisions. Clinician characteristics, patient behaviors, and other professionals influence the process. The CIIM depicts the complete process of information interaction, enabling a grasp of relationships previously difficult to discern. The CIIM suggests potentially helpful functionality for clinical decision support systems (CDSSs) to support primary care, including a greater focus on information processing and use. The CIIM also documents the role of influence in clinical information interaction; influencers may affect the success of CDSS implementations. The CIIM offers a new framework for achieving CDSS workflow integration and new directions for CDSS design that can support the work of diverse primary care clinicians.

  11. Studies on Manfred Eigen's model for the self-organization of information processing.

    Science.gov (United States)

    Ebeling, W; Feistel, R

    2018-05-01

    In 1971, Manfred Eigen extended the principles of Darwinian evolution to chemical processes, from catalytic networks to the emergence of information processing at the molecular level, leading to the emergence of life. In this paper, we investigate some very general characteristics of this scenario, such as the valuation process of phenotypic traits in a high-dimensional fitness landscape, the effect of spatial compartmentation on the valuation, and the self-organized transition from structural to symbolic genetic information of replicating chain molecules. In the first part, we perform an analysis of typical dynamical properties of continuous dynamical models of evolutionary processes. In particular, we study the mapping of genotype to continuous phenotype spaces following the ideas of Wright and Conrad. We investigate typical features of a Schrödinger-like dynamics, the consequences of the high dimensionality, the leading role of saddle points, and Conrad's extra-dimensional bypass. In the last part, we discuss in brief the valuation of compartment models and the self-organized emergence of molecular symbols at the beginning of life.
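
    The replication-mutation scenario under study is Eigen's quasispecies dynamics, dx_i/dt = sum_j Q_ij f_j x_j - phi x_i. A discrete-time sketch with an invented three-type fitness landscape shows the characteristic outcome: a mutant cloud concentrated around the fittest "master" sequence.

```python
def quasispecies_step(x, fitness, q):
    """One discrete-time step of replication-mutation-selection dynamics.

    x: current frequencies of the sequence types; fitness: replication
    rates; q[i][j]: probability that a copy of type j is produced as
    type i. Renormalization plays the role of the dilution flux phi.
    """
    n = len(x)
    grown = [fitness[j] * x[j] for j in range(n)]
    new = [sum(q[i][j] * grown[j] for j in range(n)) for i in range(n)]
    total = sum(new)
    return [v / total for v in new]

# Three types: a fit "master" sequence and two neutral mutants
# (fitness values and mutation rate are invented for illustration).
fitness = [10.0, 1.0, 1.0]
mu = 0.05                                    # per-copy mutation probability
q = [[1 - mu, mu / 2, mu / 2],
     [mu / 2, 1 - mu, mu / 2],
     [mu / 2, mu / 2, 1 - mu]]

x = [1 / 3, 1 / 3, 1 / 3]
for _ in range(200):
    x = quasispecies_step(x, fitness, q)
# x is now the quasispecies: concentrated on, but not identical to,
# the master sequence, with a symmetric mutant tail
```

    Raising mu toward the error threshold would delocalize this distribution, which is the sense in which mutation rate bounds how much genetic information replication can maintain.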

  12. Testing the Causal Mediation Component of Dodge's Social Information Processing Model of Social Competence and Depression

    Science.gov (United States)

    Possel, Patrick; Seemann, Simone; Ahrens, Stefanie; Hautzinger, Martin

    2006-01-01

    In Dodge's model of "social information processing" depression is the result of a linear sequence of five stages of information processing ("Annu Rev Psychol" 44: 559-584, 1993). These stages follow a person's reaction to situational stimuli, such that each stage of information processing mediates the relationship between earlier and later stages.…

  13. Extending the 4I Organizational Learning Model: Information Sources, Foraging Processes and Tools

    Directory of Open Access Journals (Sweden)

    Tracy A. Jenkin

    2013-08-01

    The continued importance of organizational learning has recently led to several calls for further developing the theory. This article addresses these calls by extending Crossan, Lane and White's (1999) 4I model to include a fifth process, information foraging, and a fourth level, the tool. The resulting 5I organizational learning model can be generalized to a number of learning contexts, especially those that involve understanding and making sense of data and information. Given the need for organizations to both innovate and increase productivity, and the volumes of data and information that are available to support both, the 5I model addresses an important organizational issue.

  14. Process-aware information systems : lessons to be learned from process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Jensen, K.; Aalst, van der W.M.P.

    2009-01-01

    A Process-Aware Information System (PAIS) is a software system that manages and executes operational processes involving people, applications, and/or information sources on the basis of process models. Example PAISs are workflow management systems, case-handling systems, enterprise information

  15. Information processing among high-performance managers

    Directory of Open Access Journals (Sweden)

    S.C. Garcia-Santos

    2010-01-01

    The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework considers three models: Henry Mintzberg's Theory of Managerial Roles, the Theory of Information Processing, and John Exner's Rorschach response process model. The participants were evaluated with the Rorschach method. The results show that these managers are able to collect data, evaluate them, and establish rankings properly. At the same time, they are capable of being objective and accurate in assessing problems. This information processing style permits an interpretation of the surrounding world on the basis of a highly personal and characteristic way of processing, or cognitive style.

  16. Neurobiological correlates of cognitions in fear and anxiety: a cognitive-neurobiological information-processing model.

    Science.gov (United States)

    Hofmann, Stefan G; Ellard, Kristen K; Siegle, Greg J

    2012-01-01

    We review likely neurobiological substrates of cognitions related to fear and anxiety. Cognitive processes are linked to abnormal early activity reflecting hypervigilance in subcortical networks involving the amygdala, hippocampus, and insular cortex, and later recruitment of cortical regulatory resources, including activation of the anterior cingulate cortex and prefrontal cortex to implement avoidant response strategies. Based on this evidence, we present a cognitive-neurobiological information-processing model of fear and anxiety, linking distinct brain structures to specific stages of information processing of perceived threat.

  17. Applying Catastrophe Theory to an Information-Processing Model of Problem Solving in Science Education

    Science.gov (United States)

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2012-01-01

    In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…

  18. Toward a Model of Human Information Processing for Decision-Making and Skill Acquisition in Laparoscopic Colorectal Surgery.

    Science.gov (United States)

    White, Eoin J; McMahon, Muireann; Walsh, Michael T; Coffey, J Calvin; O Sullivan, Leonard

To create a human information-processing model for laparoscopic surgery based on already established literature and primary research, in order to enhance laparoscopic surgical education in this context. We reviewed the literature for information-processing models most relevant to laparoscopic surgery. Our review highlighted the necessity for a model that accounts for dynamic environments, perception, allocation of attention resources between the actions of both hands of an operator, and skill acquisition and retention. The results of the literature review were augmented through intraoperative observations of 7 colorectal surgical procedures, supported by laparoscopic video analysis of 12 colorectal procedures. The Wickens human information-processing model was selected as the most relevant theoretical model, to which we make adaptations for this specific application. We expanded the perception subsystem of the model to involve all aspects of perception during laparoscopic surgery. We extended the decision-making system to include dynamic decision-making to account for case/patient-specific and surgeon-specific deviations. The response subsystem now includes dual-task performance and nontechnical skills, such as intraoperative communication. The memory subsystem is expanded to include skill acquisition and retention. Surgical decision-making during laparoscopic surgery is the result of a highly complex series of processes influenced not only by the operator's knowledge, but also by patient anatomy and interaction with the surgical team. Newer developments in simulation-based education must focus on the theoretically supported elements and events that underpin skill acquisition and affect the cognitive abilities of novice surgeons. The proposed human information-processing model builds on established literature regarding information processing, accounting for the dynamic environment of laparoscopic surgery.
This revised model may be used as a foundation for a model describing robotic

  19. Is there room for 'development' in developmental models of information processing biases to threat in children and adolescents?

    Science.gov (United States)

    Field, Andy P; Lester, Kathryn J

    2010-12-01

    Clinical and experimental theories assume that processing biases in attention and interpretation are a causal mechanism through which anxiety develops. Despite growing evidence that these processing biases are present in children and, therefore, develop long before adulthood, these theories ignore the potential role of child development. This review attempts to place information processing biases within a theoretical developmental framework. We consider whether child development has no impact on information processing biases to threat (integral bias model), or whether child development influences information processing biases and if so whether it does so by moderating the expression of an existing bias (moderation model) or by affecting the acquisition of a bias (acquisition model). We examine the extent to which these models fit with existing theory and research evidence and outline some methodological issues that need to be considered when drawing conclusions about the potential role of child development in the information processing of threat stimuli. Finally, we speculate about the developmental processes that might be important to consider in future research.

  20. Cognitive Theory within the Framework of an Information Processing Model and Learning Hierarchy: Viable Alternative to the Bloom-Mager System.

    Science.gov (United States)

    Stahl, Robert J.

    This review of the current status of the human information processing model presents the Stahl Perceptual Information Processing and Operations Model (SPInPrOM) as a model of how thinking, memory, and the processing of information take place within the individual learner. A related system, the Domain of Cognition, is presented as an alternative to…

  1. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    Science.gov (United States)

    Lyu, Zhe; Whitman, William B

    2017-01-01

Current evolutionary models suggest that Eukaryotes originated from within Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be of one kind and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies on the complex information processing system. Therefore, Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  2. Dendritic excitability modulates dendritic information processing in a purkinje cell model.

    Science.gov (United States)

    Coop, Allan D; Cornelis, Hugo; Santamaria, Fidel

    2010-01-01

    Using an electrophysiological compartmental model of a Purkinje cell we quantified the contribution of individual active dendritic currents to processing of synaptic activity from granule cells. We used mutual information as a measure to quantify the information from the total excitatory input current (I(Glu)) encoded in each dendritic current. In this context, each active current was considered an information channel. Our analyses showed that most of the information was encoded by the calcium (I(CaP)) and calcium activated potassium (I(Kc)) currents. Mutual information between I(Glu) and I(CaP) and I(Kc) was sensitive to different levels of excitatory and inhibitory synaptic activity that, at the same time, resulted in the same firing rate at the soma. Since dendritic excitability could be a mechanism to regulate information processing in neurons we quantified the changes in mutual information between I(Glu) and all Purkinje cell currents as a function of the density of dendritic Ca (g(CaP)) and Kca (g(Kc)) conductances. We extended our analysis to determine the window of temporal integration of I(Glu) by I(CaP) and I(Kc) as a function of channel density and synaptic activity. The window of information integration has a stronger dependence on increasing values of g(Kc) than on g(CaP), but at high levels of synaptic stimulation information integration is reduced to a few milliseconds. Overall, our results show that different dendritic conductances differentially encode synaptic activity and that dendritic excitability and the level of synaptic activity regulate the flow of information in dendrites.
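The mutual-information analysis above treats each dendritic current as an information channel. A generic plug-in estimator of mutual information between two discretized signals can be sketched as follows; the binary "stimulus" and "currents" below are invented toy data, not output of the compartmental Purkinje cell model:

```python
from collections import Counter
from math import log2
import random

def mutual_information(xs, ys):
    """Plug-in estimate (in bits) of I(X;Y) for two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy check: a "current" that perfectly encodes a binary stimulus carries
# about 1 bit; an unrelated one carries about 0 bits.
random.seed(1)
stim = [random.randint(0, 1) for _ in range(5000)]
copied = stim[:]                                  # perfectly encodes input
noise = [random.randint(0, 1) for _ in range(5000)]  # unrelated signal
print(mutual_information(stim, copied))
print(mutual_information(stim, noise))
```

In the study, the role of `xs` is played by the binned excitatory input current and `ys` by each binned dendritic current.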

  3. Using stochastic language models (SLM) to map lexical, syntactic, and phonological information processing in the brain.

    Science.gov (United States)

    Lopopolo, Alessandro; Frank, Stefan L; van den Bosch, Antal; Willems, Roel M

    2017-01-01

    Language comprehension involves the simultaneous processing of information at the phonological, syntactic, and lexical level. We track these three distinct streams of information in the brain by using stochastic measures derived from computational language models to detect neural correlates of phoneme, part-of-speech, and word processing in an fMRI experiment. Probabilistic language models have proven to be useful tools for studying how language is processed as a sequence of symbols unfolding in time. Conditional probabilities between sequences of words are at the basis of probabilistic measures such as surprisal and perplexity which have been successfully used as predictors of several behavioural and neural correlates of sentence processing. Here we computed perplexity from sequences of words and their parts of speech, and their phonemic transcriptions. Brain activity time-locked to each word is regressed on the three model-derived measures. We observe that the brain keeps track of the statistical structure of lexical, syntactic and phonological information in distinct areas.
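Surprisal and perplexity, the model-derived measures used above, reduce to simple functions of conditional word probabilities. A toy bigram sketch (words and probabilities invented, unrelated to the study's stochastic language models):

```python
import math

# Toy bigram model: P(word | previous word). Probabilities are illustrative.
bigram = {
    ("the", "dog"): 0.2,
    ("dog", "barks"): 0.5,
    ("barks", "."): 0.9,
}

def surprisal(prev, word):
    """Surprisal in bits: -log2 P(word | prev)."""
    return -math.log2(bigram[(prev, word)])

def perplexity(words):
    """Perplexity = 2 ** (mean surprisal per transition)."""
    transitions = list(zip(words, words[1:]))
    mean = sum(surprisal(p, w) for p, w in transitions) / len(transitions)
    return 2 ** mean

sentence = ["the", "dog", "barks", "."]
print([round(surprisal(p, w), 3) for p, w in zip(sentence, sentence[1:])])
print(round(perplexity(sentence), 3))
```

The same definitions apply when the symbols are parts of speech or phonemes instead of words, which is how the study derives its three distinct regressors.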

  4. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information, and automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer, to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  5. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus just on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research on BP meta-models is needed to reflect the evolution/change in BP. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAM). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper intends to present a meta-model which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  6. BIM. Building Information Model. Special issue

    Energy Technology Data Exchange (ETDEWEB)

    Van Gelder, A.L.A. [Arta and Consultancy, Lage Zwaluwe (Netherlands); Van den Eijnden, P.A.A. [Stichting Marktwerking Installatietechniek, Zoetermeer (Netherlands); Veerman, J.; Mackaij, J.; Borst, E. [Royal Haskoning DHV, Nijmegen (Netherlands); Kruijsse, P.M.D. [Wolter en Dros, Amersfoort (Netherlands); Buma, W. [Merlijn Media, Waddinxveen (Netherlands); Bomhof, F.; Willems, P.H.; Boehms, M. [TNO, Delft (Netherlands); Hofman, M.; Verkerk, M. [ISSO, Rotterdam (Netherlands); Bodeving, M. [VIAC Installatie Adviseurs, Houten (Netherlands); Van Ravenswaaij, J.; Van Hoven, H. [BAM Techniek, Bunnik (Netherlands); Boeije, I.; Schalk, E. [Stabiplan, Bodegraven (Netherlands)

    2012-11-15

A series of 14 articles illustrates the various aspects of the Building Information Model (BIM). The essence of BIM is to capture information about the building process and the building product.

  7. Single-process versus multiple-strategy models of decision making: evidence from an information intrusion paradigm.

    Science.gov (United States)

    Söllner, Anke; Bröder, Arndt; Glöckner, Andreas; Betsch, Tilmann

    2014-02-01

    When decision makers are confronted with different problems and situations, do they use a uniform mechanism as assumed by single-process models (SPMs) or do they choose adaptively from a set of available decision strategies as multiple-strategy models (MSMs) imply? Both frameworks of decision making have gathered a lot of support, but only rarely have they been contrasted with each other. Employing an information intrusion paradigm for multi-attribute decisions from givens, SPM and MSM predictions on information search, decision outcomes, attention, and confidence judgments were derived and tested against each other in two experiments. The results consistently support the SPM view: Participants seemingly using a "take-the-best" (TTB) strategy do not ignore TTB-irrelevant information as MSMs would predict, but adapt the amount of information searched, choose alternative choice options, and show varying confidence judgments contingent on the quality of the "irrelevant" information. The uniformity of these findings underlines the adequacy of the novel information intrusion paradigm and comprehensively promotes the notion of a uniform decision making mechanism as assumed by single-process models. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Efficiency of cellular information processing

    International Nuclear Information System (INIS)

    Barato, Andre C; Hartich, David; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial. (paper)
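A compact way to state the central result (the notation here is an assumed paraphrase, not quoted from the paper): the learning rate, i.e. the rate at which the internal process Y reduces its conditional Shannon entropy about the external process X, is bounded by the thermodynamic entropy production, which motivates an informational efficiency no greater than one:

```latex
l \;\equiv\; -\frac{\mathrm{d}}{\mathrm{d}t}\, H(X \mid Y) \;\le\; \sigma,
\qquad
\eta \;\equiv\; \frac{l}{\sigma} \;\le\; 1 .
```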

  9. Models of neural dynamics in brain information processing - the developments of 'the decade'

    International Nuclear Information System (INIS)

    Borisyuk, G N; Borisyuk, R M; Kazanovich, Yakov B; Ivanitskii, Genrikh R

    2002-01-01

    Neural network models are discussed that have been developed during the last decade with the purpose of reproducing spatio-temporal patterns of neural activity in different brain structures. The main goal of the modeling was to test hypotheses of synchronization, temporal and phase relations in brain information processing. The models being considered are those of temporal structure of spike sequences, of neural activity dynamics, and oscillatory models of attention and feature integration. (reviews of topical problems)

  10. Risk Information Seeking among U.S. and Dutch Residents. An Application of the model of Risk Information Seeking and Processing

    NARCIS (Netherlands)

    ter Huurne, E.F.J.; Griffin, Robert J.; Gutteling, Jan M.

    2009-01-01

    The model of risk information seeking and processing (RISP) proposes characteristics of individuals that might predispose them to seek risk information. The intent of this study is to test the model’s robustness across two independent samples in different nations. Based on data from the United

  11. Information Processing: A Review of Implications of Johnstone's Model for Science Education

    Science.gov (United States)

    St Clair-Thompson, Helen; Overton, Tina; Botton, Chris

    2010-01-01

    The current review is concerned with an information processing model used in science education. The purpose is to summarise the current theoretical understanding, in published research, of a number of factors that are known to influence learning and achievement. These include field independence, working memory, long-term memory, and the use of…

  12. Sustainable Manufacturing via Multi-Scale, Physics-Based Process Modeling and Manufacturing-Informed Design

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-04-01

    This factsheet describes a project that developed and demonstrated a new manufacturing-informed design framework that utilizes advanced multi-scale, physics-based process modeling to dramatically improve manufacturing productivity and quality in machining operations while reducing the cost of machined components.

  13. Model-free stochastic processes studied with q-wavelet-based informational tools

    International Nuclear Information System (INIS)

    Perez, D.G.; Zunino, L.; Martin, M.T.; Garavaglia, M.; Plastino, A.; Rosso, O.A.

    2007-01-01

We undertake a model-free investigation of stochastic processes employing q-wavelet based quantifiers, which constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in such a way, (ii) for special q values the quantifiers are more sensitive than the Shannon ones, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework.

  14. The IPOO-model of creative learning and the students' information processing characteristics

    Directory of Open Access Journals (Sweden)

    Katalin Mező

    2015-03-01

Full Text Available The present study was designed to examine secondary school students' information processing characteristics during learning and their relationship with the students' academic averages, internal motivation for learning, and cognitive abilities such as intelligence and creativity. Although many studies have previously focused on this issue, we are now studying this topic from the perspective of the IPOO-model, which is a new theoretical approach to school learning (note: IPOO is an acronym of Input, Process, Output, Organizing). This study featured 815 participants (secondary school students) who completed the following tests and questionnaires: Raven's Advanced Progressive Matrices (APM) intelligence test, the "Unusual Uses" creativity test (UUT), the 2nd version of the Jupiterbolha-próba (Jupiter Flea test, JB2) to test the information processing method of learning, and the Learning Attitude Questionnaire (LAQ). In our analysis we took the gender, school grade, and academic average of participants into account. According to our results, the quality of students' information-processing methods of learning is at a low level, and there are no significant strong correlational relationships among the tests and questionnaire results (except in the cases of fluency, originality, and flexibility). There were no significant differences between genders or classes. These findings are consistent with the findings of previous studies.

  15. Holledge gauge failure testing using concurrent information processing algorithm

    International Nuclear Information System (INIS)

    Weeks, G.E.; Daniel, W.E.; Edwards, R.E.; Jannarone, R.J.; Joshi, S.N.; Palakodety, S.S.; Qian, D.

    1996-01-01

For several decades, computerized information processing systems and human information processing models have developed with a good deal of mutual influence. Any comprehensive psychology text in this decade uses terms that originated in the computer industry, such as "cache" and "memory", to describe human information processing. Likewise, many engineers today are using "artificial intelligence" and "artificial neural network" computing tools that originated as models of human thought to solve industrial problems. This paper concerns a recently developed human information processing model, called "concurrent information processing" (CIP), and a related set of computing tools for solving industrial problems. The problem of focus is adaptive gauge monitoring; the application is pneumatic pressure repeaters (Holledge gauges) used to measure liquid level and density in the Defense Waste Processing Facility and the Integrated DWPF Melter System.

  16. Process-aware information systems : bridging people and software through process technology

    NARCIS (Netherlands)

    Dumas, M.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.

    2005-01-01

    A unifying foundation to design and implement process-aware information systems This publication takes on the formidable task of establishing a unifying foundation and set of common underlying principles to effectively model, design, and implement process-aware information systems. Authored by

  17. A language for information commerce processes

    NARCIS (Netherlands)

    Aberer, Karl; Wombacher, Andreas

Automating information commerce requires languages to represent the typical information commerce processes. Existing languages and standards cover either only very specific types of business models or are too general to capture in a concise way the specific properties of information commerce

  18. Modeling Deficits From Early Auditory Information Processing to Psychosocial Functioning in Schizophrenia.

    Science.gov (United States)

    Thomas, Michael L; Green, Michael F; Hellemann, Gerhard; Sugar, Catherine A; Tarasenko, Melissa; Calkins, Monica E; Greenwood, Tiffany A; Gur, Raquel E; Gur, Ruben C; Lazzeroni, Laura C; Nuechterlein, Keith H; Radant, Allen D; Seidman, Larry J; Shiluk, Alexandra L; Siever, Larry J; Silverman, Jeremy M; Sprock, Joyce; Stone, William S; Swerdlow, Neal R; Tsuang, Debby W; Tsuang, Ming T; Turetsky, Bruce I; Braff, David L; Light, Gregory A

    2017-01-01

Neurophysiologic measures of early auditory information processing (EAP) are used as endophenotypes in genomic studies and biomarkers in clinical intervention studies. Research in schizophrenia has established correlations among measures of EAP, cognition, clinical symptoms, and functional outcome. Clarifying these associations by determining the pathways through which deficits in EAP affect functioning would suggest when and where to therapeutically intervene. To characterize the pathways from EAP to outcome and to estimate the extent to which enhancement of basic information processing might improve cognition and psychosocial functioning in schizophrenia. Cross-sectional data were analyzed using structural equation modeling to examine the associations among EAP, cognition, negative symptoms, and functional outcome. Participants were recruited from the community at 5 geographically distributed laboratories as part of the Consortium on the Genetics of Schizophrenia 2 from July 1, 2010, through January 31, 2014. This well-characterized cohort of 1415 patients with schizophrenia underwent EAP, cognitive, and thorough clinical and functional assessment. Mismatch negativity, P3a, and reorienting negativity were used to measure EAP. Cognition was measured by the Letter Number Span test and scales from the California Verbal Learning Test-Second Edition, the Wechsler Memory Scale-Third Edition, and the Penn Computerized Neurocognitive Battery. Negative symptoms were measured by the Scale for the Assessment of Negative Symptoms. Functional outcome was measured by the Role Functioning Scale. Participants included 1415 unrelated outpatients diagnosed with schizophrenia or schizoaffective disorder (mean [SD] age, 46 [11] years; 979 males [69.2%] and 619 white [43.7%]). Early auditory information processing had a direct effect on cognition (β = 0.37), consistent with a model in which EAP deficits lead to poor functional outcome via impaired cognition and increased negative symptoms.

  19. The role of information in a lifetime process - a model of weight maintenance by women over long time periods

    Directory of Open Access Journals (Sweden)

    Judit Bar-Ilan

    2006-01-01

Full Text Available Introduction. This paper proposes a model of the information behaviour of women during their life-long struggle to maintain normal weight. Method. The model is integrative and contextual, built on existing models in information science and several other disciplines, on the life stories of about fifty Israeli women aged 25-55, and on interviews with professionals. Analysis. The life stories of the participating women were analyzed qualitatively; major themes and phases were identified. Results. Weight loss and/or maintenance behaviour is a lifetime process in which distinctive stages were identified. In most cases the weight gain - weight loss - maintenance cycle is a recurring one. Information is a major resource during the process; several roles of information were defined: enabling, motivating, reinforcing, providing background information related to weight problems, and creating the internal cognitive schema related to food and weight. Information behaviour and the roles of information vary with the different stages. Information needs are also influenced by the specific stage of the process. Information gathered in previous cycles is reused, and information gained through previous experience affects behaviour in the current cycle. Conclusion. The model has both theoretical and practical implications.

  20. The importance of information goods abstraction levels for information commerce process models

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.

    2002-01-01

    A process model, in the context of e-commerce, is an organized set of activities for the creation, (re-)production, trade and delivery of goods. Electronic commerce studies have created important process models for the trade of physical goods via Internet. These models are not easily suitable for

  1. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    Science.gov (United States)

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can be compared directly using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
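The deterministic take-the-best strategy contrasted here can be sketched in a few lines; the cue names, values, and validities are invented, and the paper's probabilistic TTB would additionally attach a rank-ordered error probability to each cue:

```python
# Deterministic take-the-best (TTB): inspect cues from most to least valid
# and decide on the first cue that discriminates between the two options.
def take_the_best(cues_a, cues_b, validities):
    """Return 'A', 'B', or 'guess' for two options described by binary cues."""
    for cue in sorted(validities, key=validities.get, reverse=True):
        if cues_a[cue] != cues_b[cue]:
            return "A" if cues_a[cue] > cues_b[cue] else "B"
    return "guess"  # no cue discriminates

# Invented cue validities and cue patterns.
validities = {"cue1": 0.9, "cue2": 0.8, "cue3": 0.7}
a = {"cue1": 1, "cue2": 0, "cue3": 0}
b = {"cue1": 1, "cue2": 1, "cue3": 0}
print(take_the_best(a, b, validities))  # cue1 ties, so cue2 decides for B
```

A weighted-additive strategy would instead sum validity-weighted cue values for each option, which is the integration behavior most participants showed.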

  2. Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence

    Directory of Open Access Journals (Sweden)

    Massimo Materassi

    2014-02-01

Full Text Available The use of transfer entropy has proven helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the information transfer direction. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer–Ohkitana–Yamada shell model. This is a well-known model able to reproduce fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so that applying the information-theory analysis to its outcome tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in the space of wavenumbers come out as expected, indicating the validity of this data analysis tool. In this context, a normalized version of transfer entropy, able to account for the difference in the intrinsic randomness of the interacting processes, appears to perform better, avoiding the wrong conclusions to which the "traditional" transfer entropy would lead.
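The (unnormalized) transfer entropy underlying this analysis can be sketched with a plug-in estimator for discrete sequences with history length 1; the one-step driving example is synthetic and unrelated to the shell-model data:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate (in bits) of T_{Y->X} with history length 1:
    sum over (x_next, x_prev, y_prev) of
    p(xn, xp, yp) * log2[ p(xn | xp, yp) / p(xn | xp) ]."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    te = 0.0
    for (xn, xp, yp), c in triples.items():
        p_joint = c / n
        p_xn_given_xp_yp = c / pairs_xy[(xp, yp)]
        p_xn_given_xp = pairs_xx[(xn, xp)] / singles[xp]
        te += p_joint * log2(p_xn_given_xp_yp / p_xn_given_xp)
    return te

# Synthetic driving: x copies y with a one-step lag, so information flows
# from Y to X but not the other way around.
random.seed(0)
y = [random.randint(0, 1) for _ in range(2000)]
x = [0] + y[:-1]
print(transfer_entropy(x, y))  # large: Y drives X
print(transfer_entropy(y, x))  # near zero: X does not drive Y
```

The paper's contribution is a different normalization of this quantity, intended to compensate for the interacting processes having different intrinsic randomness.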

  3. Metadata and their impact on processes in Building Information Modeling

    Directory of Open Access Journals (Sweden)

    Vladimir Nyvlt

    2014-04-01

    Full Text Available Building Information Modeling (BIM) holds great potential for increasing the effectiveness of every project across its whole life cycle: from the initial investment plan, through design and construction, to long-term use and property maintenance, and finally demolition. Knowledge Management, or better Knowledge Sharing, covers two sets of tools, managerial and technological. Managers' needs reflect the real expectations and desires of final users: how they could benefit from managing long-term projects covering the whole life cycle, in terms of saving investment money and other resources. The technology employed can help BIM processes support and deliver these benefits to users. We discuss how to use this technology for data and metadata collection, storage, and sharing, and which processes these new technologies may support. We also touch on a proposal for optimized processes to better and more smoothly support knowledge sharing over the project time-scale and across its whole life cycle.

  4. Scaling the Information Processing Demands of Occupations

    Science.gov (United States)

    Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin

    2011-01-01

    The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…

  5. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows the implementation of an efficient solving strategy and a concise error diagnosis. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it
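The equation-oriented strategy described above solves all balance and constitutive equations simultaneously, using analytic derivatives. A minimal sketch of that idea is Newton's method on a small nonlinear balance system with a hand-coded Jacobian; the system and the equilibrium constant below are invented for illustration and are not taken from the thesis.

```python
def newton(F, J, v, tol=1e-10, max_iter=50):
    """Newton's method for a 2-variable system F(v) = 0 with analytic Jacobian J."""
    for _ in range(max_iter):
        f = F(v)
        if max(abs(fi) for fi in f) < tol:
            return v
        # Solve the 2x2 linear system J(v) * dv = -f by Cramer's rule
        (a, b), (c, d) = J(v)
        det = a * d - b * c
        dv = ((-f[0] * d + f[1] * b) / det,
              (-f[1] * a + f[0] * c) / det)
        v = (v[0] + dv[0], v[1] + dv[1])
    raise RuntimeError("Newton did not converge")

K = 2.0  # illustrative equilibrium constant
F = lambda v: (v[0] + v[1] - 1.0,        # linear mass balance
               v[1] - K * v[0] ** 2)     # nonlinear equilibrium relation
J = lambda v: ((1.0, 1.0),               # analytic Jacobian rows
               (-2.0 * K * v[0], 1.0))
x, y = newton(F, J, (0.2, 0.8))          # converges to x = y = 0.5
```

Supplying the Jacobian analytically, as the thesis does via symbolic computation, is what makes the second-order (quadratically convergent) update possible.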

  6. A Multidirectional Model for Assessing Learning Disabled Students' Intelligence: An Information-Processing Framework.

    Science.gov (United States)

    Swanson, H. Lee

    1982-01-01

    An information processing approach to the assessment of learning disabled students' intellectual performance is presented. The model is based on the assumption that intelligent behavior comprises a variety of problem-solving strategies. An account of child problem solving is explained and illustrated with a "thinking aloud" protocol.…

  7. Logical reasoning versus information processing in the dual-strategy model of reasoning.

    Science.gov (United States)

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2017-01-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies such as mental model theory and statistical strategies underlying probabilistic models. The dual-strategy model, proposed by Verschueren, Schaeken, & d'Ydewalle (2005a, 2005b), which suggests that people might have access to both kinds of strategy, has been supported by several recent studies. These have shown that statistical reasoners make inferences based on using information about premises in order to generate a likelihood estimate of conclusion probability. However, while results concerning counterexample reasoners are consistent with a counterexample detection model, these results could equally be interpreted as indicating a greater sensitivity to logical form. In order to distinguish these two interpretations, in Studies 1 and 2, we presented reasoners with Modus ponens (MP) inferences with statistical information about premise strength and in Studies 3 and 4, naturalistic MP inferences with premises having many disabling conditions. Statistical reasoners accepted the MP inference more often than counterexample reasoners in Studies 1 and 2, while the opposite pattern was observed in Studies 3 and 4. Results show that these strategies must be defined in terms of information processing, with no clear relations to "logical" reasoning. These results have additional implications for the underlying debate about the nature of human reasoning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Influence of information on behavioral effects in decision processes

    Directory of Open Access Journals (Sweden)

    Angelarosa Longo

    2015-07-01

    Full Text Available Rational models of decision processes are marked by many anomalies caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single- or multi-agent decision process, each mental model is influenced by the presence or absence of information, or by false information, about the problem or about other members of the decision-making group. The difficulty of modeling these effects increases because behavioral biases also influence the modeler. Behavioral Operational Research (BOR) studies these influences to create efficient models for defining choices in similar decision processes.

  9. INFORMATION MODEL OF SOCIAL TRANSFORMATIONS

    Directory of Open Access Journals (Sweden)

    Мария Васильевна Комова

    2013-09-01

    Full Text Available The social transformation is considered as a process of qualitative changes in society, creating a new level of organization in all areas of life, in different social formations and societies of different types of development. The purpose of the study is to create a universal model for studying social transformations based on their understanding as the consequence of information exchange processes in society. After defining the conceptual model of the study, the author uses the following methods: the descriptive method, analysis, synthesis, and comparison. Information, objectively existing in all elements and systems of the material world, is an integral attribute of societal transformation as well. The information model of social transformations is based on the definition of societal transformation as the change in the information that functions in the society's information space. The study of social transformations is the study of information flows circulating in society and characterized by different spatial, temporal, and structural states. Social transformations are a highly integrated system of social processes and phenomena, the nature, course and consequences of which are affected by factors representing the whole complex of material objects. The integrated information model of social transformations involves the interaction of the following components: social memory, information space, and the social ideal. To determine the dynamics and intensity of social transformations the author uses the notions of "information threshold of social transformations" and "information pressure". Thus, the universal nature of information leads to considering social transformations as a system of information exchange processes. Social transformations can be extended to any episteme actualized by social needs. The establishment of an information threshold allows one to simulate the course of social development, to predict the

  10. Information Processing of Trauma.

    Science.gov (United States)

    Hartman, Carol R.; Burgess, Ann W.

    1993-01-01

    This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

  11. Modeling and Security Threat Assessments of Data Processed in Cloud Based Information Systems

    Directory of Open Access Journals (Sweden)

    Darya Sergeevna Simonenkova

    2016-03-01

    Full Text Available The subject of the research is the modeling and security threat assessment of data processed in cloud based information systems (CBIS). This method allows one to determine the current security threats to a CBIS, the states of the system in which vulnerabilities exist, the level of possible violators, and security properties, and to generate recommendations for neutralizing security threats to the CBIS.

  12. Hierarchical process memory: memory as an integral component of information processing

    Science.gov (United States)

    Hasson, Uri; Chen, Janice; Honey, Christopher J.

    2015-01-01

    Models of working memory commonly focus on how information is encoded into and retrieved from storage at specific moments. However, in the majority of real-life processes, past information is used continuously to process incoming information across multiple timescales. Considering single-unit, electrocorticography, and functional imaging data, we argue that (i) virtually all cortical circuits can accumulate information over time, and (ii) the timescales of accumulation vary hierarchically, from early sensory areas with short processing timescales (tens to hundreds of milliseconds) to higher-order areas with long processing timescales (many seconds to minutes). In this hierarchical systems perspective, memory is not restricted to a few localized stores, but is intrinsic to information processing that unfolds throughout the brain on multiple timescales. “The present contains nothing more than the past, and what is found in the effect was already in the cause.” (Henri L. Bergson) PMID:25980649
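The idea of hierarchically varying accumulation timescales can be sketched with leaky integrators whose time constants differ: units with a long time constant retain input from further in the past. This is a toy illustration of the argument, not a model from the paper.

```python
def leaky_accumulate(inputs, tau):
    """Exponentially weighted accumulation: s <- s + (x - s)/tau."""
    s, trace = 0.0, []
    for x in inputs:
        s += (x - s) / tau
        trace.append(s)
    return trace

pulse = [1.0] * 10 + [0.0] * 90          # brief input, then silence
fast = leaky_accumulate(pulse, tau=2)     # "early sensory": fast decay
medium = leaky_accumulate(pulse, tau=10)
slow = leaky_accumulate(pulse, tau=50)    # "higher-order": slow decay
# Long after the pulse ends, only the slow unit still carries the past input
print(fast[-1] < medium[-1] < slow[-1])   # prints True
```

The hierarchy emerges from a single parameter: the same update rule with a larger tau integrates over, and therefore "remembers", a longer stretch of the input stream.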

  13. PROCESSING THE INFORMATION CONTENT ON THE BASIS OF FUZZY NEURAL MODEL OF DECISION MAKING

    Directory of Open Access Journals (Sweden)

    Nina V. Komleva

    2013-01-01

    Full Text Available The article is devoted to the issues of mathematical modeling of the decision-making process for information content processing based on the TSK fuzzy neural network. An integral rating assessment of the content, which is necessary for making a decision about its further usage, is made dependent on varying characteristics. A mechanism for building an individual trajectory and forming individual competences is provided to enable intelligent content search.

  14. Formalize clinical processes into electronic health information systems: Modelling a screening service for diabetic retinopathy.

    Science.gov (United States)

    Eguzkiza, Aitor; Trigo, Jesús Daniel; Martínez-Espronceda, Miguel; Serrano, Luis; Andonegui, José

    2015-08-01

    Most healthcare services use information and communication technologies to reduce and redistribute the workload associated with follow-up of chronic conditions. However, the lack of normalization of the information handled in and exchanged between such services hinders their scalability and extendibility. The use of medical standards for modelling and exchanging information, especially dual-model based approaches, can enhance the features of screening services. Hence, the approach of this paper is twofold. First, this article presents a generic methodology to model patient-centered clinical processes. Second, a proof of concept of the proposed methodology was conducted within the diabetic retinopathy (DR) screening service of the Health Service of Navarre (Spain) in compliance with a specific dual-model norm (openEHR). As a result, a set of elements required for deploying a model-driven DR screening service has been established, namely: clinical concepts, archetypes, termsets, templates, guideline definition rules, and user interface definitions. This model fosters reusability, because those elements are available to be downloaded and integrated in any healthcare service, and interoperability, since from then on such services can share information seamlessly. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. The role of categorization and scale endpoint comparisons in numerical information processing: A two-process model.

    Science.gov (United States)

    Tao, Tao; Wyer, Robert S; Zheng, Yuhuang

    2017-03-01

    We propose a two-process conceptualization of numerical information processing to describe how people form impressions of a score that is described along a bounded scale. According to the model, people spontaneously categorize a score as high or low. Furthermore, they compare the numerical discrepancy between the score and the endpoint of the scale to which it is closer, if they are not confident of their categorization, and use implications of this comparison as a basis for judgment. As a result, their evaluation of the score is less extreme when the range of numbers along the scale is large (e.g., from 0 to 100) than when it is small (from 0 to 10). Six experiments support this two-process model and demonstrate its generalizability. Specifically, the magnitude of numbers composing the scale has less impact on judgments (a) when the score being evaluated is extreme, (b) when individuals are unmotivated to engage in endpoint comparison processes (i.e., they are low in need for cognition), and (c) when they are unable to do so (i.e., they are under cognitive load). Moreover, the endpoint to which individuals compare the score can depend on their regulatory focus. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Organizational restructuring in response to changes in information-processing technology

    OpenAIRE

    Andrzej Baniak; Jacek Cukrowski

    1999-01-01

    This paper examines the effects of changes in information-processing technology on the efficient organizational forms of data-processing in decision-making systems. Data-processing is modelled in the framework of the dynamic parallel processing model of associative computation with endogenous set-up costs of the processors. In such a model, the conditions for efficient organization of information-processing are defined and the architecture of the efficient structures is considered. It is s...

  17. Implications of Building Information Modeling on Interior Design Education: The Impact on Teaching Design Processes

    Directory of Open Access Journals (Sweden)

    Amy Roehl, MFA

    2013-06-01

    Full Text Available Currently, major shifts are occurring in design processes, affecting business practices for industries involved in designing and delivering the built environment. These changing conditions are a direct result of industry adoption of relatively new technologies called BIM, or Building Information Modeling. This review of literature examines the implications of these changing processes for interior design education.

  18. Influence of information on behavioral effects in decision processes

    OpenAIRE

    Angelarosa Longo; Viviana Ventre

    2015-01-01

    Rational models of decision processes are marked by many anomalies caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single- or multi-agent decision process, each mental model is influenced by the presence or absence of information, or by false information, about the problem or about other members of the decision-making group. The difficulty of modeling these effects increases because behavioral biases also influence the m...

  19. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    …the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension… …of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  20. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, as shown by means of an example.
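As the abstract notes, a geographic physical process written as a partial difference equation can be iterated directly. A minimal 1D diffusion sketch (illustrative only, not the paper's notation): each cell moves toward the average of its neighbours.

```python
def diffuse(u, alpha, steps):
    """Explicit finite-difference update for 1D diffusion:
    u[i] <- u[i] + alpha * (u[i-1] - 2*u[i] + u[i+1]),
    with fixed (Dirichlet) boundaries. Stable for alpha <= 0.5."""
    u = list(u)
    for _ in range(steps):
        u = [u[0]] + [
            u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u

heat = [0, 0, 0, 100, 0, 0, 0]            # initial concentration spike
print(diffuse(heat, alpha=0.25, steps=1))  # [0, 0.0, 25.0, 50.0, 25.0, 0.0, 0]
```

This is the graphical-difference-equation view the paper advocates: the update stencil (neighbour differences times a coefficient) is the whole process description.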

  1. Information-processing genes

    International Nuclear Information System (INIS)

    Tahir Shah, K.

    1995-01-01

    There are an estimated 100,000 genes in the human genome, of which 97% is non-coding. On the other hand, bacteria have little or no non-coding DNA. The non-coding region includes introns, ALU sequences, satellite DNA, and other segments not expressed as proteins. Why does it exist? Why has nature kept non-coding DNA through the long evolutionary period if it has no role in the development of complex life forms? Is the complexity of a species somehow correlated with the existence of apparently useless sequences? What kind of capability is encoded within such nucleotide sequences that is a necessary, but not a sufficient, condition for the evolution of complex life forms, keeping in mind the C-value paradox and the omnipresence of non-coding segments in higher eukaryotes and also in many archaea and prokaryotes? The physico-chemical description of biological processes is hardware oriented and does not highlight the algorithmic or information-processing aspect. However, an algorithm without its hardware implementation is as useless as hardware without the capability to run an algorithm. The nature and type of computation that information-processing hardware can perform depend only on its algorithm and the architecture that reflects the algorithm. Given that enormously difficult tasks such as high-fidelity replication, transcription, editing and regulation are all achieved within a long linear sequence, it is natural to think that some parts of a genome are involved in these tasks. If some complex algorithms are encoded within these parts, then it is natural to think that non-coding regions contain information-processing algorithms. A comparison between well-known automatic sequences and sequences constructed out of motifs found in all species proves the point: non-coding regions are a sort of "hardwired" program, i.e., they are linear representations of information-processing machines.
    Thus, in our model, a non-coding region, e.g., an intron, contains a program (or equivalently, it is
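Automatic sequences of the kind the comparison invokes are easy to generate. The classic example is the Thue–Morse sequence, which is 2-automatic: the n-th term is the parity of the number of 1-bits in the binary expansion of n. (This is a standard mathematical example, not a sequence from the record.)

```python
def thue_morse(n):
    """n-th Thue-Morse term: parity of the number of 1-bits in n."""
    return bin(n).count("1") % 2

seq = [thue_morse(n) for n in range(16)]
print(seq)  # [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```

The sequence is generated by a trivially small machine yet is aperiodic and highly structured, which is the sense in which a linear sequence can "hardwire" an information-processing device.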

  2. Information-based models for finance and insurance

    Science.gov (United States)

    Hoyle, Edward

    2010-10-01

    In financial markets, the information that traders have about an asset is reflected in its price. The arrival of new information then leads to price changes. The `information-based framework' of Brody, Hughston and Macrina (BHM) isolates the emergence of information, and examines its role as a driver of price dynamics. This approach has led to the development of new models that capture a broad range of price behaviour. This thesis extends the work of BHM by introducing a wider class of processes for the generation of the market filtration. In the BHM framework, each asset is associated with a collection of random cash flows. The asset price is the sum of the discounted expectations of the cash flows. Expectations are taken with respect to (i) an appropriate measure, and (ii) the filtration generated by a set of so-called information processes that carry noisy or imperfect market information about the cash flows. To model the flow of information, we introduce a class of processes termed Lévy random bridges (LRBs), generalising the Brownian and gamma information processes of BHM. Conditioned on its terminal value, an LRB is identical in law to a Lévy bridge. We consider in detail the case where the asset generates a single cash flow X_T at a fixed date T. The flow of information about X_T is modelled by an LRB with random terminal value X_T. An explicit expression for the price process is found by working out the discounted conditional expectation of X_T with respect to the natural filtration of the LRB. New models are constructed using information processes related to the Poisson process, the Cauchy process, the stable-1/2 subordinator, the variance-gamma process, and the normal inverse-Gaussian process. These are applied to the valuation of credit-risky bonds, vanilla and exotic options, and non-life insurance liabilities.
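In the Brownian special case of this framework, the information process is xi_t = sigma * t * X_T + beta_t, where beta is a Brownian bridge on [0, T], and the conditional expectation of a discrete cash flow X_T has a closed form. The sketch below implements our reading of that BHM pricing formula (weights proportional to the priors times an exponential tilt); the parameter values are illustrative.

```python
import math

def bhm_price(xi_t, t, T, sigma, outcomes, priors):
    """E[X_T | xi_t] for the Brownian information process
    xi_t = sigma*t*X_T + bridge_t (Brody-Hughston-Macrina form):
    posterior weight of outcome x is proportional to
    prior(x) * exp( T/(T-t) * (sigma*x*xi_t - 0.5*sigma^2*x^2*t) )."""
    w = [p * math.exp((T / (T - t)) *
                      (sigma * x * xi_t - 0.5 * sigma ** 2 * x ** 2 * t))
         for x, p in zip(outcomes, priors)]
    return sum(x * wi for x, wi in zip(outcomes, w)) / sum(w)

# Credit-risky bond paying 0 (default) or 1 (full payment), equal priors
outcomes, priors = [0.0, 1.0], [0.5, 0.5]
price = bhm_price(0.3, 0.5, 1.0, 1.0, outcomes, priors)
# posterior mean of the payoff given the noisy mid-life signal xi = 0.3
```

At t = 0 no information has arrived and the price equals the prior mean; as xi_t grows the posterior tilts toward the high outcome, so the price rises monotonically in the signal.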

  3. Models of neural dynamics in brain information processing - the developments of 'the decade'

    Energy Technology Data Exchange (ETDEWEB)

    Borisyuk, G N; Borisyuk, R M; Kazanovich, Yakov B [Institute of Mathematical Problems of Biology, Russian Academy of Sciences, Pushchino, Moscow region (Russian Federation); Ivanitskii, Genrikh R [Institute for Theoretical and Experimental Biophysics, Russian Academy of Sciences, Pushchino, Moscow region (Russian Federation)

    2002-10-31

    Neural network models are discussed that have been developed during the last decade with the purpose of reproducing spatio-temporal patterns of neural activity in different brain structures. The main goal of the modeling was to test hypotheses of synchronization, temporal and phase relations in brain information processing. The models being considered are those of temporal structure of spike sequences, of neural activity dynamics, and oscillatory models of attention and feature integration. (reviews of topical problems)

  4. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  5. Emotional voices in context: a neurobiological model of multimodal affective information processing.

    Science.gov (United States)

    Brück, Carolin; Kreifelts, Benjamin; Wildgruber, Dirk

    2011-12-01

    Just as eyes are often considered a gateway to the soul, the human voice offers a window through which we gain access to our fellow human beings' minds - their attitudes, intentions and feelings. Whether in talking or singing, crying or laughing, sighing or screaming, the sheer sound of a voice communicates a wealth of information that, in turn, may serve the observant listener as valuable guidepost in social interaction. But how do human beings extract information from the tone of a voice? In an attempt to answer this question, the present article reviews empirical evidence detailing the cerebral processes that underlie our ability to decode emotional information from vocal signals. The review will focus primarily on two prominent classes of vocal emotion cues: laughter and speech prosody (i.e. the tone of voice while speaking). Following a brief introduction, behavioral as well as neuroimaging data will be summarized that allows to outline cerebral mechanisms associated with the decoding of emotional voice cues, as well as the influence of various context variables (e.g. co-occurring facial and verbal emotional signals, attention focus, person-specific parameters such as gender and personality) on the respective processes. Building on the presented evidence, a cerebral network model will be introduced that proposes a differential contribution of various cortical and subcortical brain structures to the processing of emotional voice signals both in isolation and in context of accompanying (facial and verbal) emotional cues. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Although formal and informal quality aspects are of significant importance to business process modeling, little empirical work has been reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for quality of process models and focus

  8. Temporal Expectation and Information Processing: A Model-Based Analysis

    Science.gov (United States)

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  9. Attachment in Middle Childhood: Associations with Information Processing

    Science.gov (United States)

    Zimmermann, Peter; Iwanski, Alexandra

    2015-01-01

    Attachment theory suggests that internal working models of self and significant others influence adjustment during development by controlling information processing and self-regulation. We provide a conceptual overview on possible mechanisms linking attachment and information processing and review the current literature in middle childhood.…

  10. Using life cycle information in process discovery

    NARCIS (Netherlands)

    Leemans, S.J.J.; Fahland, D.; Van Der Aalst, W.M.P.; Reichert, M.; Reijers, H.A.

    2016-01-01

    Understanding the performance of business processes is an important part of any business process intelligence project. From historical information recorded in event logs, performance can be measured and visualized on a discovered process model. Thereby the accuracy of the measured performance, e.g.,

  11. Pyphant – A Python Framework for Modelling Reusable Information Processing Tasks

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available We are presenting the Python framework “Pyphant” for the creation and application of information flow models. The central idea of this approach is to encapsulate each data processing step in one unit which we call a worker. A worker receives input via sockets and provides the results of its data processing via plugs. These can be connected to other workers' sockets. The resulting directed graph is called a recipe. Classes for these objects comprise the Pyphant core. To implement the actual processing steps, Pyphant relies on third-party plug-ins which extend the basic worker class and can be distributed as Python eggs. On top of the core, Pyphant offers an information exchange layer which facilitates the interoperability of the workers, using Numpy objects. A third layer comprises textual and graphical user interfaces. The former allows for the batch processing of data and the latter allows for the interactive construction of recipes.

    This paper discusses the Pyphant framework and presents an example recipe for determining the length scale of aggregated polymeric phases, building an amphiphilic conetwork from an Atomic Force Microscopy (AFM) phase mode image.
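The worker/socket/plug pattern described above is essentially pull-based dataflow: each worker computes its output from the outputs of the workers connected to its sockets, and the connected workers form a directed graph (the recipe). The sketch below illustrates that pattern; it is NOT the actual Pyphant API, and all class and method names are invented.

```python
class Worker:
    """Hypothetical minimal worker: func is the processing step,
    sources are the upstream workers wired into its sockets."""
    def __init__(self, func, *sources):
        self.func = func
        self.sources = sources

    def result(self):
        """The 'plug': pull results from upstream, then process."""
        inputs = [s.result() for s in self.sources]
        return self.func(*inputs)

# A small recipe (directed graph): load -> smooth -> threshold
load = Worker(lambda: [1, 5, 2, 8, 3])
smooth = Worker(lambda xs: [sum(xs[max(0, i - 1):i + 2]) / len(xs[max(0, i - 1):i + 2])
                            for i in range(len(xs))], load)
threshold = Worker(lambda xs: [x > 3 for x in xs], smooth)
print(threshold.result())  # [False, False, True, True, True]
```

Because each step is encapsulated behind the same interface, workers can be swapped or recombined into new recipes without changing the others, which is the reusability the framework aims for.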

  12. Markovian Processes for Quantitative Information Leakage

    DEFF Research Database (Denmark)

    Biondi, Fabrizio

    Quantification of information leakage is a successful approach for evaluating the security of a system. It models the system to be analyzed as a channel, with the secret as the input and an output observable by the attacker as the output, and applies information theory to quantify the amount of information transmitted through such a channel, thus effectively quantifying how many bits of the secret can be inferred by the attacker by analyzing the system's output. Channels are usually encoded as matrices of conditional probabilities, known as channel matrices. Such matrices grow exponentially… We show how to model deterministic and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some…
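The channel-matrix view makes one standard leakage measure directly computable: Shannon leakage is the mutual information I(X;Y) between the secret X and the observable output Y. A minimal sketch (illustrative, independent of the QUAIL tool):

```python
from math import log2

def shannon_leakage(prior, channel):
    """Mutual information I(X;Y) in bits, given a prior over secrets X
    and a channel matrix with channel[x][y] = P(y|x)."""
    joint = [[px * pyx for pyx in row] for px, row in zip(prior, channel)]
    p_y = [sum(joint[x][y] for x in range(len(prior)))
           for y in range(len(channel[0]))]
    mi = 0.0
    for x in range(len(prior)):
        for y in range(len(p_y)):
            if joint[x][y] > 0:
                mi += joint[x][y] * log2(joint[x][y] / (prior[x] * p_y[y]))
    return mi

# A perfect channel leaks the whole 1-bit secret; a useless one leaks nothing
print(shannon_leakage([0.5, 0.5], [[1, 0], [0, 1]]))          # 1.0
print(shannon_leakage([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]]))  # 0.0
```

The exponential growth mentioned in the abstract is visible here: a program with n secret bits has a 2^n-row channel matrix, which is what motivates the Markovian models.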

  13. A proposed general model of information behaviour.

    Directory of Open Access Journals (Sweden)

    2003-01-01

    Presents a critical description of Wilson's (1996) global model of information behaviour and proposes major modifications on the basis of research into the information behaviour of managers, conducted in Poland. The theoretical analysis and research results suggest that Wilson's model has certain imperfections, both in its conceptual content and in its graphical presentation. The model, for example, cannot be used to describe managers' information behaviour, since managers basically are not the end users of external or computerized information services, and they acquire information mainly through various intermediaries. Therefore, the model cannot be considered a general model applicable to every category of information users. The proposed new model encompasses the main concepts of Wilson's model, such as: person-in-context, three categories of intervening variables (individual, social and environmental), activating mechanisms, the cyclic character of information behaviours, and the adoption of a multidisciplinary approach to explain them. However, the new model introduces several changes. They include: 1. identification of 'context' with the intervening variables; 2. immersion of the chain of information behaviour in the 'context', to indicate that the context variables influence behaviour at all stages of the process (identification of needs, looking for information, processing and using it); 3. stress on the fact that the activating mechanisms can also occur at all stages of the information acquisition process; 4. introduction of two basic strategies of looking for information: personally and/or using various intermediaries.

  14. Driver's various information process and multi-ruled decision-making mechanism: a fundamental of intelligent driving shaping model

    Directory of Open Access Journals (Sweden)

    Wuhong Wang

    2011-05-01

    The most difficult but important problem in advanced driver assistance system development is how to measure and model the behavioral response of drivers, focusing on the cognition process. This paper describes drivers' deceleration and acceleration behavior based on driving situation awareness in the car-following process, and then presents several driving models for the analysis of drivers' safe approaching behavior in traffic operation. The emphasis of our work is placed on the research of drivers' processing of various information and their multi-ruled decision-making mechanism, considering the complicated control process of driving; the results will be able to provide a theoretical basis for an intelligent driving shaping model.

  15. An information-processing model of three cortical regions: evidence in episodic memory retrieval.

    Science.gov (United States)

    Sohn, Myeong-Ho; Goode, Adam; Stenger, V Andrew; Jung, Kwan-Jin; Carter, Cameron S; Anderson, John R

    2005-03-01

    ACT-R (Anderson, J.R., et al., 2003. An information-processing model of the BOLD response in symbol manipulation tasks. Psychon. Bull. Rev. 10, 241-261) relates the inferior dorso-lateral prefrontal cortex to a retrieval buffer that holds information retrieved from memory, and the posterior parietal cortex to an imaginal buffer that holds problem representations. Because the number of changes in a problem representation is not necessarily correlated with retrieval difficulty, it is possible to dissociate prefrontal-parietal activations. In two fMRI experiments, we examined this dissociation using the fan effect paradigm. Experiment 1 compared a recognition task, in which the representation requirement remains the same regardless of retrieval difficulty, with a recall task, in which both representation and retrieval loads increase with retrieval difficulty. In the recognition task, the prefrontal activation revealed a fan effect but the parietal activation did not. In the recall task, both regions revealed fan effects. In Experiment 2, we compared visually presented and aurally presented stimuli using the recognition task. While only the prefrontal region revealed the fan effect, the activation patterns in the prefrontal and parietal regions did not differ by stimulus presentation modality. In general, these results support the prefrontal-parietal dissociation in terms of retrieval and representation, and the modality-independent nature of the information processed by these regions. Using ACT-R, we also provide computational models that explain the patterns of fMRI responses in these two areas during recognition and recall.

  16. Information transfer with rate-modulated Poisson processes: a simple model for nonstationary stochastic resonance.

    Science.gov (United States)

    Goychuk, I

    2001-08-01

    Stochastic resonance in a simple model of information transfer is studied for sensory neurons and ensembles of ion channels. An exact expression for the information gain is obtained for the Poisson process with the signal-modulated spiking rate. This result allows one to generalize the conventional stochastic resonance (SR) problem (with periodic input signal) to the arbitrary signals of finite duration (nonstationary SR). Moreover, in the case of a periodic signal, the rate of information gain is compared with the conventional signal-to-noise ratio. The paper establishes the general nonequivalence between both measures notwithstanding their apparent similarity in the limit of weak signals.
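A rate-modulated Poisson spike train of the kind studied here is easy to simulate by thinning: in each small time bin a spike occurs with probability r(t)·dt. The base rate, modulation depth, and sinusoidal signal below are arbitrary illustrative choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

dt, T = 1e-3, 2.0                         # bin width (s) and total duration (s)
t = np.arange(0.0, T, dt)
r0, eps, f = 20.0, 0.5, 2.0               # base rate (Hz), modulation depth, signal frequency (Hz)
rate = r0 * (1.0 + eps * np.sin(2.0 * np.pi * f * t))   # signal-modulated spiking rate

# Bernoulli approximation of an inhomogeneous Poisson process (valid for rate*dt << 1):
# a spike occurs in bin i with probability rate[i] * dt.
spikes = rng.random(t.size) < rate * dt
print(int(spikes.sum()))                  # on average about r0 * T = 40 spikes
```

Information-theoretic quantities such as the information gain discussed in the abstract can then be estimated by comparing spike statistics with and without the modulation (eps = 0).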

  17. Modeling Business Processes in Public Administration

    Science.gov (United States)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development, or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself, because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provides business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  18. Modeling the reemergence of information diffusion in social network

    Science.gov (United States)

    Yang, Dingda; Liao, Xiangwen; Shen, Huawei; Cheng, Xueqi; Chen, Guolong

    2018-01-01

    Information diffusion in networks is an important research topic in various fields. Existing studies either focus on modeling the process of information diffusion, e.g., independent cascade model and linear threshold model, or investigate information diffusion in networks with certain structural characteristics such as scale-free networks and small world networks. However, there are still several phenomena that have not been captured by existing information diffusion models. One of the prominent phenomena is the reemergence of information diffusion, i.e., a piece of information reemerges after the completion of its initial diffusion process. In this paper, we propose an optimized information diffusion model by introducing a new informed state into traditional susceptible-infected-removed model. We verify the proposed model via simulations in real-world social networks, and the results indicate that the model can reproduce the reemergence of information during the diffusion process.
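The idea of adding an extra state on top of the susceptible-infected-removed scheme can be sketched in a few lines. The state names, probabilities, and ring-lattice network below are illustrative stand-ins, not the paper's exact formulation; the key mechanism is that a removed node which re-encounters the information may start spreading it again, letting a cascade reemerge after dying down:

```python
import random

def diffuse(adj, seeds, p_infect=0.3, p_reinform=0.05, steps=50, seed=1):
    """Discrete-time toy diffusion with reemergence.

    S = susceptible, I = actively spreading, R = removed; a removed
    neighbour of a spreader becomes active again with prob. p_reinform.
    """
    rng = random.Random(seed)
    state = {v: "S" for v in adj}
    for s in seeds:
        state[s] = "I"
    active = []
    for _ in range(steps):
        nxt = dict(state)
        for v, st in state.items():
            if st != "I":
                continue
            for u in adj[v]:
                if state[u] == "S" and rng.random() < p_infect:
                    nxt[u] = "I"
                elif state[u] == "R" and rng.random() < p_reinform:
                    nxt[u] = "I"          # reemergence step
            nxt[v] = "R"                  # a spreader retires after one round
        state = nxt
        active.append(sum(1 for s in state.values() if s == "I"))
    return active

n = 30                                    # small ring network for illustration
adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
print(diffuse(adj, seeds=[0])[:10])       # number of active spreaders per step
```

Plotting the full series typically shows an initial burst that decays and later flares up again, which is the reemergence phenomenon the model is meant to capture.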

  19. Information flow security for business process models - just one click away

    NARCIS (Netherlands)

    Lehmann, A.; Fahland, D.; Lohmann, N.; Moser, S.

    2012-01-01

    When outsourcing tasks of a business process to a third party, information flow security becomes a critical issue. In particular implicit information leaks are an intriguing problem. Given a business process one could ask whether the execution of a confidential task is kept secret to a third party

  20. Integrating textual and model-based process descriptions for comprehensive process search

    NARCIS (Netherlands)

    Leopold, Henrik; van der Aa, Han; Pittke, Fabian; Raffel, Manuel; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Documenting business processes using process models is common practice in many organizations. However, not all process information is best captured in process models. Hence, many organizations complement these models with textual descriptions that specify additional details. The problem with this

  1. Modeling gross primary production of agro-forestry ecosystems by assimilation of satellite-derived information in a process-based model.

    Science.gov (United States)

    Migliavacca, Mirco; Meroni, Michele; Busetto, Lorenzo; Colombo, Roberto; Zenone, Terenzio; Matteucci, Giorgio; Manca, Giovanni; Seufert, Guenther

    2009-01-01

    In this paper we present results obtained in the framework of a regional-scale analysis of the carbon budget of poplar plantations in Northern Italy. We explored the ability of the process-based model BIOME-BGC to estimate gross primary production (GPP) using an inverse modeling approach exploiting eddy covariance and satellite data. We first present a version of BIOME-BGC coupled with the radiative transfer models PROSPECT and SAILH (named PROSAILH-BGC), with the aims of i) improving the BIOME-BGC description of the radiative transfer regime within the canopy and ii) allowing the assimilation of remotely sensed vegetation index time series, such as MODIS NDVI, into the model. Second, we present a two-step model inversion for the optimization of model parameters. In the first step, some key ecophysiological parameters were optimized against data collected by an eddy covariance flux tower. In the second step, important information about phenological dates and about standing biomass was optimized against MODIS NDVI. The results showed that PROSAILH-BGC allowed simulation of MODIS NDVI with good accuracy and that the canopy radiation regime was better described. The inverse modeling approach was demonstrated to be useful for the optimization of ecophysiological model parameters, phenological dates and parameters related to the standing biomass, allowing good accuracy of daily and annual GPP predictions. In summary, this study showed that assimilation of eddy covariance and remote sensing data in a process model may provide important information for modeling gross primary production at the regional scale.
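The second optimization step can be illustrated with a toy one-parameter inversion: choose the green-up onset date whose simulated NDVI curve best matches the observations. The logistic curve, parameter values, and synthetic "observations" below are illustrative, not output of BIOME-BGC or PROSAILH-BGC:

```python
import numpy as np

def ndvi_model(doy, onset, width=30.0, base=0.2, peak=0.85):
    """Toy logistic green-up: NDVI rises from base to peak around the onset day."""
    return base + (peak - base) / (1.0 + np.exp(-(doy - onset) / (width / 6.0)))

doy = np.arange(1.0, 366.0, 16.0)                  # MODIS-like 16-day composites
rng = np.random.default_rng(3)
observed = ndvi_model(doy, onset=110.0) + rng.normal(0.0, 0.02, doy.size)

# Inversion by exhaustive search: minimize the squared NDVI mismatch
# over candidate onset dates.
candidates = np.arange(60.0, 160.0)
costs = [np.sum((ndvi_model(doy, c) - observed) ** 2) for c in candidates]
best_onset = candidates[int(np.argmin(costs))]
print(best_onset)                                   # recovers a value near the true onset of 110
```

A real inversion would optimize several parameters jointly with a gradient-based or global optimizer, but the structure (forward model, observations, misfit minimization) is the same.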

  2. Modeling Gross Primary Production of Agro-Forestry Ecosystems by Assimilation of Satellite-Derived Information in a Process-Based Model

    Directory of Open Access Journals (Sweden)

    Guenther Seufert

    2009-02-01

    In this paper we present results obtained in the framework of a regional-scale analysis of the carbon budget of poplar plantations in Northern Italy. We explored the ability of the process-based model BIOME-BGC to estimate gross primary production (GPP) using an inverse modeling approach exploiting eddy covariance and satellite data. We first present a version of BIOME-BGC coupled with the radiative transfer models PROSPECT and SAILH (named PROSAILH-BGC), with the aims of i) improving the BIOME-BGC description of the radiative transfer regime within the canopy and ii) allowing the assimilation of remotely sensed vegetation index time series, such as MODIS NDVI, into the model. Second, we present a two-step model inversion for the optimization of model parameters. In the first step, some key ecophysiological parameters were optimized against data collected by an eddy covariance flux tower. In the second step, important information about phenological dates and about standing biomass was optimized against MODIS NDVI. The results showed that PROSAILH-BGC allowed simulation of MODIS NDVI with good accuracy and that the canopy radiation regime was better described. The inverse modeling approach was demonstrated to be useful for the optimization of ecophysiological model parameters, phenological dates and parameters related to the standing biomass, allowing good accuracy of daily and annual GPP predictions. In summary, this study showed that assimilation of eddy covariance and remote sensing data in a process model may provide important information for modeling gross primary production at the regional scale.

  3. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare, including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of the national electronic archive, increased interoperability between systems, and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  4. Towards a structured process modeling method: Building the prescriptive modeling theory

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2017-01-01

    In their effort to control and manage processes, organizations often create process models. The quality of such models is not always optimal, because it is challenging for a modeler to translate her mental image of the process into a formal process description. In order to support this complex human

  5. Expectation, information processing, and subjective duration.

    Science.gov (United States)

    Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth

    2018-01-01

    In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing, that is, whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker-accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.

  6. Information processing. [in human performance

    Science.gov (United States)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).

  7. An analytical approach to customer requirement information processing

    Science.gov (United States)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  8. Using a logical information model-driven design process in healthcare.

    Science.gov (United States)

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  9. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  10. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  11. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experience with the use of this notation for process modelling within Pathology, in Spain or in other countries, is known to us. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, description of processes by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  12. Model for Electromagnetic Information Leakage

    OpenAIRE

    Mao Jian; Li Yongmei; Zhang Jiemin; Liu Jinming

    2013-01-01

    Electromagnetic leakage occurs in working information equipment; it can lead to information leakage. In order to discover the nature of the information in electromagnetic leakage, this paper combines electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, interception and reproduction based on electromagnetic radiation, and ana...

  13. Causal Relationship Model of the Information and Communication Technology Skill Affect the Technology Acceptance Process in the 21ST Century for Undergraduate Students

    Directory of Open Access Journals (Sweden)

    Thanyatorn Amornkitpinyo

    2015-02-01

    The objective of this study is to design a framework for a causal relationship model of the Information and Communication Technology (ICT) skills that affect the Technology Acceptance Process (TAP) for undergraduate students in the 21st century. This research uses correlational analysis. The research methodology is divided into two sections. The first section synthesizes a concept framework for the process-acceptance part of the causal relationship model; the second section proposes the design concept framework of the model. The research findings are as follows: 1) the exogenous latent variables included in the causal relationship model are basic ICT skills and self-efficacy; 2) the mediating latent variables are drawn from the TAM model and include three components: perceived usefulness, perceived ease of use, and attitudes; 3) the outcome latent variable of the model is behavioural intention.

  14. An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency.

    Science.gov (United States)

    Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, André; Wörgötter, Florentin; Tetzlaff, Christian

    2013-12-20

    Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment, and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show the expected linear relation between the antennal lobe-mushroom body connectivity rate and the Kenyon cell firing threshold that yields maximum mutual information, for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentrations. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of the other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which the system's efficiency would be substantially reduced.
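The mutual-information measure used in this study can be illustrated with a drastically simplified circuit: projection-neuron (PN) activity scales with a discrete odor concentration, random PN-to-Kenyon-cell connectivity provides the drive, and a global inhibition term proportional to the mean drive shapes the Kenyon-cell response. All sizes and parameters below are arbitrary toy choices, not values fitted to Drosophila data:

```python
import numpy as np

def mutual_information(xs, ys):
    """Plug-in mutual-information estimate (bits) between two discrete samples."""
    xv, xi = np.unique(xs, return_inverse=True)
    yv, yi = np.unique(ys, return_inverse=True)
    joint = np.zeros((xv.size, yv.size))
    np.add.at(joint, (xi, yi), 1.0)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

rng = np.random.default_rng(42)
n_pn, n_kc, conn_rate, threshold = 20, 100, 0.15, 2.0
W = (rng.random((n_kc, n_pn)) < conn_rate).astype(float)   # random PN -> KC wiring

samples = []
for _ in range(2000):
    conc = int(rng.integers(1, 5))             # odor concentration level 1..4
    pn = rng.poisson(conc, n_pn)               # PN firing grows with concentration
    drive = W @ pn
    inhibition = 0.8 * drive.mean()            # global inhibition tracks total drive
    kc_active = int(np.sum(drive - inhibition > threshold))
    samples.append((conc, kc_active))

conc_s, kc_s = map(np.array, zip(*samples))
print(round(mutual_information(conc_s, kc_s), 2))  # bits about concentration in the KC count
```

Rerunning with the inhibition term removed (inhibition = 0) mimics the comparison in the abstract: without it, high concentrations saturate the Kenyon cells and the transmitted information drops.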

  15. Antecedent characteristics of online cancer information seeking among rural breast cancer patients: an application of the Cognitive-Social Health Information Processing (C-SHIP) model.

    Science.gov (United States)

    Shaw, Bret R; Dubenske, Lori L; Han, Jeong Yeob; Cofta-Woerpel, Ludmila; Bush, Nigel; Gustafson, David H; McTavish, Fiona

    2008-06-01

    Little research has examined the antecedent characteristics of patients most likely to seek online cancer information. This study employs the Cognitive-Social Health Information Processing (C-SHIP) model as a framework to understand what psychosocial characteristics precede online cancer-related information seeking among rural breast cancer patients, who often have fewer health care providers and limited local support services. Examining 144 patients who were provided free computer hardware, Internet access, and training for how to use an interactive cancer communication system, pretest survey scores indicating patients' psychosocial status were correlated with specific online cancer information seeking behaviors. Each of the factors specified by the C-SHIP model had significant relationships with online cancer information seeking behaviors, with the strongest findings emerging for cancer-relevant encodings and self-construals, cancer-relevant beliefs and expectancies, and cancer-relevant self-regulatory competencies and skills. Specifically, patients with more negative appraisals in these domains were more likely to seek out online cancer information. Additionally, antecedent variables associated with the C-SHIP model had more frequent relationships with experiential information as compared with didactic information. This study supports the applicability of the model to discern why people afflicted with cancer may seek online information to cope with their disease.

  16. Antecedent Characteristics of Online Cancer Information Seeking Among Rural Breast Cancer Patients: An Application of the Cognitive-Social Health Information Processing (C-SHIP) Model

    Science.gov (United States)

    Shaw, Bret R.; DuBenske, Lori L.; Han, Jeong Yeob; Cofta-Woerpel, Ludmila; Bush, Nigel; Gustafson, David H.; McTavish, Fiona

    2013-01-01

    Little research has examined the antecedent characteristics of patients most likely to seek online cancer information. This study employs the Cognitive-Social Health Information Processing (C-SHIP) model as a framework to understand what psychosocial characteristics precede online cancer-related information seeking among rural breast cancer patients who often have fewer healthcare providers and limited local support services. Examining 144 patients who were provided free computer hardware, Internet access and training for how to use an Interactive Cancer Communication System, pre-test survey scores indicating patients’ psychosocial status were correlated with specific online cancer information seeking behaviors. Each of the factors specified by the C-SHIP model had significant relationships with online cancer information seeking behaviors with the strongest findings emerging for cancer-relevant encodings and self-construals, cancer-relevant beliefs and expectancies and cancer-relevant self-regulatory competencies and skills. Specifically, patients with more negative appraisals in these domains were more likely to seek out online cancer information. Additionally, antecedent variables associated with the C-SHIP model had more frequent relationships with experiential information as compared to didactic information. This study supports the applicability of the model to discern why people afflicted with cancer may seek online information to cope with their disease. PMID:18569368

  17. Parametric Accuracy: Building Information Modeling Process Applied to the Cultural Heritage Preservation

    Science.gov (United States)

    Garagnani, S.; Manferdini, A. M.

    2013-02-01

Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e. self-aware of what kind of element they are and with whom they can interact), thereby representing the basics of Building Information Modeling (BIM): a coordinated, consistent and always up-to-date workflow intended to reach higher quality, reliability and cost reductions across the design process. Even if BIM was originally intended for new architecture, its ability to store semantically inter-related information can be successfully applied to existing buildings as well, especially if they deserve particular care, such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships; however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology for processing point cloud data in a BIM environment with high accuracy, this paper describes some experiences in documenting monumental sites, generated through a plug-in written for Autodesk Revit and codenamed GreenSpider after its capability to lay out points in space as if they were nodes of an ideal cobweb.

  18. Do Social Information-Processing Models Explain Aggressive Behaviour by Children with Mild Intellectual Disabilities in Residential Care?

    Science.gov (United States)

    van Nieuwenhuijzen, M.; de Castro, B. O.; van der Valk, I.; Wijnroks, L.; Vermeer, A.; Matthys, W.

    2006-01-01

    Background: This study aimed to examine whether the social information-processing model (SIP model) applies to aggressive behaviour by children with mild intellectual disabilities (MID). The response-decision element of SIP was expected to be unnecessary to explain aggressive behaviour in these children, and SIP was expected to mediate the…

  19. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.
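
    The object-mapping step the MVD process model specifies can be illustrated with a toy sketch: a BIM element's geometry and material data determine the parameters of a Modelica component declaration. The `BimWall` class and the mapping below are invented for illustration and are not the paper's actual Revit2Modelica interface; only the `ThermalConductor` component name comes from the Modelica Standard Library.

    ```python
    from dataclasses import dataclass

    @dataclass
    class BimWall:
        """Minimal stand-in for a BIM envelope element (hypothetical,
        not the actual Revit2Modelica data structures)."""
        name: str
        area_m2: float
        u_value: float  # thermal transmittance in W/(m2*K)

    def to_modelica(wall: BimWall) -> str:
        """Map a BIM element onto a Modelica Standard Library thermal
        conductor, mirroring the object-mapping an MVD specifies."""
        g = wall.area_m2 * wall.u_value  # total conductance G in W/K
        return (f"Modelica.Thermal.HeatTransfer.Components.ThermalConductor "
                f"{wall.name}(G={g:.2f});")

    decl = to_modelica(BimWall("southWall", area_m2=12.0, u_value=0.35))
    ```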

  20. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

Full Text Available This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  1. Affect and Persuasion: Effects on Motivation for Information Processing.

    Science.gov (United States)

    Leach, Mark M; Stoltenberg, Cal D.

    The relationship between mood and information processing, particularly when reviewing the Elaboration Likelihood Model of persuasion, lacks conclusive evidence. This study was designed to investigate the hypothesis that information processing would be greater for mood-topic congruence than non mood-topic congruence. Undergraduate students (N=216)…

  2. A neural network model of normal and abnormal auditory information processing.

    Science.gov (United States)

    Du, X; Jansen, B H

    2011-08-01

    The ability of the brain to attenuate the response to irrelevant sensory stimulation is referred to as sensory gating. A gating deficiency has been reported in schizophrenia. To study the neural mechanisms underlying sensory gating, a neuroanatomically inspired model of auditory information processing has been developed. The mathematical model consists of lumped parameter modules representing the thalamus (TH), the thalamic reticular nucleus (TRN), auditory cortex (AC), and prefrontal cortex (PC). It was found that the membrane potential of the pyramidal cells in the PC module replicated auditory evoked potentials, recorded from the scalp of healthy individuals, in response to pure tones. Also, the model produced substantial attenuation of the response to the second of a pair of identical stimuli, just as seen in actual human experiments. We also tested the viewpoint that schizophrenia is associated with a deficit in prefrontal dopamine (DA) activity, which would lower the excitatory and inhibitory feedback gains in the AC and PC modules. Lowering these gains by less than 10% resulted in model behavior resembling the brain activity seen in schizophrenia patients, and replicated the reported gating deficits. The model suggests that the TRN plays a critical role in sensory gating, with the smaller response to a second tone arising from a reduction in inhibition of TH by the TRN. Copyright © 2011 Elsevier Ltd. All rights reserved.
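
    The paired-stimulus gating effect the model reproduces can be sketched with a much simpler toy than the full TH-TRN-AC-PC model: a fast "relay" response whose input is suppressed by a slow inhibitory trace standing in for TRN feedback. The alpha-function kernel is the standard lumped-parameter building block of such models, but all time constants and gains below are illustrative, not the paper's fitted parameters.

    ```python
    import numpy as np

    def psp(t, A=1.0, a=100.0):
        """Alpha-function synaptic impulse response h(t) = A*a*t*exp(-a*t),
        the usual lumped-parameter module in neural mass models."""
        return A * a * t * np.exp(-a * t)

    dt = 1e-3
    t = np.arange(0.0, 1.2, dt)

    # Two identical stimuli 500 ms apart (the classic paired-click paradigm)
    stim = np.zeros_like(t)
    stim[int(0.1 / dt)] = 1.0
    stim[int(0.6 / dt)] = 1.0

    # Slow inhibitory trace standing in for TRN feedback onto the relay:
    # activity evoked by the first click is still elevated at the second.
    trn = np.convolve(stim, psp(t, a=5.0), mode="full")[: t.size]

    # The relay sees the second stimulus attenuated by the standing inhibition
    gated = stim * np.clip(1.0 - 1.5 * trn, 0.0, 1.0)
    relay = np.convolve(gated, psp(t, a=100.0), mode="full")[: t.size]

    peak1 = relay[(t >= 0.1) & (t < 0.6)].max()
    peak2 = relay[t >= 0.6].max()
    ratio = peak2 / peak1  # < 1: the second response is gated
    ```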

  3. Modeling of the Operating Information for System of Logistical Support of the Hardware-software Means of Safety of the Distributed Systems for Data Processing

    Directory of Open Access Journals (Sweden)

    A. A. Durakovsky

    2010-03-01

Full Text Available A technique is proposed for the information modeling of the processes and procedures involved in preparing operating information for the logistical support system covering the operation and servicing of hardware-software safety means (APSOB) of distributed data processing systems (РСОД). The preparation procedures include: development and formalization of the functioning algorithm; construction of a model of functioning that allows the degree of operational risk to be calculated; and decomposition of the model and classification of its objects, so that all elements of the operating information are described unambiguously and the relations between information units are mutually consistent.

  4. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology‐related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  5. Quantum Information Processing

    CERN Document Server

    Leuchs, Gerd

    2005-01-01

    Quantum processing and communication is emerging as a challenging technique at the beginning of the new millennium. This is an up-to-date insight into the current research of quantum superposition, entanglement, and the quantum measurement process - the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions - written by leading experts - bring together the latest results in quantum information as well as addressing all the relevant questions

  6. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1992-01-01

This paper reports that recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground-water flow and radionuclide transport. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling.

  7. An Information Processing Perspective on Divergence and Convergence in Collaborative Learning

    Science.gov (United States)

    Jorczak, Robert L.

    2011-01-01

    This paper presents a model of collaborative learning that takes an information processing perspective of learning by social interaction. The collaborative information processing model provides a theoretical basis for understanding learning principles associated with social interaction and explains why peer-to-peer discussion is potentially more…

  8. Behavioral conformance of artifact-centric process models

    NARCIS (Netherlands)

    Fahland, D.; Leoni, de M.; Dongen, van B.F.; Aalst, van der W.M.P.; Abramowicz, W.

    2011-01-01

    The use of process models in business information systems for analysis, execution, and improvement of processes assumes that the models describe reality. Conformance checking is a technique to validate how good a given process model describes recorded executions of the actual process. Recently,

  9. Temporal expectation and information processing: A model-based analysis

    NARCIS (Netherlands)

    Jepma, M.; Wagenmakers, E.-J.; Nieuwenhuis, S.

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information

  10. The mediation of mothers' self-fulfilling effects on their children's alcohol use: self-verification, informational conformity, and modeling processes.

    Science.gov (United States)

    Madon, Stephanie; Guyll, Max; Buller, Ashley A; Scherr, Kyle C; Willard, Jennifer; Spoth, Richard

    2008-08-01

This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N1 = 486; N2 = 287), with children's alcohol use as the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. (c) 2008 APA, all rights reserved.

  11. Function Model for Community Health Service Information

    Science.gov (United States)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong

In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard extended from the Structured Analysis and Design Technique (SADT) and now a widely used function modeling method, was used to classify its information from top to bottom. The contents of every level of the model were described and coded. A function model for CHS information, which includes 4 super-classes, 15 classes and 28 sub-classes of business functions, 43 business processes and 168 business activities, was then established. This model can facilitate information management system development and workflow refinement.
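
    The top-down decomposition IDEF0 produces can be sketched as a tree walk that assigns the conventional node codes (A0 for the root, A1, A2, ... for its children, A11, A12, ... below them). The activity names below are invented examples, not the paper's actual CHS classification.

    ```python
    def assign_codes(name, children, code="A0", out=None):
        """Walk an IDEF0-style decomposition top-down, coding the root A0
        and the i-th child of node Axy as Axyi."""
        if out is None:
            out = {}
        out[code] = name
        base = "" if code == "A0" else code[1:]
        for i, (child, kids) in enumerate(children.items(), 1):
            assign_codes(child, kids, f"A{base}{i}", out)
        return out

    # Hypothetical fragment of a CHS function hierarchy
    codes = assign_codes("Community health service information", {
        "Manage resident health records": {
            "Register resident": {},
            "Record chronic-disease visit": {},
        },
        "Report service statistics": {},
    })
    ```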

  12. Mindfulness Training Alters Emotional Memory Recall Compared to Active Controls: Support for an Emotional Information Processing Model of Mindfulness

    OpenAIRE

    Roberts-Wolfe, Douglas; Sacchet, Matthew D.; Hastings, Elizabeth; Roth, Harold; Britton, Willoughby

    2012-01-01

    Objectives: While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e., m...

  13. Working memory capacity and redundant information processing efficiency.

    Science.gov (United States)

    Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R

    2015-01-01

    Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.
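
    The LBA model used here to estimate processing efficiency can be sketched in a few lines: each response is a race between linear accumulators with uniformly distributed start points and normally distributed drift rates, and faster processing corresponds to higher drift. All parameter values below are illustrative, not the study's estimates; modelling facilitation as a higher target drift is a simplifying assumption of the sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def lba_trial(drifts, b=1.0, A=0.5, s=0.25, t0=0.2):
        """One Linear Ballistic Accumulator trial: the accumulator that
        reaches threshold b first determines response and decision time."""
        k = rng.uniform(0.0, A, size=len(drifts))    # start points
        d = np.maximum(rng.normal(drifts, s), 1e-6)  # drifts, kept positive
        times = (b - k) / d                          # time to threshold
        winner = int(np.argmin(times))
        return winner, t0 + times[winner]

    # Facilitation (e.g. redundant targets) as a higher target drift rate,
    # inhibition (e.g. distractors) as more closely matched drifts.
    rt_facilitated = np.mean([lba_trial([2.0, 0.5])[1] for _ in range(2000)])
    rt_inhibited = np.mean([lba_trial([1.0, 0.8])[1] for _ in range(2000)])
    ```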

  14. Non-homogeneous Markov process models with informative observations with an application to Alzheimer's disease.

    Science.gov (United States)

    Chen, Baojiang; Zhou, Xiao-Hua

    2011-05-01

    Identifying risk factors for transition rates among normal cognition, mildly cognitive impairment, dementia and death in an Alzheimer's disease study is very important. It is known that transition rates among these states are strongly time dependent. While Markov process models are often used to describe these disease progressions, the literature mainly focuses on time homogeneous processes, and limited tools are available for dealing with non-homogeneity. Further, patients may choose when they want to visit the clinics, which creates informative observations. In this paper, we develop methods to deal with non-homogeneous Markov processes through time scale transformation when observation times are pre-planned with some observations missing. Maximum likelihood estimation via the EM algorithm is derived for parameter estimation. Simulation studies demonstrate that the proposed method works well under a variety of situations. An application to the Alzheimer's disease study identifies that there is a significant increase in transition rates as a function of time. Furthermore, our models reveal that the non-ignorable missing mechanism is perhaps reasonable. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
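
    The time-scale-transformation idea can be sketched directly: take a homogeneous intensity matrix Q and evaluate transition probabilities over the transformed increment h(t2) − h(t1) with h(t) = t^ρ, so that ρ > 1 makes transitions accelerate with time, as the paper reports. The three-state intensities below are invented for illustration, not fitted to the Alzheimer's data.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Illustrative 3-state progression (normal -> impaired -> dementia,
    # absorbing); off-diagonals are transition intensities, rows sum to 0.
    Q = np.array([[-0.20,  0.20, 0.00],
                  [ 0.05, -0.30, 0.25],
                  [ 0.00,  0.00, 0.00]])

    def transition_probs(t1, t2, rho=1.5):
        """P(t1, t2) = expm(Q * (h(t2) - h(t1))) with h(t) = t**rho:
        a non-homogeneous Markov process built from a homogeneous one."""
        return expm(Q * (t2**rho - t1**rho))

    P_early = transition_probs(0.0, 1.0)  # one year early in follow-up
    P_late = transition_probs(2.0, 3.0)   # one year later: larger increment
    ```

    With ρ > 1 the same calendar interval maps to a larger transformed increment later in follow-up, so the probability of leaving the healthy state within a year grows over time.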

  15. Team confidence, motivated information processing, and dynamic group decision making

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Beersma, B.

    2010-01-01

    According to the Motivated Information Processing in Groups (MIP-G) model, groups should perform ambiguous (non-ambiguous) tasks better when they have high (low) epistemic motivation and concomitant tendencies to engage in systematic (heuristic) information processing and exchange. The authors

  16. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  17. Construction Process Simulation and Safety Analysis Based on Building Information Model and 4D Technology

    Institute of Scientific and Technical Information of China (English)

    HU Zhenzhong; ZHANG Jianping; DENG Ziyin

    2008-01-01

Time-dependent structure analysis theory has been proved to be more accurate and reliable compared to commonly used methods during construction. However, so far applications are limited to a partial period and part of the structure because of immeasurable artificial intervention. Based on the building information model (BIM) and four-dimensional (4D) technology, this paper proposes an improved structure analysis method, which can generate structural geometry, the resistance model, and loading conditions automatically through a close interlink of the schedule information, architectural model, and material properties. The method was applied to a safety analysis during a continuous and dynamic simulation of the entire construction process. The results show that the organic combination of the BIM, 4D technology, construction simulation, and safety analysis of time-dependent structures is feasible and practical. This research also lays a foundation for further research on building lifecycle management by combining architectural design, structure analysis, and construction management.

  18. Executive Information Systems' Multidimensional Models

    Directory of Open Access Journals (Sweden)

    2007-01-01

Full Text Available Executive Information Systems are designed to improve the quality of strategic-level management in an organization through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that the organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence tools. But in order to build analytic reports for Executive Information Systems (EIS) in an organization, we need to design a multidimensional model based on the business model of the organization. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.

  19. Motivated information processing in organizational teams: Progress, puzzles, and prospects

    NARCIS (Netherlands)

    Nijstad, B.A.; de Dreu, C.K.W.

    2012-01-01

    Much of the research into group and team functioning looks at groups that perform cognitive tasks, such as decision making, problem solving, and innovation. The Motivated Information Processing in Groups Model (MIP-G; De Dreu, Nijstad, & Van Knippenberg, 2008) conjectures that information processing

  20. Motivated information processing, strategic choice, and the quality of negotiated agreement

    NARCIS (Netherlands)

    De Dreu, Carsten K W; Beersma, Bianca; Stroebe, Katherine; Euwema, Martin C.

    The authors tested a motivated information-processing model of negotiation: To reach high joint outcomes, negotiators need a deep understanding of the task, which requires them to exchange information and to process new information systematically. All this depends on social motivation, epistemic

  1. Motivated information processing, strategic choice, and the quality of negotiated agreement

    NARCIS (Netherlands)

    De Dreu, CKW; Beersma, B; Stroebe, K; Euwema, MC

The authors tested a motivated information-processing model of negotiation: To reach high joint outcomes, negotiators need a deep understanding of the task, which requires them to exchange information and to process new information systematically. All this depends on social motivation, epistemic

  2. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  3. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  4. The Mediation of Mothers’ Self-Fulfilling Effects on Their Children’s Alcohol Use: Self-Verification, Informational Conformity and Modeling Processes

    Science.gov (United States)

    Madon, Stephanie; Guyll, Max; Buller, Ashley A.; Scherr, Kyle C.; Willard, Jennifer; Spoth, Richard

    2010-01-01

This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother–child dyads (N1 = 487; N2 = 287). Children's alcohol use was the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. PMID:18665708

  5. Motivated information processing in group judgement and decision making

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Nijstad, B.A.; van Knippenberg, D.

    2008-01-01

This article expands the view of groups as information processors into a motivated information processing in groups (MIP-G) model by emphasizing, first, the mixed-motive structure of many group tasks and, second, the idea that individuals engage in more or less deliberate information search and

  6. Model medication management process in Australian nursing homes using business process modeling.

    Science.gov (United States)

    Qian, Siyu; Yu, Ping

    2013-01-01

One of the reasons for end-user avoidance or rejection of health information systems is poor alignment of the system with healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high-risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviews with two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  7. Motivated information processing in group judgment and decision making

    NARCIS (Netherlands)

    De Dreu, Carsten K. W.; Nijstad, Bernard A.; van Knippenberg, Daan

    This article expands the view of groups as information processors into a motivated information processing in groups (MIP-G) model by emphasizing, first, the mixed-motive structure of many group tasks and, second, the idea that individuals engage in more or less deliberate information search and

  8. User-guided discovery of declarative process models

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, van der W.M.P.; Chawla, N.; King, I.; Sperduti, A.

    2011-01-01

    Process mining techniques can be used to effectively discover process models from logs with example behaviour. Cross-correlating a discovered model with information in the log can be used to improve the underlying process. However, existing process discovery techniques have two important drawbacks.

  9. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  10. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus.

  11. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  12. Information Processing in Auto-regulated Systems

    Directory of Open Access Journals (Sweden)

    Karl Javorszky

    2003-06-01

We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection, once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out that which is the case from among alternatives which could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. This new approach allows a mathematical treatment of some poignant biomathematical problems. The concepts presented in this treatise may also have relevance and applications within the fields of information processing and the theory of language.
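    One way to make the comparison between partitioned and sequenced carriers concrete (an illustration of ours, not taken from the record above) is to count the distinguishable states each carrier offers: selecting one integer partition of n carries log2 p(n) bits, while a binary sequence of length n carries n bits.

    ```python
    import math

    def partitions(n):
        """Number of integer partitions of n, via the classic DP over part sizes."""
        p = [1] + [0] * n
        for k in range(1, n + 1):        # allow parts of size k
            for m in range(k, n + 1):
                p[m] += p[m - k]
        return p[n]

    n = 32
    bits_partition = math.log2(partitions(n))   # carrier: one partition of n
    bits_sequence = n                           # carrier: binary string of length n
    print(f"p({n}) = {partitions(n)}: {bits_partition:.1f} bits vs {bits_sequence} bits")
    ```

    The partition carrier grows far more slowly (roughly as sqrt(n)) than the sequential one, which is the kind of efficiency gap the abstract alludes to.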

  13. Algorithm-structured computer arrays and networks architectures and processes for images, percepts, models, information

    CERN Document Server

    Uhr, Leonard

    1984-01-01

    Computer Science and Applied Mathematics: Algorithm-Structured Computer Arrays and Networks: Architectures and Processes for Images, Percepts, Models, Information examines the parallel-array, pipeline, and other network multi-computers.This book describes and explores arrays and networks, those built, being designed, or proposed. The problems of developing higher-level languages for systems and designing algorithm, program, data flow, and computer structure are also discussed. This text likewise describes several sequences of successively more general attempts to combine the power of arrays wi

  14. Continuous-variable quantum information processing

    DEFF Research Database (Denmark)

    Andersen, Ulrik Lund; Leuchs, G.; Silberhorn, C.

    2010-01-01

    the continuous degree of freedom of a quantum system for encoding, processing or detecting information, one enters the field of continuous-variable (CV) quantum information processing. In this paper we review the basic principles of CV quantum information processing with main focus on recent developments...... in the field. We will be addressing the three main stages of a quantum information system; the preparation stage where quantum information is encoded into CVs of coherent states and single-photon states, the processing stage where CV information is manipulated to carry out a specified protocol and a detection...... stage where CV information is measured using homodyne detection or photon counting....

  15. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-07-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
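    The capacity measure described above can be approximated numerically: drive a fading-memory system with an i.i.d. input, score how well a linear readout reconstructs each delayed input (squared correlation), and sum the scores. The sketch below is our own toy setup, not the paper's code; the tanh reservoir, input scaling, and delay range are assumptions. It illustrates that the summed capacity stays below the number of state variables.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, washout, N = 10000, 200, 20

    u = rng.uniform(-1.0, 1.0, size=T)          # i.i.d. input signal

    # Toy fading-memory system: a small tanh "reservoir" (assumed example)
    W = rng.normal(0.0, 1.0, (N, N))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius < 1 => fading memory
    w_in = rng.uniform(-0.5, 0.5, N)

    x = np.zeros((T, N))
    for t in range(T - 1):
        x[t + 1] = np.tanh(W @ x[t] + w_in * u[t])

    X = x[washout:] - x[washout:].mean(axis=0)

    def capacity(target):
        """Squared correlation between a target signal and its best linear readout."""
        z = target - target.mean()
        w, *_ = np.linalg.lstsq(X, z, rcond=None)
        zhat = X @ w
        return float(np.dot(zhat, z) ** 2 / (np.dot(zhat, zhat) * np.dot(z, z) + 1e-12))

    # Linear memory capacities for reconstructing u(t - d), d = 1..30
    caps = [capacity(u[washout - d:T - d]) for d in range(1, 31)]
    total = sum(caps)
    print(f"total linear memory capacity ~ {total:.2f} (bound: {N} state variables)")
    ```

    Adding nonlinear target functions of the input (e.g. Legendre polynomials of delayed inputs, as in the paper) would push the total closer to the bound of N, exposing the trade-off between nonlinearity and short-term memory.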

  16. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory. PMID:22816038

  17. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

This volume presents work on automation systems for editing and publishing operations based on methods for processing symbolic information and information contained in training samples (ranking of objects by promise; a classification algorithm for tones and noise). The book will be of interest to specialists in the automated processing of textual information, programming, and pattern recognition.

  18. Natural Information Processing Systems

    OpenAIRE

    John Sweller; Susan Sweller

    2006-01-01

    Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...

  19. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    Science.gov (United States)

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the preferred reporting items for systematic reviews and meta-analyses systematic review methodology, the authors reviewed published papers between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler.

  20. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    Science.gov (United States)

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  1. Information Processing and Dynamics in Minimally Cognitive Agents

    Science.gov (United States)

    Beer, Randall D.; Williams, Paul L.

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we…

  2. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. This chapter describes the application of business process modelling using the Business Process Modelling Notation (BPMN) standard. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  3. Display analysis with the optimal control model of the human operator. [pilot-vehicle display interface and information processing

    Science.gov (United States)

    Baron, S.; Levison, W. H.

    1977-01-01

    Application of the optimal control model of the human operator to problems in display analysis is discussed. Those aspects of the model pertaining to the operator-display interface and to operator information processing are reviewed and discussed. The techniques are then applied to the analysis of advanced display/control systems for a Terminal Configured Vehicle. Model results are compared with those obtained in a large, fixed-base simulation.

  4. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on a proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. This paper assesses whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  5. LanguageNet: A Novel Framework for Processing Unstructured Text Information

    DEFF Research Database (Denmark)

    Qureshi, Pir Abdul Rasool; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    In this paper we present LanguageNet—a novel framework for processing unstructured text information from human generated content. The state of the art information processing frameworks have some shortcomings: modeled in generalized form, trained on fixed (limited) data sets, and leaving...... the specialization necessary for information consolidation to the end users. The proposed framework is the first major attempt to address these shortcomings. LanguageNet provides extended support of graphical methods contributing added value to the capabilities of information processing. We discuss the benefits...... of the framework and compare it with the available state of the art. We also describe how the framework improves the information gathering process and contribute towards building systems with better performance in the domain of Open Source Intelligence....

  6. Information-Theoretic Perspectives on Geophysical Models

    Science.gov (United States)

    Nearing, Grey

    2016-04-01

To test any hypothesis about any dynamic system, it is necessary to build a model that places that hypothesis into the context of everything else that we know about the system: initial and boundary conditions and interactions between various governing processes (Hempel and Oppenheim, 1948, Cartwright, 1983). No hypothesis can be tested in isolation, and no hypothesis can be tested without a model (for a geoscience-related discussion see Clark et al., 2011). Science is (currently) fundamentally reductionist in the sense that we seek some small set of governing principles that can explain all phenomena in the universe, and such laws are ontological in the sense that they describe the object under investigation (Davies, 1990 gives several competing perspectives on this claim). However, since we cannot build perfect models of complex systems, any model that does not also contain an epistemological component (i.e., a statement, like a probability distribution, that refers directly to the quality of the information from the model) is falsified immediately (in the sense of Popper, 2002) given only a small number of observations. Models necessarily contain both ontological and epistemological components, and what this means is that the purpose of any robust scientific method is to measure the amount and quality of information provided by models. I believe that any viable philosophy of science must be reducible to this statement. The first step toward a unified theory of scientific models (and therefore a complete philosophy of science) is a quantitative language that applies to both ontological and epistemological questions. Information theory is one such language: Cox's (1946) theorem (see Van Horn, 2003) tells us that probability theory is the (only) calculus that is consistent with Classical Logic (Jaynes, 2003; chapter 1), and information theory is simply the integration of convex transforms of probability ratios (integration reduces density functions to scalar
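    As a minimal illustration of the last point (our own sketch, with invented numbers): the Kullback-Leibler divergence is exactly a sum of a convex transform (x log x) of a probability ratio, and it quantifies the information lost when a model's predictive distribution stands in for the observed one.

    ```python
    import numpy as np

    # Discretized distributions over the same bins (hypothetical values):
    p_obs   = np.array([0.10, 0.20, 0.40, 0.20, 0.10])  # from observations
    p_model = np.array([0.05, 0.25, 0.30, 0.25, 0.15])  # model's prediction

    # KL divergence: the integral (here a sum) of a convex transform of the
    # probability ratio p_obs / p_model, in bits.
    kl = float(np.sum(p_obs * np.log2(p_obs / p_model)))
    print(f"D_KL(obs || model) = {kl:.4f} bits")
    assert kl >= 0.0  # Gibbs' inequality: zero only for a perfect model
    ```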

  7. REMOTE SYNTHESIS AND CONTROL INFORMATION TECHNOLOGY OF SYSTEM-DYNAMIC MODELS

    Directory of Open Access Journals (Sweden)

    A. V. Masloboev

    2015-07-01

The general line of research concerns the development of information technologies and computer simulation tools for the information and analytical support of the management of complex semi-structured systems. Regional socio-economic systems are considered as a representative of this system type. The investigation is carried out within the implementation of the development strategy of the Arctic zone of the Russian Federation and national security until 2020 in the Murmansk region, specifically in the engineering of a high-end information infrastructure for solving innovation and security control problems of regional development. The research methodology comprises the system dynamics modeling method, distributed information system engineering technologies, and pattern-based modeling and design techniques. The work deals with the development of a toolkit for decision-making information support in the field of innovation security management of regional economies. For that purpose, a suite of system-dynamic models of standard innovation process components and an information technology for the remote formation and control of innovation business simulation models have been developed. The designed toolkit provides forecasting of innovation security index dynamics and of the innovation business effectiveness of regional economies. The information technology is implemented within a thin-client architecture and is intended to automate the design of simulation models of complex systems. The technology's software tools provide distributed, pattern-based formation of system-dynamic models and simulation control of innovation processes. The technology enhances the availability and reusability of information support facilities for innovation process simulation through distributed access to simulation modeling tools and model synthesis from reusable components, simulating standard elements of innovation

  8. Information processing of earth resources data

    Science.gov (United States)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  9. Formal approach to modeling of modern Information Systems

    Directory of Open Access Journals (Sweden)

    Bálint Molnár

    2016-01-01

Most recently, the concept of business documents has started to play a double role. On the one hand, a business document (a word-processing text or calculation sheet) can be used as a specification tool; on the other hand, the business document is an immanent constituent of business processes, and thereby an essential component of business Information Systems. The recent tendency is that the majority of documents and their contents within business Information Systems remain in semi-structured format, and a lesser part of documents is transformed into schemas of structured databases. In order to keep the emerging situation in hand, we suggest the creation of (1) a theoretical framework for modeling business Information Systems; and (2) a design method for practical application based on the theoretical model that provides the structuring principles. The modeling approach, which focuses on documents and their interrelationships with business processes, assists in perceiving the activities of modern Information Systems.

  10. The Evolution Process on Information Technology Outsourcing Relationship

    Directory of Open Access Journals (Sweden)

    Duan Weihua

    2017-01-01

The information technology outsourcing relationship is one of the key issues for IT outsourcing success. To explore how to manage and promote the IT outsourcing relationship, it is necessary to understand its evolution process. First, the types of IT outsourcing based on relationship quality and IT outsourcing project level are analyzed; second, two evolution process models of the IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT outsourcing relationship evolution process is outlined; finally, an IT outsourcing relationship evolution process model is developed, and the development of the IT outsourcing relationship from low to high under internal and external forces is explained.

  11. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective....... In this way the model parameters that drives the main dynamic behavior can be identified and thus a better understanding of this type of processes. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid...

  12. Information processing, computation, and cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  13. Application of information and communication technology in process reengineering

    Directory of Open Access Journals (Sweden)

    Đurović Aleksandar M.

    2014-01-01

This paper examines the role of information and communication technologies in reengineering processes. A general analysis of a process shows that information and communication technologies improve its efficiency. A reengineering model based on the BPMN 2.0 standard is applied to the process by which students of the Faculty of Transport and Traffic Engineering seek internships/jobs. After defining the technical characteristics and required functionalities, a web/mobile application is proposed, enabling better visibility of traffic engineers to companies seeking that education profile.

  14. Optimal Information Processing in Biochemical Networks

    Science.gov (United States)

    Wiggins, Chris

    2012-02-01

A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrative cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
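    To make the information-theoretic quantity concrete, the sketch below (a toy of ours; the joint distribution values are invented) computes the mutual information between the copy numbers of two interacting species directly from their joint distribution.

    ```python
    import numpy as np

    # Hypothetical joint distribution P(m, n) over small copy numbers of two
    # interacting gene products (rows: species M, columns: species N).
    P = np.array([[0.10, 0.05, 0.00],
                  [0.05, 0.30, 0.10],
                  [0.00, 0.10, 0.30]])
    assert abs(P.sum() - 1.0) < 1e-12

    pm = P.sum(axis=1)   # marginal of species M
    pn = P.sum(axis=0)   # marginal of species N

    # Mutual information I(M;N) = sum_{m,n} P log2( P / (pm * pn) ), in bits
    mask = P > 0         # skip zero-probability cells (0 log 0 := 0)
    mi = float(np.sum(P[mask] * np.log2(P[mask] / np.outer(pm, pn)[mask])))
    print(f"I(M;N) = {mi:.3f} bits")
    ```

    In practice the hard part is obtaining P itself (e.g. from a chemical master equation), which is the focus of the analytic and numerical work described above.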

  15. Motivated information processing and group decision refusal

    NARCIS (Netherlands)

    Nijstad, Bernard A.; Oltmanns, Jan

    Group decision making has attracted much scientific interest, but few studies have investigated group decisions that do not get made. Based on the Motivated Information Processing in Groups model, this study analysed the effect of epistemic motivation (low vs. high) and social motivation (proself

  16. Theoretical aspects and modelling of cellular decision making, cell killing and information-processing in photodynamic therapy of cancer.

    Science.gov (United States)

    Gkigkitzis, Ioannis

    2013-01-01

The aim of this report is to provide a mathematical model of the mechanism for making binary fate decisions about cell death or survival, during and after Photodynamic Therapy (PDT) treatment, and to supply the logical design for this decision mechanism as an application of rate distortion theory to the biochemical processing of information by the physical system of a cell. Based on previously established systems biology models of the molecular interactions involved in the PDT processes, and regarding a cellular decision-making system as a noisy communication channel, we use rate distortion theory to design a time-dependent Blahut-Arimoto algorithm where the input is a stimulus vector composed of the time-dependent concentrations of three PDT-related cell death signaling molecules and the output is a cell fate decision. The molecular concentrations are determined by a group of rate equations. The basic steps are: initialize the probability of the cell fate decision, compute the conditional probability distribution that minimizes the mutual information between input and output, compute the cell fate decision probability that minimizes the mutual information, and repeat the last two steps until the probabilities converge. Advance to the next discrete time point and repeat the process. Based on the model from communication theory described in this work, and assuming that the activation of death signal processing occurs when any of the molecular stimulants increases above a predefined threshold (50% of the maximum concentrations), for 1800 s of treatment the cell undergoes necrosis within the first 30 minutes with probability in the range 90.0%-99.99%, and in the case of repair/survival it goes through apoptosis within 3-4 hours with probability in the range 90.00%-99.00%. Although there is no experimental validation of the model at this moment, it reproduces some patterns of survival ratios of predicted experimental data. Analytical modeling based on cell death
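    The iteration described above has the structure of the standard rate-distortion Blahut-Arimoto algorithm. A generic, static sketch follows (our own toy example; the two-state stimulus, distortion matrix, and beta value are assumptions, not the paper's time-dependent setup):

    ```python
    import numpy as np

    def blahut_arimoto(px, d, beta, iters=200):
        """Rate-distortion Blahut-Arimoto iteration.

        px   -- source distribution over stimulus states
        d    -- d[i, j]: distortion of choosing output j for stimulus i
        beta -- rate/distortion trade-off parameter
        Returns the conditional p(j|i) and the output marginal q(j)."""
        n_in, n_out = d.shape
        q = np.full(n_out, 1.0 / n_out)           # initialize output distribution
        for _ in range(iters):
            # Conditional minimizing mutual information: p(j|i) ~ q(j) exp(-beta d(i,j))
            p_cond = q * np.exp(-beta * d)
            p_cond /= p_cond.sum(axis=1, keepdims=True)
            q = px @ p_cond                        # update output marginal; repeat
        return p_cond, q

    # Toy numbers: 2 stimulus states (death signal low/high), 2 decisions
    # (survive/apoptose); distortion penalizes mismatched decisions.
    px = np.array([0.7, 0.3])
    d = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
    p_cond, q = blahut_arimoto(px, d, beta=3.0)
    print(np.round(p_cond, 3))
    ```

    The paper's time-dependent version would rerun this inner loop at each discrete time point with stimulus probabilities driven by the rate equations.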

  17. Data retrieval systems and models of information situations

    International Nuclear Information System (INIS)

    Jankowski, L.

    1984-01-01

    Demands placed on data retrieval systems and their basic parameters are given. According to the stage of development of data collection and processing, data retrieval systems may be divided into systems for the simple recording and provision of data, systems for recording and providing data with integrated statistical functions, and logical information systems. The structure of these information systems is characterized, as are their methods of processing and representing facts. The notion of ''artificial intelligence'' in the development of logical information systems is defined. In logical information systems related to nuclear research, the structure used to represent knowledge in the various forms of the model is decisive. The main model elements are the characteristics of the data, the forms of representation, and the program. Depending on the structure of the data, the structure of the preparatory and transformation algorithms, and the aim of the system, data retrieval systems related to nuclear research and technology can be classified into five logical information models: linear, identification, advisory, theory-experiment, and problem-solving models. The characteristics of these models are given, along with examples of data retrieval systems for the individual models. (E.S.)

  18. The Analysis of Electronic Journal Utilization In Learning Process: Technology Acceptance Model And Information System Success

    Directory of Open Access Journals (Sweden)

    Achmad Zaky

    2017-12-01

    Full Text Available This study aims to observe the behavior of electronic journal (e-journal) use among bachelor students of the Universitas Brawijaya, with the Technology Acceptance Model (TAM) and Information System Success (ISS) as the theoretical framework. The research sample comprises all bachelor students who have used e-journals in their learning process. The respondents were selected by the convenience sampling method. The data were collected through a survey and analyzed by Partial Least Squares (PLS) with SmartPLS 3. The results of the study reveal that user satisfaction and intention to use have significant effects on the actual use of e-journals among bachelor students at the Universitas Brawijaya. Those variables affect actual use because they have been formed by other variables such as information quality, perceived easiness, perceived usefulness, and attitude towards behavior. Furthermore, information quality has a significant influence on user satisfaction, while perceived easiness and perceived usefulness do not have a direct effect on the intention to use. The implication of this study is relevant for educators seeking to recognize the factors that drive e-journal use in the learning process.

  19. Conservation Process Model (cpm): a Twofold Scientific Research Scope in the Information Modelling for Cultural Heritage

    Science.gov (United States)

    Fiorani, D.; Acierno, M.

    2017-05-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both the BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, the applications developed so far have poorly adapted BIM to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed on decay analysis and the surfaces conservation project.

  20. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  1. A stream-based mathematical model for distributed information processing systems - SysLab system model

    OpenAIRE

    Klein, Cornel; Rumpe, Bernhard; Broy, Manfred

    2014-01-01

    In the SysLab project we develop a software engineering method based on a mathematical foundation. The SysLab system model serves as an abstract mathematical model for information systems and their components. It is used to formalize the semantics of all description techniques used, such as object diagrams, state automata, sequence charts, or data-flow diagrams. Based on the requirements for such a reference model, we define the system model, including its different views and their relationships.

  2. Influence Processes for Information Technology Acceptance

    DEFF Research Database (Denmark)

    Bhattacherjee, Anol; Sanford, Clive Carlton

    2006-01-01

    This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects are persistent over time. Drawing on the elaboration-likelihood model (ELM), we compared two alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were respectively operationalized using the argument quality and source credibility constructs, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We further examined how these influence processes were moderated by users' IT expertise and perceived job relevance, and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern...

  3. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical viewpoints on the scientific information processing skill. It decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction, and document analysis, were used to build up a theoretical framework. Interviews with and surveys of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  4. The cognitive viewpoint on information science and processing information in cognitive psychology - a vision for interdisciplinary

    Directory of Open Access Journals (Sweden)

    Shirley Guimarães Pimenta

    2012-08-01

    Full Text Available The interaction amongst the ‘user’, ‘information’, and ‘text’ is of interest to Information Science, although it has received insufficient attention in the literature. This issue is addressed by this paper, whose main purpose is to contribute to the discussion of the theoretical affinity between the cognitive viewpoint in Information Science and the information processing approach in Cognitive Psychology. Firstly, the interdisciplinary nature of Information Science is discussed and justified as a means to deepen and strengthen its theoretical framework. Such interdisciplinarity helps to avoid stagnation and keep pace with other disciplines. Secondly, the discussion takes into consideration the cognitive paradigm, which gives rise to the cognitive viewpoint approach in Information Science. It is highlighted that the cognitive paradigm represented a change in the Social Sciences due to the shift of focus from the object and the signal to the individual. Besides that, it sheds light on the notion of models of worlds, i.e., the systems of categories and concepts that guide the interaction between the individual and his/her environment. Thirdly, the theoretical assumptions of the cognitive viewpoint approach are discussed, with emphasis on the concept of ‘information’ as resulting from cognitive processes and as related to the notion of ‘text’. This approach points out the relevance of understanding the interaction amongst users, information, and text. However, it lacks further development. Using notions which are common to both approaches, some of the gaps can be filled. Finally, the concept of ‘text’, its constituents and structures are presented from the perspective of text comprehension models and according to the information processing approach. As a concluding remark, it is suggested that bringing together the cognitive viewpoint and the information processing approach can be enriching and fruitful to both Information

  5. Information accessibility and cryptic processes

    Energy Technology Data Exchange (ETDEWEB)

    Mahoney, John R; Ellison, Christopher J; Crutchfield, James P [Complexity Sciences Center and Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616 (United States)], E-mail: jrmahoney@ucdavis.edu, E-mail: cellison@cse.ucdavis.edu, E-mail: chaos@cse.ucdavis.edu

    2009-09-11

    We give a systematic expansion of the crypticity, a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite cryptic order, i.e., the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy, the mutual information between a process's infinite past and infinite future, that is finite and exact for finite-order cryptic processes. (fast track communication)

  6. Mass media in health promotion: an analysis using an extended information-processing model.

    Science.gov (United States)

    Flay, B R; DiTecco, D; Schlegel, R P

    1980-01-01

    The information-processing model of the attitude and behavior change process was critically examined and extended from six to 12 levels for a better analysis of change due to mass media campaigns. Findings from social psychology and communications research, and from evaluations of mass media health promotion programs, were reviewed to determine how source, message, channel, receiver, and destination variables affect each of the levels of change of major interest (knowledge, beliefs, attitudes, intentions, and behavior). The factors found most likely to induce permanent attitude and behavior change (most important in health promotion) were: presentation and repetition over long time periods, via multiple channels, at different times (including "prime" or high-exposure times), by multiple sources, in novel and involving ways, with appeals to multiple motives, development of social support, and provision of appropriate behavioral skills, alternatives, and reinforcement (preferably in ways that secure the active participation of the audience). Suggestions for evaluation of mass media programs that take account of this complexity were advanced.

  7. Information processing for aerospace structural health monitoring

    Science.gov (United States)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle costs of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information which indicates both the diagnostics of the current structural integrity and the prognostics necessary for planning and managing the future health of the structure in a cost-effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
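    As a concrete illustration of the signal-processing layer described above, the following sketch applies a Fourier transform to a simulated accelerometer trace and extracts a few scalar spectral features of the kind a downstream diagnostic classifier might consume. The sensor model, sampling rate, and feature choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def spectral_features(signal, fs):
    """Extract simple spectral features from one raw sensor channel."""
    n = len(signal)
    window = np.hanning(n)                       # taper to reduce spectral leakage
    spec = np.abs(np.fft.rfft(signal * window))  # magnitude spectrum
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    power = spec ** 2
    total = power.sum()
    centroid = (freqs * power).sum() / total     # "center of mass" of the spectrum
    peak_freq = freqs[np.argmax(power)]          # dominant frequency
    return {"peak_freq": peak_freq, "centroid": centroid, "total_power": total}

# hypothetical accelerometer trace: a 50 Hz structural mode plus noise
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
features = spectral_features(x, fs)
```

A shift in `peak_freq` or `centroid` between inspection intervals would be the kind of feature change handed to the statistical detection and fusion algorithms the paper lists.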

  8. Mathematical model of information process of protection of the social sector

    Science.gov (United States)

    Novikov, D. A.; Tsarkova, E. G.; Dubrovin, A. S.; Soloviev, A. S.

    2018-03-01

    This work investigates a mathematical model of the protection of society against the spread of extremist moods through the influence on mass consciousness of information placed in the media. The internal and external channels through which information is disseminated are identified. An optimization problem is solved, consisting of the search for the optimal strategy that uses the media most effectively for disseminating antiterrorist information at minimum financial expense. A numerical algorithm for solving the optimization problem is constructed, and the results of a computational experiment are analyzed.

  9. Human Information Processing and Supervisory Control.

    Science.gov (United States)

    1980-05-01

    errors (that is, of the output of the human operator). There is growing evidence (Senders, personal communication; Norman, personal communication...relates to the relative tendency to depend on sensory information or to be more analytic and independent. Norman (personal communication) has referred...decision process model. Ergonomics, 12, 543-557. Senders, J., Elkind, J., Grignetti, M., & Smallwood, R. 1966. An investigation of the visual sampling

  10. Information Processing and Limited Liability

    OpenAIRE

    Bartosz Mackowiak; Mirko Wiederholt

    2012-01-01

    Decision-makers often face limited liability and thus know that their loss will be bounded. We study how limited liability affects the behavior of an agent who chooses how much information to acquire and process in order to take a good decision. We find that an agent facing limited liability processes less information than an agent with unlimited liability. The informational gap between the two agents is larger in bad times than in good times and when information is more costly to process.

  11. Attachment and the processing of social information across the life span: theory and evidence.

    Science.gov (United States)

    Dykas, Matthew J; Cassidy, Jude

    2011-01-01

    Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the patterns of results that have emerged from these studies. The central proposition is that individuals who possess secure experience-based internal working models of attachment will process--in a relatively open manner--a broad range of positive and negative attachment-relevant social information. Moreover, secure individuals will draw on their positive attachment-related knowledge to process this information in a positively biased schematic way. In contrast, individuals who possess insecure internal working models of attachment will process attachment-relevant social information in one of two ways, depending on whether the information could cause the individual psychological pain. If processing the information is likely to lead to psychological pain, insecure individuals will defensively exclude this information from further processing. If, however, the information is unlikely to lead to psychological pain, then insecure individuals will process this information in a negatively biased schematic fashion that is congruent with their negative attachment-related experiences. In a comprehensive literature review, we describe studies that illustrate these patterns of attachment-related information processing from childhood to adulthood. This review focuses on studies that have examined specific components (e.g., attention and memory) and broader aspects (e.g., attributions) of social information processing. We also provide general conclusions and suggestions for future research.

  12. Repairing business process models as retrieved from source code

    NARCIS (Netherlands)

    Fernández-Ropero, M.; Reijers, H.A.; Pérez-Castillo, R.; Piattini, M.; Nurcan, S.; Proper, H.A.; Soffer, P.; Krogstie, J.; Schmidt, R.; Halpin, T.; Bider, I.

    2013-01-01

    The static analysis of source code has become a feasible solution to obtain underlying business process models from existing information systems. Due to the fact that not all information can be automatically derived from source code (e.g., consider manual activities), such business process models

  13. Information processing and dynamics in minimally cognitive agents.

    Science.gov (United States)

    Beer, Randall D; Williams, Paul L

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.
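    The information-theoretic side of such an analysis typically starts from mutual information between stimulus variables and the agent's decision. A minimal sketch, assuming a discrete joint distribution estimated from trial counts (the evolved agent model itself is not reproduced here):

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint count/probability table."""
    joint = joint / joint.sum()                       # normalize counts to probabilities
    px = joint.sum(axis=1, keepdims=True)             # marginal over X (rows)
    py = joint.sum(axis=0, keepdims=True)             # marginal over Y (columns)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (px * py))    # zero cells yield NaN, treated as 0
    return np.nansum(terms)

# hypothetical trial counts: stimulus category vs. categorization decision
perfect = np.array([[50.0, 0.0], [0.0, 50.0]])        # decision fully determined
independent = np.array([[25.0, 25.0], [25.0, 25.0]])  # decision carries no information
```

Tracking this quantity for different internal variables over time is one way task-relevant information flow through an agent can be traced.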

  14. Constructing topic models of Internet of Things for information processing.

    Science.gov (United States)

    Xin, Jie; Cui, Zhiming; Zhang, Shukui; He, Tianxu; Li, Chunhua; Huang, Haojing

    2014-01-01

    Internet of Things (IoT) is regarded as a remarkable development of modern information technology. There is abundant digital product data on the IoT, linked with multiple types of objects/entities. Those associated entities carry rich information and are usually in the form of query records. Therefore, constructing high quality topic hierarchies that can capture the term distribution of each product record enables us to better understand users' search intent and benefits tasks such as taxonomy construction, recommendation systems, and other communications solutions for the future IoT. In this paper, we propose a novel record entity topic model (RETM) for the IoT environment that is associated with a set of entities and records, and a Gibbs sampling-based algorithm is proposed to learn the model. We conduct extensive experiments on real-world datasets and compare our approach with existing methods to demonstrate the advantage of our approach.
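    RETM itself is not reproduced in the abstract, but the Gibbs sampling machinery such models build on can be sketched with a collapsed Gibbs sampler for plain LDA; RETM additionally conditions on entity and record variables, which this illustrative sketch omits.

```python
import numpy as np

def lda_gibbs(docs, n_topics, n_vocab, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for plain LDA (docs are lists of word ids)."""
    rng = np.random.default_rng(seed)
    n_dt = np.zeros((len(docs), n_topics))      # document-topic counts
    n_tw = np.zeros((n_topics, n_vocab))        # topic-word counts
    n_t = np.zeros(n_topics)                    # topic totals
    z = []                                      # topic assignment per token
    for d, doc in enumerate(docs):
        zd = rng.integers(n_topics, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            n_dt[d, k] += 1; n_tw[k, w] += 1; n_t[k] += 1
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                     # remove the current assignment
                n_dt[d, k] -= 1; n_tw[k, w] -= 1; n_t[k] -= 1
                # full conditional p(z = k | everything else)
                p = (n_dt[d] + alpha) * (n_tw[:, w] + beta) / (n_t + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = k                     # resample and restore counts
                n_dt[d, k] += 1; n_tw[k, w] += 1; n_t[k] += 1
    return n_tw                                 # topic-word counts define the topics

# tiny hypothetical "product record" corpus over a 6-word vocabulary
docs = [[0, 1, 0, 1, 2], [0, 2, 1, 0], [3, 4, 5, 3], [4, 5, 3, 4]]
topics = lda_gibbs(docs, n_topics=2, n_vocab=6)
```

An entity-aware extension in the spirit of RETM would add a sampling step for the entity variables attached to each record alongside the token-level topic resampling shown here.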

  15. Constructing Topic Models of Internet of Things for Information Processing

    Science.gov (United States)

    Xin, Jie; Cui, Zhiming; Zhang, Shukui; He, Tianxu; Li, Chunhua; Huang, Haojing

    2014-01-01

    Internet of Things (IoT) is regarded as a remarkable development of modern information technology. There is abundant digital product data on the IoT, linked with multiple types of objects/entities. Those associated entities carry rich information and are usually in the form of query records. Therefore, constructing high quality topic hierarchies that can capture the term distribution of each product record enables us to better understand users' search intent and benefits tasks such as taxonomy construction, recommendation systems, and other communications solutions for the future IoT. In this paper, we propose a novel record entity topic model (RETM) for the IoT environment that is associated with a set of entities and records, and a Gibbs sampling-based algorithm is proposed to learn the model. We conduct extensive experiments on real-world datasets and compare our approach with existing methods to demonstrate the advantage of our approach. PMID:25110737

  16. Computational spectrotemporal auditory model with applications to acoustical information processing

    Science.gov (United States)

    Chi, Tai-Shih

    A computational spectrotemporal auditory model based on neurophysiological findings in the early auditory and cortical stages is described. The model provides a unified multiresolution representation of the spectral and temporal features of sound likely critical in the perception of timbre. Several types of complex stimuli are used to demonstrate the spectrotemporal information preserved by the model. As shown by these examples, this two-stage model reflects the apparent progressive loss of temporal dynamics along the auditory pathway, from rapid phase-locking (several kHz in the auditory nerve), to moderate rates of synchrony (several hundred Hz in the midbrain), to much lower rates of modulation in the cortex (around 30 Hz). To complete this model, several projection-based reconstruction algorithms are implemented to resynthesize the sound from representations with reduced dynamics. One particular application of this model is to assess speech intelligibility. The spectrotemporal Modulation Transfer Functions (MTFs) of this model are investigated and shown to be consistent with the salient trends in the human MTFs (derived from human detection thresholds), which exhibit a lowpass function with respect to both spectral and temporal dimensions, with 50% bandwidths of about 16 Hz and 2 cycles/octave. The model is therefore used to demonstrate the potential relevance of these MTFs to the assessment of speech intelligibility in noise and reverberant conditions. Another useful feature is the phase singularity that emerges in the scale space generated by this multiscale auditory model. The singularity is shown to have certain robust properties and to carry crucial information about the spectral profile. This claim is justified by perceptually tolerable resynthesized sounds from the nonconvex singularity set. In addition, the singularity set is demonstrated to encode the pitch and formants at different scales. These properties make the singularity set very suitable for traditional
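    The cortical stage of models in this family is often approximated by 2D Gabor-like filters over a time-log-frequency representation, tuned to a temporal rate (Hz) and a spectral density (cycles/octave). The sketch below is such an approximation under assumed grid and tuning parameters, not the model's actual filter bank; the quadrature (complex) carrier makes the response phase-invariant.

```python
import numpy as np

def strf_gabor(rate_hz, scale_cpo, dt, dx, n_t=64, n_f=64):
    """A hypothetical spectrotemporal receptive field: a complex 2D Gabor
    tuned to a temporal rate (Hz) and spectral density (cycles/octave).

    dt : time step in seconds; dx : spectral step in octaves.
    """
    t = (np.arange(n_t) - n_t // 2) * dt
    x = (np.arange(n_f) - n_f // 2) * dx
    T, X = np.meshgrid(t, x, indexing="ij")
    envelope = np.exp(-(T / (2 * t.std())) ** 2 - (X / (2 * x.std())) ** 2)
    carrier = np.exp(2j * np.pi * (rate_hz * T + scale_cpo * X))
    return envelope * carrier

# correlate a synthetic moving-ripple spectrogram with matched vs. off-tuned STRFs
dt, dx = 0.005, 0.1                        # 5 ms frames, 0.1-octave bins (assumed)
strf_matched = strf_gabor(8.0, 2.0, dt, dx)
strf_off = strf_gabor(30.0, 0.5, dt, dx)   # far from the ripple's rate/scale
tt = np.arange(64) * dt
xx = np.arange(64) * dx
ripple = np.cos(2 * np.pi * (8.0 * tt[:, None] + 2.0 * xx[None, :]))
matched = np.abs((strf_matched * ripple).sum())
mismatched = np.abs((strf_off * ripple).sum())
```

A bank of such filters spanning a grid of rates and scales yields the multiresolution representation, and the filter bank's falloff with rate and scale is what an MTF measurement summarizes.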

  17. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  18. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  19. Information processing psychology: A promising paradigm for research in science teaching

    Science.gov (United States)

    Stewart, James H.; Atkin, Julia A.

    Three research paradigms, those of Ausubel, Gagné and Piaget, have received a great deal of attention in the literature of science education. In this article a fourth paradigm is presented - an information processing psychology paradigm. The article is composed of two sections. The first section describes a model of memory developed by information processing psychologists. The second section describes how such a model could be used to guide science education research on learning and problem solving. Received: 19 October 1981

  20. Aggression and Moral Development: Integrating Social Information Processing and Moral Domain Models

    Science.gov (United States)

    Arsenio, William F.; Lemerise, Elizabeth A.

    2004-01-01

    Social information processing and moral domain theories have developed in relative isolation from each other despite their common focus on intentional harm and victimization, and mutual emphasis on social cognitive processes in explaining aggressive, morally relevant behaviors. This article presents a selective summary of these literatures with…

  1. Understanding the Information Research Process of Experienced Online Information Researchers to Inform Development of a Scholars Portal

    Directory of Open Access Journals (Sweden)

    Martha Whitehead

    2009-06-01

    Full Text Available Objective - The main purpose of this study was to understand the information research process of experienced online information researchers in a variety of disciplines, to gather their ideas for improvement and, as part of this, to validate a proposed research framework for use in future development of Ontario’s Scholars Portal. Methods - This was a qualitative research study in which sixty experienced online information researchers participated in face-to-face workshops that included a collaborative design component. The sessions were conducted and recorded by usability specialists who subsequently analyzed the data and identified patterns and themes. Results - Key themes included the similarities of the information research process across all disciplines, the impact of interdisciplinarity, the social aspect of research and opportunities for process improvement. There were many specific observations regarding current and ideal processes. Implications for portal development and further research included: supporting a common process while accommodating user-defined differences; supporting citation chaining practices with new opportunities for data linkage and granularity; enhancing keyword searching with various types of intervention; exploring trusted social networks; exploring new mental models for data manipulation while retaining traditional objects; and improving citation and document management. Conclusion - The majority of researchers in the study had almost no routine in their information research processes, had developed few techniques to assist themselves and had very little awareness of the tools available to help them. There are many opportunities to aid researchers in the research process that can be explored when developing scholarly research portals. That development will be well guided by the framework ‘discover, gather, synthesize, create, share.’

  2. CONSERVATION PROCESS MODEL (CPM: A TWOFOLD SCIENTIFIC RESEARCH SCOPE IN THE INFORMATION MODELLING FOR CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    D. Fiorani

    2017-05-01

    Full Text Available The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both the BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, the applications developed so far have poorly adapted BIM to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed on decay analysis and the surfaces conservation project.

  3. How Students Learn: Information Processing, Intellectual Development and Confrontation

    Science.gov (United States)

    Entwistle, Noel

    1975-01-01

    A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning, which are not covered by the model, are discussed in relation to it: students' intellectual development and effects of individual…

  4. Contexts for concepts: Information modeling for semantic interoperability

    NARCIS (Netherlands)

    Oude Luttighuis, P.H.W.M.; Stap, R.E.; Quartel, D.

    2011-01-01

    Conceptual information modeling is a well-established practice, aimed at preparing the implementation of information systems, the specification of electronic message formats, and the design of information processes. Today's ever more connected world however poses new challenges for conceptual

  5. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be holistically approached, combining assets that support corporate systems, in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation-independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding in it the mandatory rules for attaining ISO/IEC 27001 conformance, a platform-independent model is derived. Finally, a platform-specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  6. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  7. On the correlation between process model metrics and errors

    NARCIS (Netherlands)

    Mendling, J.; Neumann, G.; Aalst, van der W.M.P.; Grundy, J.; Hartmann, S.; Laender, S.; Maciaszek, L.; Roddick, J.F.

    2007-01-01

    Business process models play an important role for the management, design, and improvement of process organizations and process-aware information systems. Despite the extensive application of process modeling in practice there are hardly empirical results available on quality aspects of process

  8. Physiological arousal in processing recognition information

    Directory of Open Access Journals (Sweden)

    Guy Hochman

    2010-07-01

    Full Text Available The recognition heuristic (RH; Goldstein and Gigerenzer, 2002) suggests that, when applicable, probabilistic inferences are based on a noncompensatory examination of whether an object is recognized or not. The overall findings on the processes that underlie this fast and frugal heuristic are somewhat mixed, and many studies have expressed the need for considering a more compensatory integration of recognition information. Regardless of the mechanism involved, it is clear that recognition has a strong influence on choices, and this finding might be explained by the fact that recognition cues arouse affect and thus receive more attention than cognitive cues. To test this assumption, we investigated whether recognition results in a direct affective signal by measuring physiological arousal (i.e., peripheral arterial tone) in the established city-size task. We found that recognition of cities does not directly result in increased physiological arousal. Moreover, the results show that physiological arousal increased with increasing inconsistency between recognition information and additional cue information. These findings support predictions derived by a compensatory Parallel Constraint Satisfaction model rather than predictions of noncompensatory models. Additional results concerning confidence ratings, response times, and choice proportions further demonstrated that recognition information and other cognitive cues are integrated in a compensatory manner.

  9. Information processing by networks of quantum decision makers

    Science.gov (United States)

    Yukalov, V. I.; Yukalova, E. P.; Sornette, D.

    2018-02-01

    We suggest a model of a multi-agent society of decision makers taking decisions based on two criteria: one is the utility of the prospects and the other is the attractiveness of the considered prospects. The model generalizes quantum decision theory, developed earlier for single decision makers realizing one-step decisions, in two principal aspects. First, several decision makers are considered simultaneously, who interact with each other through information exchange. Second, a multistep procedure is treated, when the agents exchange information many times. Several decision makers exchanging information and forming their judgment, using quantum rules, form a kind of quantum information network, where collective decisions develop in time as a result of information exchange. In addition to characterizing collective decisions that arise in human societies, such networks can describe dynamical processes occurring in artificial quantum intelligence composed of several parts or in a cluster of quantum computers. The practical usage of the theory is illustrated on the dynamic disjunction effect, for which three quantitative predictions are made: (i) the probabilistic behavior of decision makers at the initial stage of the process is described; (ii) the decrease of the difference between the initial prospect probabilities and the related utility factors is proved; (iii) the existence of a common consensus after multiple exchange of information is predicted. The predicted numerical values are in very good agreement with empirical data.

  10. Unveiling the mystery of visual information processing in human brain.

    Science.gov (United States)

    Diamant, Emanuel

    2008-08-15

    It is generally accepted that human vision is an extremely powerful information processing system that facilitates our interaction with the surrounding world. However, despite extended and extensive research efforts, which encompass many exploration fields, the underlying fundamentals and operational principles of visual information processing in human brain remain unknown. We still are unable to figure out where and how along the path from eyes to the cortex the sensory input perceived by the retina is converted into a meaningful object representation, which can be consciously manipulated by the brain. Studying the vast literature considering the various aspects of brain information processing, I was surprised to learn that the respected scholarly discussion is totally indifferent to the basic keynote question: "What is information?" in general or "What is visual information?" in particular. In the old days, it was assumed that any scientific research approach has first to define its basic departure points. Why was it overlooked in brain information processing research remains a conundrum. In this paper, I am trying to find a remedy for this bizarre situation. I propose an uncommon definition of "information", which can be derived from Kolmogorov's Complexity Theory and Chaitin's notion of Algorithmic Information. Embracing this new definition leads to an inevitable revision of traditional dogmas that shape the state of the art of brain information processing research. I hope this revision would better serve the challenging goal of human visual information processing modeling.

  11. Bayesian networks and information theory for audio-visual perception modeling.

    Science.gov (United States)

    Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis

    2010-09-01

    Thanks to their different senses, human observers acquire multiple information coming from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
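The mutual-information analysis that guides the model elicitation above can be illustrated with a small sketch. The data, variable names, and noise level below are entirely hypothetical stand-ins, not the authors' audio-visual dataset:

```python
import numpy as np

def mutual_information(x, y):
    """Mutual information I(X;Y) in bits for two discrete variables,
    estimated from the empirical joint distribution."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal P(X)
    py = joint.sum(axis=0, keepdims=True)   # marginal P(Y)
    nz = joint > 0                          # avoid log(0) on empty cells
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Toy "localization" data: the response mostly follows the audio cue,
# while the visual cue here is independent noise.
rng = np.random.default_rng(1)
audio = rng.integers(0, 2, 5000)
visual = rng.integers(0, 2, 5000)
response = audio ^ (rng.random(5000) < 0.1).astype(int)  # 10% flips
```

In a structure-elicitation pass, variable pairs with high mutual information become candidate edges of the Bayesian network, while near-zero values suggest (conditional) independence.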

  12. Process modeling and control applied to real-time monitoring of distillation processes by near-infrared spectroscopy.

    Science.gov (United States)

    de Oliveira, Rodrigo R; Pedroza, Ricardo H P; Sousa, A O; Lima, Kássio M G; de Juan, Anna

    2017-09-08

    A distillation device that acquires continuous and synchronized measurements of temperature, percentage of distilled fraction and NIR spectra has been designed for real-time monitoring of distillation processes. As a process model, synthetic commercial gasoline batches produced in Brazil, which contain mixtures of pure gasoline blended with ethanol, have been analyzed. The information provided by this device, i.e., distillation curves and NIR spectra, has served as initial information for the proposal of new strategies of process modeling and multivariate statistical process control (MSPC). Process modeling based on PCA batch analysis provided global distillation trajectories, whereas multiset MCR-ALS analysis is proposed to obtain a component-wise characterization of the distillation evolution and distilled fractions. Distillation curves, NIR spectra or compressed NIR information under the form of PCA scores and MCR-ALS concentration profiles were tested as the seed information to build MSPC models. New on-line PCA-based MSPC approaches, some inspired by local rank exploratory methods for process analysis, are proposed and work as follows: a) MSPC based on individual process observation models, where multiple local PCA models are built considering the sole information in each observation point; b) Fixed Size Moving Window MSPC, in which local PCA models are built considering a moving window of the current and few past observation points; and c) Evolving MSPC, where local PCA models are built with an increasing window of observations covering all points since the beginning of the process until the current observation. Performance of the different approaches has been assessed in terms of sensitivity to fault detection and number of false alarms. The outcome of this work will be of general use to define strategies for on-line process monitoring and control and, in a more specific way, to improve quality control of petroleum derived fuels and other substances submitted
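As a rough illustration of approach b), Fixed Size Moving Window MSPC, here is a minimal sketch in plain NumPy on synthetic data. The number of variables, window length, and injected fault are invented for illustration; the paper's models are built on real NIR spectra:

```python
import numpy as np

def moving_window_spe(X, window=20, n_pc=2):
    """Fixed Size Moving Window MSPC sketch: for each new observation,
    fit a local PCA on the previous `window` observations and compute
    the squared prediction error (SPE) of the current observation."""
    spe = []
    for t in range(window, X.shape[0]):
        W = X[t - window:t]                          # past observations only
        mu = W.mean(axis=0)
        # principal axes of the local window via SVD
        _, _, Vt = np.linalg.svd(W - mu, full_matrices=False)
        P = Vt[:n_pc].T                              # loadings (n_vars x n_pc)
        d = X[t] - mu
        r = d - P @ (P.T @ d)                        # residual off the PC plane
        spe.append(float(r @ r))
    return np.array(spe)

# Synthetic "spectra": an in-control process with a step fault at t = 80.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10)) * 0.1
X[80:] += 1.0                                        # simulated fault
scores = moving_window_spe(X)
```

A control limit would normally be derived from the in-control SPE distribution; in this toy run the SPE simply spikes when the simulated fault enters.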

  13. A descriptive model of information problem solving while using internet

    NARCIS (Netherlands)

    Brand-Gruwel, Saskia; Wopereis, Iwan; Walraven, Amber

    2009-01-01

    This paper presents the IPS-I-model: a model that describes the process of information problem solving (IPS) in which the Internet (I) is used to search information. The IPS-I-model is based on three studies, in which students in secondary and (post) higher education were asked to solve information

  14. Human machine interaction: The special role for human unconscious emotional information processing

    NARCIS (Netherlands)

    Noort, M.W.M.L. van den; Hugdahl, K.; Bosch, M.P.C.

    2005-01-01

    The nature of (un)conscious human emotional information processing remains a great mystery. On the one hand, classical models view human conscious emotional information processing as computation among the brain’s neurons but fail to address its enigmatic features. On the other hand, quantum

  15. Modelling Choice of Information Sources

    Directory of Open Access Journals (Sweden)

    Agha Faisal Habib Pathan

    2013-04-01

    Full Text Available This paper addresses the significance of traveller information sources, including mono-modal and multimodal websites, for travel decisions. The research follows a decision paradigm developed earlier, involving an information acquisition process for travel choices, and identifies the abstract characteristics of new information sources that deserve further investigation (e.g., by incorporating these in models and studying their significance in model estimation). A Stated Preference experiment is developed and the utility functions are formulated by expanding the travellers' choice set to include different combinations of sources of information. In order to study the underlying choice mechanisms, the resulting variables are examined in models based on different behavioural strategies, including utility maximisation and minimising the regret associated with the foregone alternatives. This research confirmed that RRM (Random Regret Minimisation) theory can fruitfully be used and can provide important insights for behavioural studies. The study also analyses the properties of travel planning websites and establishes a link between travel choices and the content, provenance, design, presence of advertisements, and presentation of information. The results indicate that travellers give particular credence to government-owned sources and put more importance on their own previous experiences than on any other single source of information. Information from multimodal websites is more influential than that on train-only websites. This in turn is more influential than information from friends, while information from coach-only websites is the least influential. A website with less search time, specific information on users' own criteria, and real-time information is regarded as most attractive.
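The regret-based behavioural strategy mentioned here (RRM) can be sketched in a few lines. The alternatives, attribute values, and taste weights below are toy assumptions for illustration, not the model estimated from the Stated Preference data:

```python
import numpy as np

def rrm_regret(X, beta):
    """Random Regret Minimisation sketch: regret of each alternative,
    R_i = sum_{j != i} sum_m ln(1 + exp(beta_m * (x_jm - x_im))),
    for an attribute matrix X (alternatives x attributes) and weights beta."""
    n = X.shape[0]
    R = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if j != i:
                # regret grows when alternative j beats i on attribute m
                R[i] += np.log1p(np.exp(beta * (X[j] - X[i]))).sum()
    return R

# Toy choice set: three information sources described by two attributes
# (search time in minutes, lower is better; level of detail, higher is better).
X = np.array([[5.0, 3.0],    # multimodal website
              [8.0, 3.0],    # train-only website
              [9.0, 1.0]])   # coach-only website
beta = np.array([-0.5, 0.4])
regret = rrm_regret(X, beta)
choice = int(np.argmin(regret))  # regret-minimising alternative
```

Unlike utility maximisation, the regret of an alternative depends on pairwise comparisons against every foregone alternative, which is what gives RRM its compromise-favouring behaviour.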

  16. Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE

    Science.gov (United States)

    Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.

    2016-08-01

    Introduction: business processes are gradually becoming a tool that allows organizations to engage employees at a new level and to make document management systems more efficient. Most of the main work, and the largest number of publications, concerns these directions. However, business processes are still poorly implemented in public institutions, where it is very difficult to formalize the main existing processes. We attempt to build a system of business processes for a state agency, the Russian Social Insurance Fund (SIF), where virtually all processes, given different inputs, have the same output: a public service. The parameters of state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the SIF, formulates the requirements for its business processes, justifies the choice of software for modeling business processes, presents models built in the Runa WFE system, and optimizes a model of one of the main business processes of the SIF. The result of the work in Runa WFE is an optimized model of this business process.

  17. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    OpenAIRE

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning...

  18. Random Matrices for Information Processing – A Democratic Vision

    DEFF Research Database (Denmark)

    Cakmak, Burak

    The thesis studies three important applications of random matrices to information processing. Our main contribution is that we consider probabilistic systems involving more general random matrix ensembles than the classical ensembles with iid entries, i.e. models that account for statistical dependence between the entries. Specifically, the involved matrices are invariant or fulfill a certain asymptotic freeness condition as their dimensions grow to infinity. Informally speaking, all latent variables contribute to the system model in a democratic fashion – there are no preferred latent variables...

  19. Utility-based early modulation of processing distracting stimulus information.

    Science.gov (United States)

    Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas

    2014-12-10

    Humans are selective information processors who efficiently prevent goal-inappropriate stimulus information to gain control over their actions. Nonetheless, stimuli, which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors"), frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases we applied EEG recording in a stimulus identification task, involving successive distractor-target presentation, and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information.

  20. The Practice of Information Processing Model in the Teaching of Cognitive Strategies

    Science.gov (United States)

    Ozel, Ali

    2009-01-01

    In this research, we attempt to find out how the teaching of learning strategies differs with the amount of time that first-grade primary school teachers spend forming an information-processing framework in students. The process, which includes the efforts of 260 teachers in this direction, considers whether the adequate…

  1. A non-linear model of information seeking behaviour

    Directory of Open Access Journals (Sweden)

    Allen E. Foster

    2005-01-01

    Full Text Available The results of a qualitative, naturalistic study of information seeking behaviour are reported in this paper. The study applied the methods recommended by Lincoln and Guba for maximising credibility, transferability, dependability, and confirmability in data collection and analysis. Sampling combined purposive and snowball methods, and led to a final sample of 45 inter-disciplinary researchers from the University of Sheffield. In-depth semi-structured interviews were used to elicit detailed examples of information seeking. Coding of interview transcripts took place in multiple iterations over time and used Atlas-ti software to support the process. The results of the study are represented in a non-linear Model of Information Seeking Behaviour. The model describes three core processes (Opening, Orientation, and Consolidation) and three levels of contextual interaction (Internal Context, External Context, and Cognitive Approach), each composed of several individual activities and attributes. The interactivity and shifts described by the model show information seeking to be non-linear, dynamic, holistic, and flowing. The paper concludes by describing the whole model of behaviours as analogous to an artist's palette, in which activities remain available throughout information seeking. A summary of key implications of the model and directions for further research are included.

  2. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    Science.gov (United States)

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  3. Infochemistry Information Processing at the Nanoscale

    CERN Document Server

    Szacilowski, Konrad

    2012-01-01

    Infochemistry: Information Processing at the Nanoscale, defines a new field of science, and describes the processes, systems and devices at the interface between chemistry and information sciences. The book is devoted to the application of molecular species and nanostructures to advanced information processing. It includes the design and synthesis of suitable materials and nanostructures, their characterization, and finally applications of molecular species and nanostructures for information storage and processing purposes. Divided into twelve chapters; the first three chapters serve as an int

  4. Animated-simulation modeling facilitates clinical-process costing.

    Science.gov (United States)

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
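The advantage of reporting a cost range rather than a single point estimate can be reproduced with a simple Monte Carlo sketch. The process steps, durations, and cost rates below are invented for illustration; animated-simulation tools layer interaction and visualization on top of this basic idea:

```python
import numpy as np

# Hypothetical clinical process: triage -> exam -> lab work, with
# uncertain per-step durations (minutes) drawn from triangular
# distributions (min, mode, max) and fixed staff cost rates ($/min).
rng = np.random.default_rng(42)
n_runs = 10_000

triage = rng.triangular(3, 5, 10, n_runs)
exam = rng.triangular(10, 15, 30, n_runs)
lab = rng.triangular(5, 20, 60, n_runs)

cost = triage * 0.8 + exam * 2.5 + lab * 1.2   # illustrative cost rates

low, high = np.percentile(cost, [5, 95])        # a cost range for decisions
point_estimate = float(cost.mean())             # the spreadsheet-style answer
```

A spreadsheet model would typically report only `point_estimate`; the simulated `[low, high]` interval conveys the variability that decision makers actually face.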

  5. AIM for Allostery: Using the Ising Model to Understand Information Processing and Transmission in Allosteric Biomolecular Systems.

    Science.gov (United States)

    LeVine, Michael V; Weinstein, Harel

    2015-05-01

    In performing their biological functions, molecular machines must process and transmit information with high fidelity. Information transmission requires dynamic coupling between the conformations of discrete structural components within the protein positioned far from one another on the molecular scale. This type of biomolecular "action at a distance" is termed allostery. Although allostery is ubiquitous in biological regulation and signal transduction, its treatment in theoretical models has mostly eschewed quantitative descriptions involving the system's underlying structural components and their interactions. Here, we show how Ising models can be used to formulate an approach to allostery in a structural context of interactions between the constitutive components by building simple allosteric constructs we termed Allosteric Ising Models (AIMs). We introduce the use of AIMs in analytical and numerical calculations that relate thermodynamic descriptions of allostery to the structural context, and then show that many fundamental properties of allostery, such as the multiplicative property of parallel allosteric channels, are revealed from the analysis of such models. The power of exploring mechanistic structural models of allosteric function in more complex systems by using AIMs is demonstrated by building a model of allosteric signaling for an experimentally well-characterized asymmetric homodimer of the dopamine D2 receptor.
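A minimal flavor of such an Ising treatment: for just two coupled sites, direct enumeration of the Boltzmann weights gives the inter-site correlation, which grows with the coupling J. This two-spin toy is my own illustration, not one of the AIM constructs analyzed in the paper:

```python
import numpy as np
from itertools import product

def site_correlation(J, h=0.0, beta=1.0):
    """Equilibrium correlation <s1*s2> for a two-spin Ising system with
    energy E(s1, s2) = -J*s1*s2 - h*(s1 + s2), by direct enumeration."""
    states = list(product([-1, 1], repeat=2))
    w = np.array([np.exp(-beta * (-J * s1 * s2 - h * (s1 + s2)))
                  for s1, s2 in states])          # Boltzmann weights
    p = w / w.sum()                               # normalized probabilities
    return float(sum(pi * s1 * s2 for pi, (s1, s2) in zip(p, states)))
```

For two spins with h = 0, the enumeration reproduces the closed form <s1*s2> = tanh(beta*J), so the coupling strength directly sets how much the state of one site tells you about the other, i.e. how much information the "allosteric channel" transmits.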

  6. Kuhlthau’s Classic Research on the Information Search Process (ISP) Provides Evidence for Information Seeking as a Constructivist Process. A review of: Kuhlthau, Carol C. “Inside the Search Process: Information Seeking from the User's Perspective.” Journal of the American Society for Information Science 42.5 (1991): 361‐71.

    Directory of Open Access Journals (Sweden)

    Shelagh K. Genuis

    2007-12-01

    Full Text Available Objective – To extend understanding of purposeful information seeking and to present a model of the information search process (ISP) from the perspective of the user. Design – Review of theoretical foundation, summing up of qualitative and quantitative data from a series of five foundational studies, and presentation of the ISP model. Setting – Summarised research was conducted primarily in high school and college environments where subjects were investigating an assigned topic. A small proportion of public libraries were used in the fifth study within the reviewed series. Subjects – The ISP model as presented in this ‘classic’ article is based on studies involving a total of 558 participants. The first study involved 26 academically advanced high school seniors, and the 2 subsequent studies involved respectively 20 and 4 of the original participants following their completion of 4 years of college. The final 2 studies involved respectively 147 high, middle and low achieving high school seniors, and 385 academic, public and school library users. Methods – This paper presents the foundation for the ISP model by reviewing the relationship between Kelly’s personal construct theory; Belkin, Brooks, and Oddy’s investigation of cognitive aspects of the constructive information seeking process; and Taylor’s work on levels of information need (“Question‐negotiation”) and value‐added information (“Value added”). This is followed by a review of Kuhlthau’s five foundational studies, which investigated the common information seeking experiences of users who were seeking to expand knowledge related to a particular topic or problem. The first of these studies was a small‐scale exploration in which participants were given two assignments. Questionnaires, journaling, search logs, and reflective writing were used to collect data throughout the process of assignment completion. Data collection was augmented by case studies involving in

  7. Quantum information processing

    National Research Council Canada - National Science Library

    Leuchs, Gerd; Beth, Thomas

    2003-01-01

    Contents (excerpt): 1.5 Simulation of Hamiltonians … References … 2 Quantum Information Processing and Error Correction with Jump Codes (G. Alber, M. Mussinger…)

  8. Talk as a Metacognitive Strategy during the Information Search Process of Adolescents

    Science.gov (United States)

    Bowler, Leanne

    2010-01-01

    Introduction: This paper describes a metacognitive strategy related to the social dimension of the information search process of adolescents. Method: A case study that used naturalistic methods to explore the metacognitive thinking and associated emotions of ten adolescents. The study was framed by Kuhlthau's Information Search Process model and…

  9. Information Memory Processing and Retrieval: The Use of Information Theory to Study Primacy and Recency Characteristics of Ninth Grade Science Students Processing Learning Tasks.

    Science.gov (United States)

    Dunlop, David L.

    Reported is another study related to the Project on an Information Memory Model. This study involved using information theory to investigate the concepts of primacy and recency as they were exhibited by ninth-grade science students while processing a biological sorting problem and an immediate, abstract recall task. Two hundred randomly selected…

  10. Information behavior versus communication: application models in multidisciplinary settings

    Directory of Open Access Journals (Sweden)

    Cecília Morena Maria da Silva

    2015-05-01

    Full Text Available This paper deals with information behavior as support for models of communication design in the areas of Information Science, Library and Music. The proposition of communication models is based on the models of Tubbs and Moss (2003), Garvey and Griffith (1972), adapted by Hurd (1996), and Wilson (1999). Therefore, the questions arose: (i) what are the informational skills required of librarians who act as mediators in the scholarly communication process, and what is the informational behavior of users in the educational environment?; (ii) what are the needs of music-related researchers, and how do they produce, seek, use and access the scientific knowledge of their area?; and (iii) how do the contexts involved in scientific collaboration processes influence the scientific production of the information science field in Brazil? The article includes a literature review on information behavior and its insertion in scientific communication, considering the influence of the context and/or situation of the objects involved in the motivating issues. The hypothesis is that user information behavior in different contexts and situations influences the definition of a scientific communication model. Finally, it is concluded that the same concept or set of concepts can be used in different perspectives, thus reaching different results.

  11. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Full Text Available Over the last three decades, China’s agriculture sector has been transformed from the traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs). Information processing and dissemination have played a critical role in this transformation process. Many studies in relation to agriculture information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT based information dissemination models in China and to share the knowledge and experience in applying emerging ICTs in disseminating agriculture information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China’s agricultural information dissemination systems and different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed. Success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China’s experience and best practice in their endeavor of applying emerging ICTs in agriculture information dissemination and knowledge transfer.

  12. The Evolution Process on Information Technology Outsourcing Relationship

    OpenAIRE

    Duan Weihua

    2017-01-01

    Information technology outsourcing relationship is one of the key issues for IT outsourcing success. To explore how to manage and promote IT outsourcing relationships, it is necessary to understand their evolution process. Firstly, the types of IT outsourcing based on relationship quality and IT outsourcing project level are analyzed; secondly, two evolution process models of IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT ou...

  13. The combined effects of self-referent information processing and ruminative responses on adolescent depression.

    Science.gov (United States)

    Black, Stephanie Winkeljohn; Pössel, Patrick

    2013-08-01

    Adolescents who develop depression have worse interpersonal and affective experiences and are more likely to develop substance problems and/or suicidal ideation compared to adolescents who do not develop depression. This study examined the combined effects of negative self-referent information processing and rumination (i.e., brooding and reflection) on adolescent depressive symptoms. It was hypothesized that the interaction of negative self-referent information processing and brooding would significantly predict depressive symptoms, while the interaction of negative self-referent information processing and reflection would not predict depressive symptoms. Adolescents (n = 92; 13-15 years; 34.7% female) participated in a 6-month longitudinal study. Self-report instruments measured depressive symptoms and rumination; a cognitive task measured information processing. Path modelling in Amos 19.0 analyzed the data. The interaction of negative information processing and brooding significantly predicted an increase in depressive symptoms 6 months later. The interaction of negative information processing and reflection did not significantly predict depression; however, the model did not meet a priori standards for accepting the null hypothesis. Results suggest clinicians working with adolescents at risk for depression should consider focusing on the reduction of brooding and negative information processing to reduce long-term depressive symptoms.

  14. PREFACE: Quantum information processing

    Science.gov (United States)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  15. Social Information Processing in Deaf Adolescents

    Science.gov (United States)

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2016-01-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…

  16. Modeling interdependencies between business and communication processes in hospitals.

    Science.gov (United States)

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, no tools are available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach that facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  17. Real-time information and processing system for radiation protection

    International Nuclear Information System (INIS)

    Oprea, I.; Oprea, M.; Stoica, M.; Badea, E.; Guta, V.

    1999-01-01

    The real-time information and processing system has as its main task to record, collect, process and transmit radiation levels and weather data; it is proposed for radiation protection, environmental monitoring around nuclear facilities, and civil defence. Such a system can provide mapping, a data base, modelling and communication in order to assess the consequences of nuclear accidents. The system incorporates a number of stationary or mobile radiation monitoring devices, a weather parameter measuring station, a GIS-based information processing center and the communication network, all running on a real-time operating system. It provides automatic on-line and off-line data collection, remote diagnostics, and advanced presentation techniques, including a graphically oriented executive support, which can respond to an emergency by geographical representation of the hazard zones on the map. The system can be integrated into national or international environmental monitoring systems, being based on local intelligent measuring and transmission units, with simultaneous processing and data presentation using a real-time operating system for PC and a geographical information system (GIS). Such an integrated system is composed of independent applications operating on the same computer, which can improve the protection of the population and support decision makers' efforts by updating the remote GIS data base. All information can be managed directly from the map by multilevel data retrieval and presentation, using on-line dynamic evolution of events, environment information, evacuation optimization, and image and voice processing.

  18. Links between attachment and social information processing: examination of intergenerational processes.

    Science.gov (United States)

    Dykas, Matthew J; Ehrlich, Katherine B; Cassidy, Jude

    2011-01-01

    This chapter describes theory and research on intergenerational connections between parents' attachment and children's social information processing, as well as between parents' social information processing and children's attachment. The chapter begins with a discussion of attachment theorists' early insights into the role that social information processing plays in attachment processes. Next, current theory about the mechanisms through which cross-generational links between attachment and social information processing might emerge is presented. The central proposition is that the quality of attachment and/or the social information processing of the parent contributes to the quality of attachment and/or social information processing in the child, and these links emerge through mediating processes related to social learning, open communication, gate-keeping, emotion regulation, and joint attention. A comprehensive review of the literature is then presented. The chapter ends with the presentation of a current theoretical perspective and suggestions for future empirical and clinical endeavors.

  19. An evaluation of the coping patterns of rape victims: integration with a schema-based information-processing model.

    Science.gov (United States)

    Littleton, Heather

    2007-08-01

    The current study sought to provide an expansion of Resick and Schnicke's information-processing model of interpersonal violence response. Their model posits that interpersonal violence threatens victims' schematic beliefs and that victims can resolve this threat through assimilation, accommodation, or overaccommodation. In addition, it is hypothesized that how victims resolve schematic threat affects their coping strategies. To test this hypothesis, a cluster analysis of rape victims' coping patterns was conducted. Victims' coping patterns were related to distress, self-worth, and rape label in ways consistent with predictions. Thus, future research should focus on the implications of how victims integrate trauma with schemas.

  20. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  1. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  2. Towards the understanding of network information processing in biology

    Science.gov (United States)

    Singh, Vijay

    Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from the first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can be identified using coarse graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse graining approaches identify features that are essential for certain processes performed by underlying biological networks. We find that long-range connections in the brain allow for global scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with temporal scales. Our observations indicate that the rules in multivariate signal processing are quite different from traditional single unit signal processing.

  3. Multidimensional biochemical information processing of dynamical patterns.

    Science.gov (United States)

    Hasegawa, Yoshihiko

    2018-02-01

    Cells receive signaling molecules by receptors and relay information via sensory networks so that they can respond properly depending on the type of signal. Recent studies have shown that cells can extract multidimensional information from dynamical concentration patterns of signaling molecules. We herein study how biochemical systems can process multidimensional information embedded in dynamical patterns. We model the decoding networks by linear response functions, and optimize the functions with the calculus of variations to maximize the mutual information between patterns and output. We find that, when the noise intensity is lower, decoders with different linear response functions, i.e., distinct decoders, can extract much information. However, when the noise intensity is higher, distinct decoders do not provide the maximum amount of information. This indicates that, when transmitting information by dynamical patterns, embedding information in multiple patterns is not optimal when the noise intensity is very large. Furthermore, we explore the biochemical implementations of these decoders using control theory and demonstrate that these decoders can be implemented biochemically through the modification of cascade-type networks, which are prevalent in actual signaling pathways.
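The information-maximization idea in this abstract can be illustrated with the standard linear-Gaussian channel, for which the mutual information has a closed form. The sketch below is illustrative only, not the authors' variational optimization of linear response functions; the choice of two pattern dimensions and the variance values are hypothetical assumptions.

```python
import math

def gaussian_mi(signal_var, noise_var):
    """Mutual information (bits) of a linear-Gaussian channel y = s + noise,
    with s ~ N(0, signal_var) and noise ~ N(0, noise_var)."""
    return 0.5 * math.log2(1.0 + signal_var / noise_var)

# Compare encoding one pattern dimension per decoder (two distinct decoders,
# i.e., two independent channels) against pooling all signal power into one.
for noise_var in (0.1, 10.0):
    mi_two_decoders = 2 * gaussian_mi(1.0, noise_var)
    mi_one_decoder = gaussian_mi(2.0, noise_var)
    print(f"noise={noise_var}: distinct={mi_two_decoders:.3f} bits, "
          f"pooled={mi_one_decoder:.3f} bits")
```

At low noise the distinct decoders extract substantially more information; as the noise variance grows, their advantage shrinks towards zero, consistent with the abstract's observation that embedding information in multiple patterns pays off less when the noise intensity is large.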

  4. A Participatory Model for Multi-Document Health Information Summarisation

    Directory of Open Access Journals (Sweden)

    Dinithi Nallaperuma

    2017-03-01

    Increasing availability of and access to health information has been a paradigm shift in healthcare provision, as it empowers patients and practitioners alike. Besides awareness, significant time savings and process efficiencies can be achieved through effective summarisation of healthcare information. Relevance and accuracy are key concerns when generating summaries for such documents. Despite advances in automated summarisation approaches, the role of participation has not been explored. In this paper, we propose a new model for multi-document health information summarisation that takes into account the role of participation. The updated IS user participation theory was extended to explicate these roles. The proposed model integrates both extractive and abstractive summarisation processes with continuous participatory inputs to each phase. The model was implemented as a client-server application and evaluated by both domain experts and health information consumers. Results from the evaluation phase indicate that the model is successful in generating relevant and accurate summaries for diverse audiences.

  5. Mechanisms of placebo analgesia: A dual-process model informed by insights from cross-species comparisons.

    Science.gov (United States)

    Schafer, Scott M; Geuter, Stephan; Wager, Tor D

    2018-01-01

    Placebo treatments are pharmacologically inert, but are known to alleviate symptoms across a variety of clinical conditions. Associative learning and cognitive expectations both play important roles in placebo responses, however we are just beginning to understand how interactions between these processes lead to powerful effects. Here, we review the psychological principles underlying placebo effects and our current understanding of their brain bases, focusing on studies demonstrating both the importance of cognitive expectations and those that demonstrate expectancy-independent associative learning. To account for both forms of placebo analgesia, we propose a dual-process model in which flexible, contextually driven cognitive schemas and attributions guide associative learning processes that produce stable, long-term placebo effects. According to this model, the placebo-induction paradigms with the most powerful effects are those that combine reinforcement (e.g., the experience of reduced pain after placebo treatment) with suggestions and context cues that disambiguate learning by attributing perceived benefit to the placebo. Using this model as a conceptual scaffold, we review and compare neurobiological systems identified in both human studies of placebo analgesia and behavioral pain modulation in rodents. We identify substantial overlap between the circuits involved in human placebo analgesia and those that mediate multiple forms of context-based modulation of pain behavior in rodents, including forebrain-brainstem pathways and opioid and cannabinoid systems in particular. This overlap suggests that placebo effects are part of a set of adaptive mechanisms for shaping nociceptive signaling based on its information value and anticipated optimal response in a given behavioral context. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Information Processing in Adolescents with Bipolar I Disorder

    Science.gov (United States)

    Whitney, Jane; Joormann, Jutta; Gotlib, Ian H.; Kelley, Ryan G.; Acquaye, Tenah; Howe, Meghan; Chang, Kiki D.; Singh, Manpreet K.

    2012-01-01

    Background: Cognitive models of bipolar I disorder (BD) may aid in identification of children who are especially vulnerable to chronic mood dysregulation. Information-processing biases related to memory and attention likely play a role in the development and persistence of BD among adolescents; however, these biases have not been extensively…

  7. A Model for an Electronic Information Marketplace

    Directory of Open Access Journals (Sweden)

    Wei Ge

    2005-11-01

    As the information content on the Internet increases, the task of locating desired information and assessing its quality becomes increasingly difficult. This development makes users more willing to pay for information that is focused on specific issues, verifiable, and available upon request. Thus, the nature of the Internet opens up the opportunity for information trading. In this context, the Internet can be used not only to close the transaction, but also to deliver the product - the desired information - to the user. Early attempts to implement such business models have fallen short of expectations. In this paper, we discuss the limitations of such practices and present a modified business model for information trading, which uses a reverse auction approach together with a multiple-buyer price discovery process.
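The reverse-auction mechanism mentioned in the abstract can be sketched in a few lines: a buyer posts a request with a reserve price, sellers underbid one another, and the lowest qualifying bid wins. This is a generic sketch of a reverse auction, not the paper's actual protocol; the `Bid` type, seller names, and prices are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Bid:
    seller: str
    price: float

def reverse_auction(bids, reserve_price):
    """Buyer-initiated auction: the lowest seller bid at or below the
    buyer's reserve price wins; returns None if no bid qualifies."""
    qualifying = [b for b in bids if b.price <= reserve_price]
    return min(qualifying, key=lambda b: b.price, default=None)

bids = [Bid("A", 12.0), Bid("B", 8.5), Bid("C", 15.0)]
winner = reverse_auction(bids, reserve_price=10.0)
print(winner)  # Bid(seller='B', price=8.5)
```

A multiple-buyer variant would run one such auction per information request, letting sellers observe the distribution of reserve prices during price discovery.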

  8. Task-specific visual cues for improving process model understanding

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Context Business process models support various stakeholders in managing business processes and designing process-aware information systems. In order to make effective use of these models, they have to be readily understandable. Objective Prior research has emphasized the potential of visual cues to

  9. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

    The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed, given the complexity and differentiation of existing methods as well as the specific language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have found practical application in visualizing retailers' activity are studied. The theoretical analysis of modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes due to its integrated systemology capabilities. A visualized simulation model of the business process "sales as-is" of retailers was designed using a combination of UFO elements, with the aim of further practical formalization and optimization of the given business process.

  10. Levy Random Bridges and the Modelling of Financial Information

    OpenAIRE

    Hoyle, Edward; Hughston, Lane P.; Macrina, Andrea

    2009-01-01

    The information-based asset-pricing framework of Brody, Hughston and Macrina (BHM) is extended to include a wider class of models for market information. In the BHM framework, each asset is associated with a collection of random cash flows. The price of the asset is the sum of the discounted conditional expectations of the cash flows. The conditional expectations are taken with respect to a filtration generated by a set of "information processes". The information processes carry imperfect inf...

  11. Process information systems in nuclear reprocessing

    International Nuclear Information System (INIS)

    Jaeschke, A.; Keller, H.; Orth, H.

    1987-01-01

    On the production management level, a process information system in a nuclear reprocessing plant (NRP) has to fulfil conventional operating functions as well as functions for nuclear material surveillance (safeguards). Given today's state of the art in on-line process control, progress in hardware and software technology allows more process-specific intelligence to be introduced into process information systems. Using an expert-system-aided laboratory management system as a component of an NRP process information system as an example, the paper demonstrates that these technologies can already be applied. (DG) [de

  12. Modelling the ICE standard with a formal language for information commerce

    OpenAIRE

    Wombacher, A.; Aberer, K.

    2001-01-01

    Automatizing information commerce requires languages to represent the typical information commerce processes. Existing languages and standards cover either only very specific types of business models or are too general to capture in a concise way the specific properties of information commerce processes. We introduce a language that is specifically designed for information commerce. It can be directly used for the implementation of the processes and communication required in information comme...

  13. Models of organisation and information system design | Mohamed ...

    African Journals Online (AJOL)

    We devote this paper to models of organisation, and consider which is best suited to provide a basis for information processing and transmission. In this respect we deal with four models of organisation, namely: the classical model, the behavioural model, the systems model and the cybernetic model of ...

  14. An Empirical Study of the Volkswagen Crisis in China: Customers' Information Processing and Behavioral Intentions.

    Science.gov (United States)

    Wei, Jiuchang; Zhao, Ming; Wang, Fei; Cheng, Peng; Zhao, Dingtao

    2016-01-01

    Product-harm crises usually lead to product recalls, which may cause consumers concern about product quality and safety. This study systematically examines customers' immediate responses to the Volkswagen product recall crisis in China. Particular attention was given to how customers' responses to the risk information influenced their behavioral intentions. By combining the protective action decision model and the heuristic-systematic model, we constructed a hypothetical model to explore this issue. A questionnaire survey was conducted to collect data from 467 participants drawn from the customers of Volkswagen. We used structural equation modeling to explore the model. The results show that customers' product knowledge plays an important role in their responses to the crisis. Having more knowledge made them perceive a lower risk, but they might need even more information, making them more likely to seek and process information, and subsequently increasing their positive behavioral intentions toward the firm (that is, pro-firm behavioral intentions). Risk perception increased customers' information needs, information seeking, and information processing but decreased their pro-firm behavioral intentions. In addition to promoting information seeking, information need also facilitated customers' systematic processing and thus increased their behavioral intentions to take corrective action. Customers' behavioral intentions were also spurred by systematic processing, but were not predicted by information seeking. Theoretical and practical implications and suggestions for further research are also discussed. © 2015 Society for Risk Analysis.

  15. Modelling the Replication Management in Information Systems

    Directory of Open Access Journals (Sweden)

    Cezar TOADER

    2017-01-01

    In the modern economy, the benefits of Web services are significant because they facilitate the automation of activities within Internet-distributed businesses, as well as cooperation between organizations through interconnection processes running in their computer systems. This paper presents the development stages of a model for a reliable information system. It describes the communication between processes within the distributed system, based on message exchange, and also presents the problem of distributed agreement among processes. A list of objectives for fault-tolerant systems is defined and a framework model for distributed systems is proposed. This framework distinguishes between management operations and execution operations. The proposed model promotes the use of a central process especially designed for the coordination and control of the other application processes. The execution phases and the protocols for the management and execution components are presented. This model of a reliable system could be a foundation for an entire class of distributed system models based on the management of the replication process.

  16. original papers : Returns to scale in one-shot information processing when hours count

    OpenAIRE

    Catherine de Fontenay; Kieron J. Meagher

    2001-01-01

    The decentralized information processing approach pioneered by Radner and Van Zandt endogenously determines the optimal hierarchy for decision making within an organization. The simplest information processing model is the one-shot problem (one set of information to process) which serves as the testing ground for ever richer descriptions of managers and their tasks. Meagher and Van Zandt observed that an hours-based measure should be used for calculating managerial costs rather than the fixed...

  17. Conceptual information processing: A robust approach to KBS-DBMS integration

    Science.gov (United States)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  18. Group creativity and innovation: a motivated information processing perspective

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Nijstad, B.A.; Bechtoldt, M.N.; Baas, M.

    2011-01-01

    The authors review the Motivated Information Processing in Groups Model (De Dreu, Nijstad, & Van Knippenberg, 2008) to understand group creativity and innovation. Although distinct phenomena, group creativity and innovation are both considered a function of epistemic motivation (EM; the degree to

  19. Promoting information diffusion through interlayer recovery processes in multiplex networks

    Science.gov (United States)

    Wang, Xin; Li, Weihua; Liu, Longzhao; Pei, Sen; Tang, Shaoting; Zheng, Zhiming

    2017-09-01

    For information diffusion in multiplex networks, the effect of interlayer contagion on spreading dynamics has been explored in different settings. Nevertheless, the impact of interlayer recovery processes, i.e., the transition of nodes to stiflers in all layers after they become stiflers in any layer, still remains unclear. In this paper, we propose a modified ignorant-spreader-stifler model of rumor spreading equipped with an interlayer recovery mechanism. We find that the information diffusion can be effectively promoted for a range of interlayer recovery rates. By combining the mean-field approximation and the Markov chain approach, we derive the evolution equations of the diffusion process in two-layer homogeneous multiplex networks. The optimal interlayer recovery rate that achieves the maximal enhancement can be calculated by solving the equations numerically. In addition, we find that the promoting effect on a certain layer can be strengthened if information spreads more extensively within the counterpart layer. When applying the model to two-layer scale-free multiplex networks, with or without degree correlation, similar promoting effect is also observed in simulations. Our work indicates that the interlayer recovery process is beneficial to information diffusion in multiplex networks, which may have implications for designing efficient spreading strategies.
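A toy Monte Carlo version of ignorant-spreader-stifler dynamics with an interlayer recovery rate can make the mechanism concrete. The sketch below uses a well-mixed population per layer rather than the paper's multiplex network topology and mean-field equations; all parameter names (`beta`, `delta`, `gamma`) and values are illustrative assumptions.

```python
import random

I, S, R = 0, 1, 2  # ignorant, spreader, stifler

def simulate(n=500, layers=2, beta=0.3, delta=0.1, gamma=0.5, steps=50, k=4, seed=1):
    """Toy ISR rumor dynamics on a two-layer multiplex: each node holds a
    state per layer; with probability gamma, a node that becomes a stifler
    in one layer turns into a stifler in all layers (interlayer recovery).
    Returns the final fraction of node-layer states reached by the rumor."""
    rng = random.Random(seed)
    state = [[I] * n for _ in range(layers)]
    for l in range(layers):
        state[l][0] = S  # seed one spreader per layer
    for _ in range(steps):
        for l in range(layers):
            for u in range(n):
                if state[l][u] != S:
                    continue
                for _ in range(k):  # k random contacts in a well-mixed layer
                    v = rng.randrange(n)
                    if state[l][v] == I and rng.random() < beta:
                        state[l][v] = S
                if rng.random() < delta:  # spontaneous transition to stifler
                    state[l][u] = R
                    if rng.random() < gamma:  # interlayer recovery
                        for m in range(layers):
                            state[m][u] = R
    reached = sum(s != I for layer in state for s in layer)
    return reached / (layers * n)

print(simulate())
```

Sweeping `gamma` while recording the final reach allows the kind of parameter scan the authors perform numerically to locate an interlayer recovery rate that maximally enhances diffusion.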

  20. A Model for Information

    Directory of Open Access Journals (Sweden)

    Paul Walton

    2014-09-01

    This paper uses an approach drawn from the ideas of computer systems modelling to produce a model for information itself. The model integrates evolutionary, static and dynamic views of information and highlights the relationship between symbolic content and the physical world. The model includes what information technology practitioners call "non-functional" attributes, which, for information, include information quality and information friction. The concepts developed in the model enable a richer understanding of Floridi's questions "what is information?" and "the informational circle: how can information be assessed?" (which he numbers P1 and P12).

  1. Employee Communication during Crises: The Effects of Stress on Information Processing.

    Science.gov (United States)

    Pincus, J. David; Acharya, Lalit

    Based on multidisciplinary research findings, this report proposes an information processing model of employees' response to highly stressful information environments arising during organizational crises. The introduction stresses the importance of management's handling crisis communication with employees skillfully. The second section points out…

  2. [Effects of an implicit internal working model on attachment in information processing assessed using Go/No-Go Association Task].

    Science.gov (United States)

    Fujii, Tsutomu; Uebuchi, Hisashi; Yamada, Kotono; Saito, Masahiro; Ito, Eriko; Tonegawa, Akiko; Uebuchi, Marie

    2015-06-01

    The purposes of the present study were (a) to use both a relational-anxiety Go/No-Go Association Task (GNAT) and an avoidance-of-intimacy GNAT in order to assess an implicit Internal Working Model (IWM) of attachment, and (b) to verify the effects of the measured implicit relational anxiety and implicit avoidance of intimacy on information processing. The implicit IWM measured by the GNAT differed from the explicit IWM measured by questionnaires in terms of its effects on information processing. In particular, in subliminal priming tasks involving others, implicit avoidance of intimacy predicted accelerated response times for negative stimulus words about attachment. Moreover, after subliminal priming with stimulus words about the self, implicit relational anxiety predicted delayed response times for negative stimulus words about attachment.

  3. A Social Information Processing Approach to Job Attitudes and Task Design

    Science.gov (United States)

    Salancik, Gerald R.; Pfeffer, Jeffrey

    1978-01-01

    In comparison with need-satisfaction and expectancy models of job attitudes and motivation, the social information processing perspective emphasizes the effects of context and the consequences of past choices, rather than individual predispositions and rational decision-making processes. (Author)

  4. Models of neural networks temporal aspects of coding and information processing in biological systems

    CERN Document Server

    Hemmen, J; Schulten, Klaus

    1994-01-01

    Since the appearance of Vol. 1 of Models of Neural Networks in 1991, the theory of neural nets has focused on two paradigms: information coding through coherent firing of the neurons and functional feedback. Information coding through coherent neuronal firing exploits time as a cardinal degree of freedom. This capacity of a neural network rests on the fact that the neuronal action potential is a short, say 1 ms, spike, localized in space and time. Spatial as well as temporal correlations of activity may represent different states of a network. In particular, temporal correlations of activity may express that neurons process the same "object" of, for example, a visual scene by spiking at the very same time. The traditional description of a neural network through a firing rate, the famous S-shaped curve, presupposes a wide time window of, say, at least 100 ms. It thus fails to exploit the capacity to "bind" sets of coherently firing neurons for the purpose of both scene segmentation and figure-ground segregatio...

  5. Safety, Liveness and Run-time Refinement for Modular Process-Aware Information Systems with Dynamic Sub Processes

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    We study modularity, run-time adaptation and refinement under safety and liveness constraints in event-based process models with dynamic sub-process instantiation. The study is part of a larger programme to provide semantically well-founded technologies for modelling, implementation and verification of flexible, run-time adaptable process-aware information systems, moved into practice via the Dynamic Condition Response (DCR) Graphs notation co-developed with our industrial partner. Our key contributions are: (1) a formal theory of dynamic sub-process instantiation for declarative, event-based processes under safety and liveness constraints, given as the DCR* process language, equipped with a compositional operational semantics and conservatively extending the DCR Graphs notation; (2) an expressiveness analysis revealing that the DCR* process language is Turing-complete, while the fragment cor...

  6. Modeling information diffusion in time-varying community networks

    Science.gov (United States)

    Cui, Xuelian; Zhao, Narisa

    2017-12-01

    Social networks are rarely static, and they typically have time-varying network topologies. A great number of studies have modeled temporal networks and explored social contagion processes within these models; however, few of these studies have considered community structure variations. In this paper, we present a study of how the time-varying property of a modular structure influences the information dissemination. First, we propose a continuous-time Markov model of information diffusion where two parameters, mobility rate and community attractiveness, are introduced to address the time-varying nature of the community structure. The basic reproduction number is derived, and the accuracy of this model is evaluated by comparing the simulation and theoretical results. Furthermore, numerical results illustrate that generally both the mobility rate and community attractiveness significantly promote the information diffusion process, especially in the initial outbreak stage. Moreover, the strength of this promotion effect is much stronger when the modularity is higher. Counterintuitively, it is found that when all communities have the same attractiveness, social mobility no longer accelerates the diffusion process. In addition, we show that the local spreading in the advantage group has been greatly enhanced due to the agglomeration effect caused by the social mobility and community attractiveness difference, which thus increases the global spreading.
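
    The mobility-and-attractiveness mechanism described above can be illustrated with a toy discrete-time simulation. This is not the authors' continuous-time Markov model; the node count, rates, and two-community split are illustrative assumptions:

```python
import random

def simulate_diffusion(n=200, steps=400, beta=0.05, mobility=0.02,
                       attractiveness=(0.7, 0.3), seed=1):
    """Toy two-community diffusion: informed nodes inform random
    same-community contacts with probability beta; each step a node may
    relocate with probability `mobility`, choosing its destination in
    proportion to the communities' attractiveness."""
    rng = random.Random(seed)
    community = [rng.randrange(2) for _ in range(n)]
    informed = [False] * n
    informed[0] = True                            # a single initial spreader
    p0 = attractiveness[0] / sum(attractiveness)  # pull of community 0
    for _ in range(steps):
        for i in range(n):
            if rng.random() < mobility:           # social mobility step
                community[i] = 0 if rng.random() < p0 else 1
            if informed[i]:                       # contact step
                j = rng.randrange(n)
                if j != i and community[j] == community[i] and rng.random() < beta:
                    informed[j] = True
    return sum(informed) / n                      # final informed fraction
```

    Varying `mobility` and the `attractiveness` ratio in such a sketch is one way to probe, qualitatively, the promotion effects the abstract reports.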

  7. Models, Metaphors and Symbols for Information and Knowledge Systems

    Directory of Open Access Journals (Sweden)

    David Williams

    2014-01-01

    Full Text Available A literature search indicates that Data, Information and Knowledge continue to be placed into a hierarchical construct where it is considered that information is more valuable than data and that information can be processed into becoming precious knowledge. Wisdom continues to be added to the model to further confuse the issue. This model constrains our ability to think more logically about how and why we develop knowledge management systems to support and enhance knowledge-intensive processes, tasks or projects. This paper seeks to summarise the development of the Data-Information-Knowledge-Wisdom hierarchy, explore the extensive criticism of it and present a more logical (and accurate) construct for the elements of intellectual capital when developing and managing Knowledge Management Systems.

  8. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
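
    The Markov decision process underlying such a measurement model rests on standard value iteration over states, actions, transition probabilities, and rewards. A minimal generic sketch follows; the two-state task, its actions, and its rewards are invented for illustration and are not taken from the paper:

```python
def value_iteration(states, actions, transition, reward, gamma=0.9, tol=1e-6):
    """Gauss-Seidel value iteration. transition[s][a] is a list of
    (probability, next_state) pairs; reward[s][a] is a scalar."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(reward[s][a] + gamma * sum(p * V[t] for p, t in transition[s][a])
                       for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

# Hypothetical two-state task: from state A, "go" earns 1 and ends in the
# absorbing state B; "stay" earns nothing.
transition = {"A": {"stay": [(1.0, "A")], "go": [(1.0, "B")]},
              "B": {"stay": [(1.0, "B")], "go": [(1.0, "B")]}}
reward = {"A": {"stay": 0.0, "go": 1.0}, "B": {"stay": 0.0, "go": 0.0}}
V = value_iteration(["A", "B"], ["stay", "go"], transition, reward)
```
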

  9. 5D Building Information Modelling – A Practicability Review

    Directory of Open Access Journals (Sweden)

    Lee Xia Sheng

    2016-01-01

    Full Text Available Quality, time and cost are the three most important elements in any construction project. Building information that comes timely and accurately in multiple dimensions will facilitate a refined decision-making process which can improve construction quality, time and cost. 5-dimensional Building Information Modelling, or 5D BIM, is an emerging trend in the construction industry that integrates all the major information starting from the initial design to the final construction stage. After that, the integrated information is arranged and communicated through Virtual Design and Construction (VDC). This research gauges the practicability of 5D BIM with an action-research-type pilot study by means of hands-on modelling of a conceptual bungalow design based on one of the most popular BIM tools. A bungalow is selected as a study subject to simulate the major stages of the 5D BIM digital workflow. The whole process starts with developing drawings (2D) into a digital model (3D), and is followed by the incorporation of time (4D) and cost (5D). Observations are focused on the major factors that will affect the practicability of 5D BIM, including the modelling effort, interoperability, information output and limitations. This research concludes that 5D BIM has a high level of practicability which further differentiates BIM from Computer Aided Design (CAD). The integration of information not only enhanced the efficiency and accuracy of processes in all stages, but also enabled decision makers to reach a sophisticated interpretation of information which is almost impossible with the conventional 2D CAD workflow. Although it is possible to incorporate more than 5 dimensions of information, it is foreseeable that excessive information may escalate complexity unfavourably for BIM implementation. 5D BIM has achieved a significant level of practicability; further research should be conducted to streamline implementation. Once 5D BIM is matured and widely

  10. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences, at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested in both business-like event logs as recorded in a higher educational enterprise resource planning system and a real case scenario involving a set of Dutch municipalities.

  11. Anti-nuclear behavioral intentions: The role of perceived knowledge, information processing, and risk perception

    International Nuclear Information System (INIS)

    Zhu, Weiwei; Wei, Jiuchang; Zhao, Dingtao

    2016-01-01

    This study explored the key factors underlying people's anti-nuclear behavioral intentions. The protective action decision model and the heuristic–systematic model were integrated and adapted from a risk information perspective to construct a hypothetical model. A questionnaire study was conducted on a sample of residents near the Haiyang Nuclear Power Plant, which is under construction in Shandong Province, China (N=487). Results show that, as expected, perceived knowledge is vital in predicting people's information insufficiency, information seeking, systematic processing, and risk perception. Moreover, the inverted U relationship between perceived knowledge and anti-nuclear behavioral intentions is indicated in the study. Information insufficiency and information seeking also significantly predict systematic processing. Furthermore, people's behavioral intentions are motivated by risk perception but fail to be stimulated by systematic processing. Implications and recommendations for future research are discussed. - Highlights: • The study explores anti-nuclear behavior from a risk information perspective. • Risk perception and knowledge matter to anti-nuclear behavioral intentions. • Inverted U relationship between knowledge and behavioral intentions is indicated. • More understanding of nuclear power could reduce public opposition.

  12. Management of information in development projects – a proposed integrated model

    Directory of Open Access Journals (Sweden)

    C. Bester

    2008-11-01

    Full Text Available The first section of the article focuses on the need for development in Africa and the specific challenges of development operations. It describes the need for a holistic and integrated information management model as part of the project management body of knowledge aimed at managing the information flow between communities and development project teams. It is argued that information, and access to information, is crucial in development projects and can therefore be seen as a critical success factor in any development project. In the second section of the article, the three information areas of the holistic and integrated information management model are described. In the section thereafter we suggest roles and actions for information managers to facilitate information processes integral to the model. These processes seek to create a developing information community that aligns itself with the development project, and supports and sustains it.

  13. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.
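
    Two of the quantities the tutorial reviews, Shannon entropy and the Kullback-Leibler divergence, can be computed directly from their definitions. A minimal sketch in base 2 (bits):

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits; assumes q[i] > 0
    wherever p[i] > 0 (terms with p[i] = 0 contribute nothing)."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)
```

    A fair coin has entropy 1 bit, and the divergence of any distribution from itself is zero, which gives quick sanity checks on the implementation.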

  14. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    Science.gov (United States)

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub, as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation.

  15. BRICS and Quantum Information Processing

    DEFF Research Database (Denmark)

    Schmidt, Erik Meineche

    1998-01-01

    BRICS is a research centre and international PhD school in theoretical computer science, based at the University of Aarhus, Denmark. The centre has recently become engaged in quantum information processing in cooperation with the Department of Physics, also at the University of Aarhus. This extended abstract surveys activities at BRICS with special emphasis on the activities in quantum information processing.

  16. The Role of Unconscious Information Processing in the Acquisition and Learning of Instructional Messages

    Science.gov (United States)

    Kuldas, Seffetullah; Bakar, Zainudin Abu; Ismail, Hairul Nizam

    2012-01-01

    This review investigates how unconscious information processing can create satisfactory learning outcomes, and how it can be used to ameliorate the challenges of teaching students to regulate their learning processes. The search for the ideal model of human information processing as regards achievement of teaching and learning objectives is a…

  17. A Heuristic Approach for Discovering Reference Models by Mining Process Model Variants

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    Recently, a new generation of adaptive Process-Aware Information Systems (PAISs) has emerged, which enables structural process changes during runtime while preserving PAIS robustness and consistency. Such flexibility, in turn, leads to a large number of process variants derived from the same model,

  18. Learning to rank for information retrieval and natural language processing

    CERN Document Server

    Li, Hang

    2014-01-01

    Learning to rank refers to machine learning techniques for training a model in a ranking task. Learning to rank is useful for many applications in information retrieval, natural language processing, and data mining. Intensive studies have been conducted on its problems recently, and significant progress has been made. This lecture gives an introduction to the area including the fundamental problems, major approaches, theories, applications, and future work.The author begins by showing that various ranking problems in information retrieval and natural language processing can be formalized as tw

  19. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-01-01

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines.

  20. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-09-25

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines. (VC)

  1. Mathematical model as means of optimization of the automation system of the process of incidents of information security management

    Directory of Open Access Journals (Sweden)

    Yulia G. Krasnozhon

    2018-03-01

    Full Text Available Modern information technologies have an increasing importance for the development dynamics and management structure of an enterprise. The efficiency of implementing modern information technologies is directly related to the quality of information security incident management. However, the impact of information security incident management on the quality and efficiency of the enterprise management system is not sufficiently covered in either Russian or foreign literature. The main approach to these problems is the optimization of the automation system for the information security incident management process. Today special attention is paid to IT technologies for dealing with information security incidents at mission-critical facilities in the Russian Federation, such as the Federal Tax Service of Russia (FTS). It is proposed to use the mathematical apparatus of queueing theory in order to build a mathematical model for optimizing the system. The developed model allows one to estimate the quality of management taking into account the rules and restrictions imposed on the system by the effects of information security incidents. An example is given to demonstrate the system at work, and the obtained statistical data are shown. Implementing the system discussed here will improve the quality of the Russian FTS services and speed up responses to information security incidents.
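
    The queueing-theoretic apparatus mentioned above can be illustrated with the textbook M/M/1 formulas for steady-state utilisation, mean queue length, and mean response time. This is a generic sketch, not the specific model developed in the paper:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of an M/M/1 queue (Poisson incident arrivals,
    one exponential handler); requires arrival_rate < service_rate."""
    rho = arrival_rate / service_rate                # handler utilisation
    if rho >= 1.0:
        raise ValueError("unstable queue: arrivals outpace service")
    mean_in_system = rho / (1.0 - rho)               # mean incidents present (L)
    mean_time = 1.0 / (service_rate - arrival_rate)  # mean response time (W)
    return {"utilisation": rho,
            "mean_in_system": mean_in_system,
            "mean_time": mean_time}
```

    With 2 incidents per hour arriving and a handling rate of 4 per hour, utilisation is 0.5 and the mean response time is half an hour, consistent with Little's law L = λW.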

  2. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Full Text Available Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to using solely informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a mapping between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining a formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that certain functional properties are satisfied by the model under investigation, namely liveness and reachability properties. The main advantage of our approach over existing ones is that it takes the time component into account in modeling business processes. An example is used throughout the paper to illustrate the proposed method.
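
    Reachability of the kind checked after a BPMN-to-Petri-net translation can be sketched as a breadth-first enumeration of markings on a plain (untimed, bounded) Petri net. The net encoding and place names below are illustrative, and the TPN timing semantics is omitted:

```python
from collections import deque

def reachable_markings(transitions, initial):
    """Enumerate markings of a bounded Petri net by breadth-first search.
    `transitions` maps a name to a (consume, produce) pair of dicts over
    places; a marking is a tuple of token counts in sorted-place order."""
    places = sorted(initial)
    start = tuple(initial[p] for p in places)
    seen = {start}
    queue = deque([start])
    while queue:
        marking = queue.popleft()
        current = dict(zip(places, marking))
        for consume, produce in transitions.values():
            # a transition is enabled when every input place has enough tokens
            if all(current[p] >= n for p, n in consume.items()):
                nxt = dict(current)
                for p, n in consume.items():
                    nxt[p] -= n
                for p, n in produce.items():
                    nxt[p] += n
                t = tuple(nxt[p] for p in places)
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
    return seen

# Illustrative sequential net: start --t1--> mid --t2--> end
transitions = {"t1": ({"start": 1}, {"mid": 1}),
               "t2": ({"mid": 1}, {"end": 1})}
marks = reachable_markings(transitions, {"start": 1, "mid": 0, "end": 0})
```

    A liveness-style check then asks whether a desired marking (here, a token on `end`) appears in the reachable set.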

  3. Information dissemination model for social media with constant updates

    Science.gov (United States)

    Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui

    2018-07-01

    With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and updated information, and then propose the priority of related information. To more effectively evaluate the effectiveness of the proposed model, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.
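
    The competition between original false information and its update can be sketched with a toy agent-based simulation in which the updated version spreads faster and overrides the false one on contact. All parameters are illustrative assumptions, not the DMCU model itself:

```python
import random

def competitive_spread(n=500, steps=300, beta_false=0.04, beta_update=0.06,
                       update_at=50, seed=7):
    """States: S (susceptible), F (holds false info), U (holds update).
    Each step every spreader contacts one random node; U spreads faster
    and also converts F nodes, giving the updated information priority."""
    rng = random.Random(seed)
    state = ["S"] * n
    state[0] = "F"                       # the false information appears first
    for step in range(steps):
        if step == update_at:
            state[1] = "U"               # the correction is released later
        for i in range(n):
            if state[i] == "F":
                j = rng.randrange(n)
                if state[j] == "S" and rng.random() < beta_false:
                    state[j] = "F"
            elif state[i] == "U":
                j = rng.randrange(n)
                if state[j] in ("S", "F") and rng.random() < beta_update:
                    state[j] = "U"
    return state.count("F") / n, state.count("U") / n
```
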

  4. Risk perception and information processing: the development and validation of a questionnaire to assess self-reported information processing.

    Science.gov (United States)

    Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K

    2012-01-01

    The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition, after which they completed a manipulation check and the initial 15-item questionnaire, and did so again two weeks later. The questionnaire was subjected to factor, reliability, and validity analyses at both measurement times for purposes of cross-validation of the results. A two-factor solution was observed, representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.

  5. Dimensional modeling: beyond data processing constraints.

    Science.gov (United States)

    Bunardzic, A

    1995-01-01

    The focus of information processing requirements is shifting from the on-line transaction processing (OLTP) issues to the on-line analytical processing (OLAP) issues. While the former serves to ensure the feasibility of real-time on-line transaction processing (which has already exceeded a level of up to 1,000 transactions per second under normal conditions), the latter aims at enabling more sophisticated analytical manipulation of data. The OLTP requirements, or how to efficiently get data into the system, have been solved by applying Relational theory in the form of the Entity-Relationship model. There is presently no theory related to OLAP that would resolve the analytical processing requirements as efficiently as Relational theory provided for transaction processing. The "relational dogma" also provides the mathematical foundation for the Centralized Data Processing paradigm in which mission-critical information is incorporated as 'one and only one instance' of data, thus ensuring data integrity. In such surroundings, the information that supports business analysis and decision support activities is obtained by running predefined reports and queries that are provided by the IS department. In today's intensified competitive climate, businesses are finding that this traditional approach is not good enough. The only way to stay on top of things, and to survive and prosper, is to decentralize the IS services. The newly emerging Distributed Data Processing, with its increased emphasis on empowering the end user, does not seem to find enough merit in the relational database model to justify relying upon it. Relational theory proved too rigid and complex to accommodate the analytical processing needs. In order to satisfy the OLAP requirements, or how to efficiently get the data out of the system, different models, metaphors, and theories have been devised. All of them are pointing to the need for simplifying the highly non-intuitive mathematical constraints found
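
    The analytical, "get the data out" style of processing contrasted with OLTP above is commonly served by a star schema: a fact table of measures joined to dimension tables for rollup. A minimal sketch with an invented product dimension:

```python
# Star-schema sketch: a fact table of sales measures plus a product
# dimension used to roll the measures up to category level.
facts = [(1, "p1", 100.0),   # (date_key, product_key, amount)
         (1, "p2", 40.0),
         (2, "p1", 60.0)]
product_dim = {"p1": "hardware", "p2": "software"}

def rollup_by_category(facts, product_dim):
    """Aggregate fact rows along the product dimension."""
    totals = {}
    for _, product_key, amount in facts:
        category = product_dim[product_key]
        totals[category] = totals.get(category, 0.0) + amount
    return totals
```
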

  6. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior...

  7. Information Systems’ Portfolio: Contributions of Enterprise and Process Architecture

    Directory of Open Access Journals (Sweden)

    Silvia Fernandes

    2017-09-01

    Full Text Available We are witnessing a need for a quick and intelligent reaction from organizations to the level and speed of change in business processes. New information technologies and systems (IT/IS) are challenging business models and products. One of the great shake-ups comes from online and/or mobile apps and platforms. These are having a tremendous impact in launching innovative and competitive services through the combination of digital and physical features. This leads organizations to actively rethink their enterprise information systems' portfolio, its management and suitability. One relevant way for enterprises to manage their IT/IS in order to cope with those challenges is enterprise and process architecture. A decision-making culture based on processes helps to understand and define the different elements that shape an organization and how those elements inter-relate inside and outside it. IT/IS portfolio management increasingly requires modeling data and process flows to better inform selection and alignment with business goals. The new generation of enterprise architecture (NGEA) helps to design intelligent processes that answer quickly and creatively to new and challenging trends. This has to be open, agile and context-aware to allow well-designed services that match users' expectations. This study includes two real cases/problems to solve quickly in companies, and solutions are presented in line with this architectural approach.

  8. Management information system model supporting the quality assurance of schools in Thailand

    Directory of Open Access Journals (Sweden)

    Daoprakai Raso

    2017-07-01

    Full Text Available Management Information Systems are very important tools for Thai schools in supporting the quality assurance process. This research therefore aimed to develop a Management Information System (MIS) model, which consisted of two phases. Phase 1 was the design of the MIS model used in Thai school quality assurance (QA). Phase 2 was the evaluation of the model, which consisted of four parts: (1) the MIS cycle, consisting of system investigation, system analysis, system design, system implementation and system maintenance; (2) the Management Information System, consisting of data collecting, data processing, information presenting, information saving, and procedure controlling; (3) the factors that support the MIS, including the information tools and equipment factor and the information operator's factor; and (4) the system theory, consisting of input, process, and output. The results showed that the level of opinions in all aspects was at a “high” level.

  9. Leading research on brain functional information processing; No kino joho shori no sendo kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This research aims at exploring the concept of an information processing device with an architecture fundamentally different from previous ones based on the study of human brain function, sense and perception, at developing the basic fabrication technology for such a system, and at realizing human-like information processing mechanisms of memorization, learning, association, perception, intuition and value judgement. As an approach deriving biological and technological models from experimental brain studies, the model was derived from brain functional information processing based on the brain development/differentiation mechanism, the control mechanism/material of brain activities, and the knowledge obtained from brain measurement and study. In addition, for understanding brain oscillation phenomena through computational neuroscience, a cerebral cortex neural network model composed of realistic neuron models was proposed. Evaluation of the previous large-scale neural network chip system showed its ability of learning and fast processing; however, the next-generation brain computer requires further R and D of novel architectures, devices and systems. 184 refs., 41 figs., 2 tabs.

  10. Computational Methods for Physical Model Information Management: Opening the Aperture

    International Nuclear Information System (INIS)

    Moser, F.; Kirgoeze, R.; Gagne, D.; Calle, D.; Murray, J.; Crowley, J.

    2015-01-01

    The volume, velocity and diversity of data available to analysts are growing exponentially, increasing the demands on analysts to stay abreast of developments in their areas of investigation. In parallel to the growth in data, technologies have been developed to efficiently process, store, and effectively extract information suitable for the development of a knowledge base capable of supporting inferential (decision logic) reasoning over semantic spaces. These technologies and methodologies, in effect, allow for automated discovery and mapping of information to specific steps in the Physical Model (Safeguard's standard reference of the Nuclear Fuel Cycle). This paper will describe and demonstrate an integrated service under development at the IAEA that utilizes machine learning techniques, computational natural language models, Bayesian methods and semantic/ontological reasoning capabilities to process large volumes of (streaming) information and associate relevant, discovered information to the appropriate process step in the Physical Model. The paper will detail how this capability will consume open source and controlled information sources and be integrated with other capabilities within the analysis environment, and provide the basis for a semantic knowledge base suitable for hosting future mission focused applications. (author)
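As a loose illustration of the association task described above (not the IAEA implementation, which uses machine learning, computational language models, Bayesian methods and ontological reasoning), the following toy sketch maps a text snippet to a Physical Model step by keyword overlap. The step names and keyword sets are invented for illustration.

```python
# Hypothetical keyword-scoring stand-in for mapping discovered text to
# a process step; real systems would use learned models, not set overlap.
PROCESS_STEPS = {
    "mining_milling": {"ore", "uranium", "mill", "yellowcake"},
    "conversion": {"uf6", "fluorination", "conversion"},
    "enrichment": {"centrifuge", "enrichment", "cascade", "swu"},
    "fuel_fabrication": {"pellet", "cladding", "assembly", "fabrication"},
}

def associate(snippet: str) -> str:
    """Return the process step whose keyword set best overlaps the snippet."""
    tokens = set(snippet.lower().split())
    return max(PROCESS_STEPS, key=lambda step: len(tokens & PROCESS_STEPS[step]))

print(associate("New centrifuge cascade reported at the enrichment plant"))
# -> enrichment
```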

  11. Business process model abstraction : a definition, catalog, and survey

    NARCIS (Netherlands)

    Smirnov, S.; Reijers, H.A.; Weske, M.H.; Nugteren, T.

    2012-01-01

    The discipline of business process management aims at capturing, understanding, and improving work in organizations by using process models as central artifacts. Since business-oriented tasks require different information from such models to be highlighted, a range of abstraction techniques has been

  12. Non-Integrated Information and Communication Technologies in the Kidney Transplantation Process in Brazil.

    Science.gov (United States)

    Peres Penteado, Alissa; Fábio Maciel, Rafael; Erbs, João; Feijó Ortolani, Cristina Lucia; Aguiar Roza, Bartira; Torres Pisa, Ivan

    2015-01-01

    The entire kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no defined theoretical map describing this process. From this representation it's possible to perform analysis, such as the identification of bottlenecks and information and communication technologies (ICTs) that support this process. The aim of this study was to analyze and represent the kidney transplantation workflow using business process modeling notation (BPMN) and then to identify the ICTs involved in the process. This study was conducted in eight steps, including document analysis and professional evaluation. The results include the BPMN model of the kidney transplantation process in Brazil and the identification of ICTs. We discovered that there are great delays in the process due to there being many different ICTs involved, which can cause information to be poorly integrated.
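One analysis the authors mention, bottleneck identification, can be sketched minimally once the process is represented as data. The step names and delay values below are invented, not taken from the Brazilian transplantation model.

```python
# Toy process representation: (step name, average delay). A "bottleneck"
# here is simply the step with the largest delay.
steps = [
    ("register_recipient", 2),
    ("histocompatibility_test", 30),
    ("waiting_list_allocation", 120),
    ("transplant_surgery", 1),
]

def bottleneck(process):
    """Return the name of the step with the largest average delay."""
    return max(process, key=lambda s: s[1])[0]

print(bottleneck(steps))  # -> waiting_list_allocation
```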

  13. Aligning business processes and information systems new approaches to continuous quality engineering

    CERN Document Server

    Heinrich, Robert

    2014-01-01

    Business processes and information systems mutually affect each other in non-trivial ways. Frequently, processes are designed without taking the systems' impact into account, and vice versa. Missing alignment at design-time results in quality problems at run-time. Robert Heinrich gives examples from research and practice for an integrated design of process and system quality. A quality reference-model characterizes process quality and a process notation is extended to operationalize the model. Simulation is a powerful means to predict the mutual quality impact, to compare design alternatives,

  14. Proprioceptive information processing in schizophrenia

    DEFF Research Database (Denmark)

    Arnfred, Sidse M H

    Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event related potentials. Event related EEG can be analyzed as conventional time... of the left somatosensory cortex and it was suggested to be in accordance with two theories of schizophrenic information processing: the theory of deficiency of corollary discharge and the theory of weakening of the influence of past regularities. No gating deficiency was observed and the imprecision... and amplitude attenuation was not a general phenomenon across the entire brain response. Summing up, in support of Rado's hypothesis, schizophrenia spectrum patients demonstrated abnormalities in proprioceptive information processing. Future work needs to extend the findings in larger un-medicated, non...

  15. A methodology proposal for collaborative business process elaboration using a model-driven approach

    Science.gov (United States)

    Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé

    2015-05-01

    Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).

  16. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.
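The object-oriented structures described can be hinted at in code. This is a minimal sketch under assumed names (not the authors' UML model): an event is both a change agent, updating its participants, and an information object carrying attributes.

```python
# An event with attributes (amount), associations to participants
# (source, target), an operation (occur), and a crude state marker.
class Account:
    def __init__(self, balance=0):
        self.balance = balance

class TransferEvent:
    def __init__(self, source, target, amount):
        self.source, self.target = source, target   # associations to participants
        self.amount = amount                        # attribute (information object)
        self.state = "created"                      # simplistic stand-in for a state chart

    def occur(self):                                # consequence on the participants
        self.source.balance -= self.amount
        self.target.balance += self.amount
        self.state = "processed"

a, b = Account(100), Account(0)
e = TransferEvent(a, b, 40)
e.occur()
print(a.balance, b.balance, e.state)  # -> 60 40 processed
```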

  17. A business process modeling experience in a complex information system re-engineering.

    Science.gov (United States)

    Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis

    2013-01-01

    This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.

  18. Representing Information in Patient Reports Using Natural Language Processing and the Extensible Markup Language

    Science.gov (United States)

    Friedman, Carol; Hripcsak, George; Shagina, Lyuda; Liu, Hongfang

    1999-01-01

    Objective: To design a document model that provides reliable and efficient access to clinical information in patient reports for a broad range of clinical applications, and to implement an automated method using natural language processing that maps textual reports to a form consistent with the model. Methods: A document model that encodes structured clinical information in patient reports while retaining the original contents was designed using the extensible markup language (XML), and a document type definition (DTD) was created. An existing natural language processor (NLP) was modified to generate output consistent with the model. Two hundred reports were processed using the modified NLP system, and the XML output that was generated was validated using an XML validating parser. Results: The modified NLP system successfully processed all 200 reports. The output of one report was invalid, and 199 reports were valid XML forms consistent with the DTD. Conclusions: Natural language processing can be used to automatically create an enriched document that contains a structured component whose elements are linked to portions of the original textual report. This integrated document model provides a representation where documents containing specific information can be accurately and efficiently retrieved by querying the structured components. If manual review of the documents is desired, the salient information in the original reports can also be identified and highlighted. Using an XML model of tagging provides an additional benefit in that software tools that manipulate XML documents are readily available. PMID:9925230
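A minimal sketch of such a document model can be written with Python's standard XML tooling. The element and attribute names below are invented stand-ins, not the authors' DTD; the point is the structured component whose elements point back into the retained original report text.

```python
import xml.etree.ElementTree as ET

report_text = "Chest x-ray shows mild cardiomegaly. No pleural effusion."

# Document retaining the original contents plus a structured component
# whose elements are linked (by character offsets) to the source text.
doc = ET.Element("report")
ET.SubElement(doc, "text").text = report_text
structured = ET.SubElement(doc, "structured")
ET.SubElement(structured, "finding", code="cardiomegaly",
              certainty="moderate", start="18", end="35")

xml_bytes = ET.tostring(doc)       # serialized document
parsed = ET.fromstring(xml_bytes)  # round-trip confirms well-formedness

# Query the structured component, then highlight the linked source span.
f = parsed.find("./structured/finding")
start, end = int(f.get("start")), int(f.get("end"))
print(parsed.findtext("text")[start:end])  # -> mild cardiomegaly
```

Querying the structured elements while slicing the retained text mirrors the retrieval-plus-highlighting use the abstract describes.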

  19. Security Process Capability Model Based on ISO/IEC 15504 Conformant Enterprise SPICE

    Directory of Open Access Journals (Sweden)

    Mitasiunas Antanas

    2014-07-01

    Full Text Available In the context of modern information systems, security has become one of the most critical quality attributes. The purpose of this paper is to address the problem of the quality of information security. An approach to solving this problem is based on the main assumption that security is a process-oriented activity. According to this approach, product quality can be achieved by means of process quality - process capability. The SPICE conformant information security process capability model introduced in the paper is based on the process capability modeling elaborated by the world-wide software engineering community during the last 25 years, namely on ISO/IEC 15504, which defines the capability dimension and the requirements for process definition, and on Enterprise SPICE, a domain independent integrated model for enterprise-wide assessment and improvement.

  20. High fidelity information processing in folic acid chemotaxis of Dictyostelium amoebae.

    Science.gov (United States)

    Segota, Igor; Mong, Surin; Neidich, Eitan; Rachakonda, Archana; Lussenhop, Catherine J; Franck, Carl

    2013-11-06

    Living cells depend upon the detection of chemical signals for their existence. Eukaryotic cells can sense a concentration difference as low as a few per cent across their bodies. This process was previously suggested to be limited by the receptor-ligand binding fluctuations. Here, we first determine the chemotaxis response of Dictyostelium cells to static folic acid gradients and show that they can significantly exceed this sensitivity, responding to gradients as shallow as 0.2% across the cell body. Second, using a previously developed information theory framework, we compare the total information gained about the gradient (based on the cell response) to its upper limit: the information gained at the receptor-ligand binding step. We find that the model originally applied to cAMP sensing fails as demonstrated by the violation of the data processing inequality, i.e. the total information exceeds the information at the receptor-ligand binding step. We propose an extended model with multiple known receptor types and with cells allowed to perform several independent measurements of receptor occupancy. This does not violate the data processing inequality and implies the receptor-ligand binding noise dominates both for low- and high-chemoattractant concentrations. We also speculate that the interplay between exploration and exploitation is used as a strategy for accurate sensing of otherwise unmeasurable levels of a chemoattractant.
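The data processing inequality invoked above can be checked numerically for a toy Markov chain X → Y → Z built from two binary symmetric channels; the flip probabilities are arbitrary illustrative values.

```python
from math import log2

def bsc_joint(eps, p0=0.5):
    """Joint P(x, y) for a binary symmetric channel with flip probability eps."""
    return {(x, y): (p0 if x == 0 else 1 - p0) * ((1 - eps) if x == y else eps)
            for x in (0, 1) for y in (0, 1)}

def mutual_information(joint):
    px = {x: joint[(x, 0)] + joint[(x, 1)] for x in (0, 1)}
    py = {y: joint[(0, y)] + joint[(1, y)] for y in (0, 1)}
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

i_xy = mutual_information(bsc_joint(0.1))        # X -> Y link
eps_xz = 0.1 * (1 - 0.2) + (1 - 0.1) * 0.2      # composed flip probability of both links
i_xz = mutual_information(bsc_joint(eps_xz))     # X -> Z end to end

print(round(i_xy, 3), round(i_xz, 3))  # -> 0.531 0.173
assert i_xz <= i_xy  # data processing inequality holds for any true Markov chain
```

Observing total information about the gradient that *exceeds* the receptor-binding information, as the study reports for the original model, is exactly a violation of this inequality, which is why the authors reject that model.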

  1. Teaching modelling of distributed information and control systems to students using the method of simulation modelling

    Directory of Open Access Journals (Sweden)

    A. V. Gabalin

    2017-01-01

    Full Text Available Mathematical modelling is one of the most effective means to study complex systems and processes. One of the most convenient means of mathematical modelling used in the analysis of the functioning of systems of this class is simulation modelling, which describes the structure and behavior of the system in the form of a program for the PC and allows conducting computer experiments with the aim of obtaining the necessary data on the functioning of the elements and the system as a whole during certain time intervals. Currently, the simulation tools market presents a large number of different simulation systems, so the selection of suitable tools is very important; specialized programs include GPSS World, MATLAB/Simulink, and AnyLogic. Distributed information and control systems (ICS) are a spatially dispersed, multifunctional, coherent set of stationary and moving elements with developed technical means for the reception, transmission and processing of information. The task is to determine a rational ICS structure whose planned indicators of development and functioning quality meet specified requirements under given structural constraints, characteristics of information flows, and parameters of technical tools. For experimental research of the functioning processes of the described system a simulation model was developed. This model allows obtaining and evaluating such functional characteristics as the degree of technical means utilization, the waiting time of information in queues for service, the level of efficiency of transmission and processing of information, the time of forming a single media, etc. The model also allows evaluating the performance of the system depending on the flight schedule, flight paths, characteristics of technical means, the system structure, failure of individual elements and other parameters.
    The developed simulation model in GPSS allows students to master the subject area deeply enough – the
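The kind of characteristics a GPSS queueing model reports (utilization, waiting time) can be mimicked by a very small hand-rolled simulation. This is a sketch with invented arrival and service parameters, not the authors' model.

```python
import random

# Single-server queue: messages arrive at random times and each takes a
# fixed service time; compute server utilization and mean waiting time.
random.seed(1)
arrivals = sorted(random.uniform(0.0, 100.0) for _ in range(50))  # arrival times
service = 1.5                                                     # fixed processing time

clock, busy, waits = 0.0, 0.0, []
for t in arrivals:
    start = max(clock, t)      # message waits while the server is busy
    waits.append(start - t)
    clock = start + service    # time the server becomes free again
    busy += service

utilization = busy / clock     # fraction of elapsed time the server worked
mean_wait = sum(waits) / len(waits)
print(f"utilization={utilization:.2f} mean_wait={mean_wait:.2f}")
```

In GPSS the same experiment is expressed declaratively (GENERATE/SEIZE/ADVANCE/RELEASE blocks) and the statistics are collected automatically.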

  2. A KPI framework for process-based benchmarking of hospital information systems.

    Science.gov (United States)

    Jahn, Franziska; Winter, Alfred

    2011-01-01

    Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest to benchmark HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.

  3. Poor sleep quality predicts deficient emotion information processing over time in early adolescence.

    Science.gov (United States)

    Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran

    2011-11-01

    There is deepening understanding of the effects of sleep on emotional information processing. Emotion information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotion information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments: sleep was assessed in home settings, and facial information processing was assessed at school. 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. N/A. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave. Waking > 5 min was considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.076, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.

  4. An evolving user-oriented model of Internet health information seeking.

    Science.gov (United States)

    Gaie, Martha J

    2006-01-01

    This paper presents an evolving user-oriented model of Internet health information seeking (IS) based on qualitative data collected from 22 lung cancer (LC) patients and caregivers. This evolving model represents information search behavior as more highly individualized, complex, and dynamic than previous models, including pre-search psychological activity, use of multiple heuristics throughout the process, and cost-benefit evaluation of search results. This study's findings suggest that IS occurs in four distinct phases: search initiation/continuation, selective exposure, message processing, and message evaluation. The identification of these phases and the heuristics used within them suggests a higher order of complexity in the decision-making processes that underlie IS, which could lead to the development of a conceptual framework that more closely reflects the complex nature of contextualized IS. It also illustrates the advantages of using qualitative methods to extract more subtle details of the IS process and fill in the gaps in existing models.

  5. Possibilities of water run-off models by using geological information systems

    International Nuclear Information System (INIS)

    Oeverland, H.; Kleeberg, H.B.

    1992-01-01

    The movement of water in a given region is determined by a number of regional factors, e.g. land use and topography. However, the available precipitation-runoff models take little account of this regional information. Geological information systems, on the other hand, are instruments for efficient management, presentation and evaluation of local information, so the best approach would be a combination of the two types of models. The requirements to be met by such a system are listed; they result not only from the processes to be modelled (continuous runoff, high-water runoff, mass transfer) but also from the available data and their acquisition and processing. Ten of the best-known precipitation-runoff models are presented and evaluated on the basis of the requirements listed. The basic concept of an integrated model is outlined, and additional modules required for modelling are defined. (orig./BBR) [de

  6. A work process and information flow description of control room operations

    International Nuclear Information System (INIS)

    Davey, E.; Matthews, G.

    2007-01-01

    The control room workplace is the location from which all plant operations are supervised and controlled on a shift-to-shift basis. The activities comprising plant operations are structured into a number of work processes, and information is the common currency that is used to convey work requirements, communicate business and operating decisions, specify work practice, and describe the ongoing plant and work status. This paper describes the motivation for and early experience with developing a work process and information flow model of CANDU control room operations, and discusses some of the insights developed from model examination that suggest ways in which changes in control centre work specification, organization of resources, or asset layout could be undertaken to achieve operational improvements. (author)

  7. Effects of clutter on information processing deficits in individuals with hoarding disorder.

    Science.gov (United States)

    Raines, Amanda M; Timpano, Kiara R; Schmidt, Norman B

    2014-09-01

    Current cognitive behavioral models of hoarding view hoarding as a multifaceted problem stemming from various information processing deficits. However, there is also reason to suspect that the consequences of hoarding may in turn impact or modulate deficits in information processing. The current study sought to expand upon the existing literature by manipulating clutter to examine whether the presence of a cluttered environment affects information processing. Participants included 34 individuals with hoarding disorder. Participants were randomized into a clutter or non-clutter condition and asked to complete various neuropsychological tasks of memory and attention. Results revealed that hoarding severity was associated with difficulties in sustained attention. However, individuals in the clutter condition relative to the non-clutter condition did not experience greater deficits in information processing. Limitations include the cross-sectional design and small sample size. The current findings add considerably to a growing body of literature on the relationships between information processing deficits and hoarding behaviors. Research of this type is integral to understanding the etiology and maintenance of hoarding. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Mindfulness training alters emotional memory recall compared to active controls: support for an emotional information processing model of mindfulness

    Directory of Open Access Journals (Sweden)

    Doug eRoberts-Wolfe

    2012-02-01

    Full Text Available Objectives: While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e. memory biases) in relation to both clinical symptomatology and well-being, in comparison to active control conditions. Methods: Fifty-eight university students (28 female, age = 20.1 ± 2.7 years) participated in either a 12-week course containing a "meditation laboratory" or an active control course with similar content or experiential practice laboratory format (music). Participants completed an emotional word recall task and self-report questionnaires of well-being and clinical symptoms before and after the 12-week course. Results: Meditators showed greater increases in positive word recall compared to controls, F(1, 56) = 6.6, p = .02. The meditation group increased significantly more on measures of well-being [F(1, 56) = 6.6, p = .01], with a marginal decrease in depression and anxiety [F(1, 56) = 3.0, p = .09] compared to controls. Increased positive word recall was associated with increased psychological well-being [r = 0.31, p = .02] and decreased clinical symptoms [r = -0.29, p = .03]. Conclusion: Mindfulness training was associated with greater improvements in processing efficiency for positively valenced stimuli than active control conditions. This change in emotional information processing was associated with improvements in psychological well-being and less depression and anxiety. These data suggest that mindfulness training may improve well-being via changes in emotional information processing.

  9. Investigation of signal models and methods for evaluating structures of processing telecommunication information exchange systems under acoustic noise conditions

    Science.gov (United States)

    Kropotov, Y. A.; Belov, A. A.; Proskuryakov, A. Y.; Kolpakov, A. A.

    2018-05-01

    The paper considers models and methods for estimating signals during the transmission of information messages in telecommunication systems of audio exchange. One-dimensional probability distribution functions that can be used to isolate useful signals and acoustic noise interference are presented. An approach to the estimation of the correlation and spectral functions of the parameters of acoustic signals is proposed, based on the parametric representation of the acoustic signals and of the noise components. The paper suggests an approach to improving the efficiency of interference cancellation and extracting the necessary information when processing signals from telecommunications systems. In this case, the suppression of acoustic noise is based on the methods of adaptive filtering and adaptive compensation. The work also describes the models of echo signals and the structure of subscriber devices in operational command telecommunications systems.
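The adaptive noise cancellation mentioned above can be sketched with a one-tap LMS filter. This is a toy illustration with invented signals and step size, not the authors' algorithm: the primary channel carries speech plus scaled noise, and a reference noise channel lets the filter learn the coupling and subtract the interference.

```python
import math
import random

random.seed(0)
n = 2000
noise = [random.gauss(0, 1) for _ in range(n)]          # reference noise channel
speech = [math.sin(0.05 * k) for k in range(n)]         # stand-in useful signal
primary = [s + 0.8 * v for s, v in zip(speech, noise)]  # microphone: signal + interference

w, mu = 0.0, 0.01     # single filter weight and LMS adaptation step
cleaned = []
for d, x in zip(primary, noise):
    e = d - w * x     # error signal doubles as the cleaned output
    w += mu * e * x   # LMS weight update
    cleaned.append(e)

# After convergence, w should approach the true coupling 0.8 and the
# residual should track the speech component.
resid = sum((e - s) ** 2 for e, s in zip(cleaned[-500:], speech[-500:])) / 500
print(round(w, 2), resid < 0.05)
```

Real subscriber devices would use many-tap filters and echo-path models, but the update rule is the same in spirit.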

  10. A generalized model via random walks for information filtering

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Zhuo-Ming, E-mail: zhuomingren@gmail.com [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Kong, Yixiu [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Shang, Ming-Sheng, E-mail: msshang@cigit.ac.cn [Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Zhang, Yi-Cheng [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland)

    2016-08-06

    There could exist a simple general mechanism lurking beneath collaborative filtering and the interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model can deduce the collaborative filtering and interdisciplinary physics approaches, and even an enormous expansion of them. Furthermore, we analyze the generalized model with single and hybrid degree information on the process of random walk in bipartite networks, and propose a possible strategy of using the hybrid degree information for objects of different popularity toward promising precision of the recommendation. - Highlights: • We propose a generalized recommendation model employing the random walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with the hybrid degree information improves precision of recommendation.
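The random-walk dynamics this model generalizes can be illustrated by plain mass diffusion (ProbS) on a bipartite user-object network. The tiny data set below is invented; the two diffusion steps (objects to users, users back to objects) are the core of the mechanism.

```python
links = {              # user -> set of collected objects (toy data)
    "u1": {"a", "b"},
    "u2": {"a", "c"},
    "u3": {"b", "c", "d"},
}

def obj_degree(o):
    """Number of users who collected object o."""
    return sum(o in objs for objs in links.values())

def recommend(user):
    scores = {}
    for o in links[user]:                      # step 1: spread resource to users of o
        for u, objs in links.items():
            if o in objs:
                share = 1.0 / obj_degree(o) / len(objs)
                for o2 in objs:                # step 2: redistribute to u's objects
                    scores[o2] = scores.get(o2, 0.0) + share
    candidates = [o for o in scores if o not in links[user]]
    return sorted(candidates, key=lambda o: -scores[o])

print(recommend("u1"))  # -> ['c', 'd']
```

Weighting the two steps by powers of the object degrees is what yields the hybrid family the abstract analyzes.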

  11. A generalized model via random walks for information filtering

    International Nuclear Information System (INIS)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-01-01

    There could exist a simple general mechanism lurking beneath collaborative filtering and the interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model can deduce the collaborative filtering and interdisciplinary physics approaches, and even an enormous expansion of them. Furthermore, we analyze the generalized model with single and hybrid degree information on the process of random walk in bipartite networks, and propose a possible strategy of using the hybrid degree information for objects of different popularity toward promising precision of the recommendation. - Highlights: • We propose a generalized recommendation model employing the random walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with the hybrid degree information improves precision of recommendation.

  12. 40 CFR 68.65 - Process safety information.

    Science.gov (United States)

    2010-07-01

    ... (CONTINUED) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.65 Process safety... 40 Protection of Environment 15 2010-07-01 false Process safety information. 68.65... compilation of written process safety information before conducting any process hazard analysis required by...

  13. Cognitive processes, models and metaphors in decision research

    Directory of Open Access Journals (Sweden)

    Ben Newell

    2008-03-01

    Full Text Available Decision research in psychology has traditionally been influenced by the homo oeconomicus metaphor with its emphasis on normative models and deviations from the predictions of those models. In contrast, the principal metaphor of cognitive psychology conceptualizes humans as 'information processors', employing processes of perception, memory, categorization, problem solving and so on. Many of the processes described in cognitive theories are similar to those involved in decision making, and thus increasing cross-fertilization between the two areas is an important endeavour. A wide range of models and metaphors has been proposed to explain and describe 'information processing' and many models have been applied to decision making in ingenious ways. This special issue encourages cross-fertilization between cognitive psychology and decision research by providing an overview of current perspectives in one area that continues to highlight the benefits of the synergistic approach: cognitive modeling of multi-attribute decision making. In this introduction we discuss aspects of the cognitive system that need to be considered when modeling multi-attribute decision making (e.g., automatic versus controlled processing, learning and memory constraints, metacognition) and illustrate how such aspects are incorporated into the approaches proposed by contributors to the special issue. We end by discussing the challenges posed by the contrasting and sometimes incompatible assumptions of the models and metaphors.

  14. Aligning Business Process Quality and Information System Quality

    OpenAIRE

    Heinrich, Robert

    2013-01-01

    Business processes and information systems mutually affect each other in non-trivial ways. Frequently, the business process design and the information system design are not well aligned. This means that business processes are designed without taking the information system impact into account, and vice versa. Missing alignment at design time often results in quality problems at runtime, such as large response times of information systems, large process execution times, overloaded information s...

  15. The Influence of Social and Environmental Labels on Purchasing: An Information and Systematic-heuristic Processing Approach

    Directory of Open Access Journals (Sweden)

    Raquel Redondo Palomo

    2015-07-01

    Full Text Available This paper aims at exploring how social and environmental (SE) labels influence purchasing. By drawing on the information processing and the systematic-heuristic models, this study tests the process followed by consumers when purchasing SE-labeled products. Information was gathered through a structured questionnaire in personal interviews with 400 consumers responsible for household shopping of Fast-moving Consumer Goods (FMCG), who were randomly approached at shopping malls in four areas of Madrid, Spain. They were asked about recognition, knowledge, credibility, perceived utility and purchases on 12 different labels; the influence of these variables on purchase is modeled and tested by path analysis. This study suggests that a systematic-heuristic information processing occurs when consumers buy SE-labeled FMCG products, as the purchase of this type of goods depends on the recognition of a label, knowledge of the issue/issuer, as well as the credibility and the perceived utility of SE labels. Motivation for being informed influences the process, being an antecedent of awareness, comprehension and perceived utility. This model shows a dual processing mode, systematic and heuristic, where the lack of cognitive capacity could explain why these two processing modes co-occur. This paper adds value to the existing literature on SE labels and consumption by applying the information processing model, which has not been used before in the field of responsible consumption, and opens a promising avenue for research by offering theories complementary to the existing attitude-based ones.

  16. High resolution reservoir geological modelling using outcrop information

    Energy Technology Data Exchange (ETDEWEB)

Zhang Changmin; Lin Kexiang; Liu Huaibo [Jianghan Petroleum Institute, Hubei (China)] [and others]

    1997-08-01

This is China's first case study of high resolution reservoir geological modelling using outcrop information. The key to the modelling process is to build a prototype model and to use that model as a geological knowledge bank. Outcrop information used in geological modelling includes seven aspects: (1) Determining the reservoir framework pattern by sedimentary depositional system and facies analysis; (2) Horizontal correlation based on the lower and higher stand duration of the paleo-lake level; (3) Determining the model's direction based on the paleocurrent statistics; (4) Estimating the sandbody communication by photomosaic and profiles; (6) Estimating reservoir properties distribution within the sandbody by lithofacies analysis; and (7) Building the reservoir model at sandbody scale by architectural element analysis and 3-D sampling. A high resolution reservoir geological model of the Youshashan oil field has been built using this method.

  17. Science-based information processing in the process control of power stations

    International Nuclear Information System (INIS)

    Weisang, C.

    1992-01-01

Through the application of specialized systems, future-oriented information processing integrates knowledge of the process, the control systems, process control strategies, user behaviour and ergonomics. Improvements in process control can be attained, inter alia, by conditioning the information presented (e.g. by suppressing the raw flow of signals and replacing it with messages based on substance) and by an ergonomic representation of the process display. (orig.) [de]

  18. Industrial process system assessment: bridging process engineering and life cycle assessment through multiscale modeling.

    Science.gov (United States)

    The Industrial Process System Assessment (IPSA) methodology is a multiple step allocation approach for connecting information from the production line level up to the facility level and vice versa using a multiscale model of process systems. The allocation procedure assigns inpu...

  19. A Multi-Level Model of Information Seeking in the Clinical Domain

    Science.gov (United States)

    Hung, Peter W.; Johnson, Stephen B.; Kaufman, David R.; Mendonça, Eneida A.

    2008-01-01

    Objective: Clinicians often have difficulty translating information needs into effective search strategies to find appropriate answers. Information retrieval systems employing an intelligent search agent that generates adaptive search strategies based on human search expertise could be helpful in meeting clinician information needs. A prerequisite for creating such systems is an information seeking model that facilitates the representation of human search expertise. The purpose of developing such a model is to provide guidance to information seeking system development and to shape an empirical research program. Design: The information seeking process was modeled as a complex problem-solving activity. After considering how similarly complex activities had been modeled in other domains, we determined that modeling context-initiated information seeking across multiple problem spaces allows the abstraction of search knowledge into functionally consistent layers. The knowledge layers were identified in the information science literature and validated through our observations of searches performed by health science librarians. Results: A hierarchical multi-level model of context-initiated information seeking is proposed. Each level represents (1) a problem space that is traversed during the online search process, and (2) a distinct layer of knowledge that is required to execute a successful search. Grand strategy determines what information resources will be searched, for what purpose, and in what order. The strategy level represents an overall approach for searching a single resource. Tactics are individual moves made to further a strategy. Operations are mappings of abstract intentions to information resource-specific concrete input. Assessment is the basis of interaction within the strategic hierarchy, influencing the direction of the search. 
Conclusion: The described multi-level model provides a framework for future research and the foundation for development of an

  20. Making a difference: incorporating theories of autonomy into models of informed consent.

    Science.gov (United States)

    Delany, C

    2008-09-01

Obtaining patients' informed consent is an ethical and legal obligation in healthcare practice. Whilst the law provides prescriptive rules and guidelines, ethical theories of autonomy provide moral foundations. Models of practice of consent have been developed in the bioethical literature to assist in understanding and integrating the ethical theory of autonomy and legal obligations into the clinical process of obtaining a patient's informed consent to treatment. To review four models of consent and analyse the way each model incorporates the ethical meaning of autonomy and how, as a consequence, they might change the actual communicative process of obtaining informed consent within clinical contexts. An iceberg framework of consent is used to conceptualise how ethical theories of autonomy are positioned beneath, and underpin, the above-surface, visible clinical communication, including associated legal guidelines and ethical rules. Each model of consent is critically reviewed from the perspective of how it might shape the process of informed consent. All four models would alter the process of obtaining consent. Two models provide structure and guidelines for the content and timing of obtaining patients' consent. The other two models rely on an attitudinal shift in clinicians. They provide ideas for consent by focusing on underlying values, attitudes and meaning associated with the ethical meaning of autonomy. The paper concludes that models of practice that explicitly incorporate the underlying ethical meaning of autonomy as their basis provide less prescriptive, but theoretically richer, guidance for healthcare communicative practices.

  1. Simulation Models of Human Decision-Making Processes

    Directory of Open Access Journals (Sweden)

    Nina RIZUN

    2014-10-01

Full Text Available The main purpose of the paper is to present a new concept for modeling the human decision-making process via an analogy with Automatic Control Theory. From the author's point of view this concept allows the theory of decision-making to be developed and improved in terms of the study and classification of the specificity of human intellectual processes under different conditions. It was proved that the main distinguishing feature between the Heuristic/Intuitive and Rational Decision-Making Models is the presence of the so-called phenomenon of "enrichment" of the input information with human propensities, hobbies, tendencies, expectations, axioms and judgments, presumptions or biases and their justification. In order to obtain additional knowledge about the basic intellectual processes, as well as the possibility of modeling the decision results under various parameters characterizing the decision-maker, a complex of simulation models was developed. These models are based on the assumptions that: basic intellectual processes of the Rational Decision-Making Model can be adequately simulated and identified by the transient processes of a proportional-integral-derivative (PID) controller; and basic intellectual processes of the Bounded Rationality and Intuitive Models can be adequately simulated and identified by the transient processes of nonlinear elements. A taxonomy of the most typical automatic control theory elements and their correspondence to certain decision-making models, from the point of view of decision-making process specificity and decision-maker behavior over a certain period of professional activity, was obtained.
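The PID analogy in this abstract can be sketched numerically: the transient of a proportional-integral-derivative controller driving a simple first-order "plant" toward a setpoint stands in for the convergence of a rational decision process toward its goal. The gains and the plant below are illustrative choices, not parameters taken from the paper.

```python
# Sketch of the abstract's analogy: a rational decision process modeled as the
# transient response of a PID controller. All gains and the first-order plant
# are invented for illustration; they are not from the paper.

def pid_transient(kp=2.0, ki=1.0, kd=0.1, setpoint=1.0, steps=300, dt=0.05):
    """Drive a first-order plant (dy/dt = u - y) toward the setpoint with PID."""
    y, integral, prev_error = 0.0, 0.0, setpoint
    trajectory = []
    for _ in range(steps):
        error = setpoint - y                     # gap between goal and current state
        integral += error * dt                   # accumulated experience
        derivative = (error - prev_error) / dt   # anticipation of change
        u = kp * error + ki * integral + kd * derivative
        y += dt * (u - y)                        # Euler step of the plant
        prev_error = error
        trajectory.append(y)
    return trajectory

traj = pid_transient()
print(round(traj[-1], 3))  # settles near the setpoint: the "decision" is reached
```

How quickly the trajectory settles, overshoots, or oscillates under different gains is the kind of transient behaviour the paper maps onto different decision-making styles.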

  2. Personality and self-regulation: trait and information-processing perspectives.

    Science.gov (United States)

    Hoyle, Rick H

    2006-12-01

    This article introduces the special issue of Journal of Personality on personality and self-regulation. The goal of the issue is to illustrate and inspire research that integrates personality and process-oriented accounts of self-regulation. The article begins by discussing the trait perspective on self-regulation--distinguishing between temperament and personality accounts--and the information-processing perspective. Three approaches to integrating these perspectives are then presented. These range from methodological approaches, in which constructs representing the two perspectives are examined in integrated statistical models, to conceptual approaches, in which the two perspectives are unified in a holistic theoretical model of self-regulation. The article concludes with an overview of the special issue contributions, which are organized in four sections: broad, integrative models of personality and self-regulation; models that examine the developmental origins of self-regulation and self-regulatory styles; focused programs of research that concern specific aspects or applications of self-regulation; and strategies for increasing the efficiency and effectiveness of self-regulation.

  3. Towards an Information Model of Consistency Maintenance in Distributed Interactive Applications

    Directory of Open Access Journals (Sweden)

    Xin Zhang

    2008-01-01

    Full Text Available A novel framework to model and explore predictive contract mechanisms in distributed interactive applications (DIAs using information theory is proposed. In our model, the entity state update scheme is modelled as an information generation, encoding, and reconstruction process. Such a perspective facilitates a quantitative measurement of state fidelity loss as a result of the distribution protocol. Results from an experimental study on a first-person shooter game are used to illustrate the utility of this measurement process. We contend that our proposed model is a starting point to reframe and analyse consistency maintenance in DIAs as a problem in distributed interactive media compression.

  4. Process-aware information systems : design, enactment and analysis

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Wah, B.W.

    2009-01-01

    Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process

  5. Modelling aspects of distributed processing in telecommunication networks

    NARCIS (Netherlands)

    Tomasgard, A; Audestad, JA; Dye, S; Stougie, L; van der Vlerk, MH; Wallace, SW

    1998-01-01

The purpose of this paper is to formally describe new optimization models for telecommunication networks with distributed processing. Modern distributed networks put more focus on the processing of information and less on the actual transportation of data than we are traditionally used to in

  6. Information paths within the new product development process

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    2007-01-01

collection platform to obtain measurements from within the NPD process. 42 large, international companies participated in the data collecting simulation. Results revealed five different information paths that were not connecting all stages of the NPD process. Moreover, results show that the front-end is not driving the information acquisition through the stages of the NPD process, and that environmental turbulence disconnects stages from the information paths in the NPD process. This implies that information is at the same time a key to success and a key to entrapment in the NPD process....

  7. Theory of Neural Information Processing Systems

    International Nuclear Information System (INIS)

    Galla, Tobias

    2006-01-01

It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks.
Many dynamical aspects of neural networks are usually hard to find in the

  8. IT Business Value Model for Information Intensive Organizations

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Gastaud Maçada

    2012-01-01

Full Text Available Many studies have highlighted the capacity Information Technology (IT) has for generating value for organizations. Investments in IT made by organizations have increased each year. Therefore, the purpose of the present study is to analyze the IT Business Value for Information Intensive Organizations (IIO) - e.g. banks, insurance companies and securities brokers. The research method consisted of a survey that used and combined the models from Weill and Broadbent (1998) and Gregor, Martin, Fernandez, Stern and Vitale (2006). Data was gathered using an adapted instrument containing 5 dimensions (Strategic, Informational, Transactional, Transformational and Infra-structure) with 27 items. The instrument was refined by employing statistical techniques such as Exploratory and Confirmatory Factorial Analysis through Structural Equations (first and second order Model Measurement). The final model is composed of four factors related to IT Business Value: Strategic, Informational, Transactional and Transformational, arranged in 15 items. The dimension Infra-structure was excluded during the model refinement process because it was discovered during interviews that managers were unable to perceive it as a distinct dimension of IT Business Value.

  9. Spiral model pilot project information model

    Science.gov (United States)

    1991-01-01

    The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base of that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but to also obtain a measure of customer satisfaction.

  10. A generalized model via random walks for information filtering

    Science.gov (United States)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-08-01

There could exist a simple general mechanism lurking beneath collaborative filtering and the interdisciplinary physics approaches that have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model can recover collaborative filtering, the interdisciplinary physics approaches, and even substantial extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information on the process of random walk in bipartite networks, and propose a possible strategy of using hybrid degree information for objects of different popularity to achieve promising recommendation precision.
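As a concrete anchor, the classical random-walk ("probabilistic spreading") step that such generalized models build on can be sketched as follows; the tiny user-object adjacency matrix is invented for illustration and is not from the paper.

```python
# Sketch of the basic two-step random walk (probabilistic spreading) on a
# user-object bipartite network, the building block the generalized model
# extends with degree information. The adjacency matrix is a toy example.
import numpy as np

# a[u, o] = 1 if user u has collected object o
a = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]], dtype=float)

k_users = a.sum(axis=1)    # user degrees
k_objects = a.sum(axis=0)  # object degrees

# Resource flows objects -> users -> objects; w[i, j] is the fraction of
# object j's resource that ends up on object i. Columns sum to one.
w = (a / k_users[:, None]).T @ a / k_objects[None, :]

# Recommend to user 0: spread the resource sitting on the collected objects.
scores = w @ a[0]
uncollected = np.where(a[0] == 0)[0]
best = int(uncollected[np.argmax(scores[uncollected])])
print(best)  # object 2 ranks highest among user 0's uncollected objects
```

Reweighting the division by `k_users` and `k_objects` with tunable exponents is one simple way to fold in the "degree information" the abstract refers to.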

  11. The Processing of Somatosensory Information shifts from an early parallel into a serial processing mode: a combined fMRI/MEG study.

    Directory of Open Access Journals (Sweden)

    Carsten Michael Klingner

    2016-12-01

Full Text Available The question regarding whether somatosensory inputs are processed in parallel or in series has not been clearly answered. Several studies that have applied dynamic causal modeling (DCM) to fMRI data have arrived at seemingly divergent conclusions. However, these divergent results could be explained by the hypothesis that the processing route of somatosensory information changes with time. Specifically, we suggest that somatosensory stimuli are processed in parallel only during the early stage, whereas the processing is later dominated by serial processing. This hypothesis was revisited in the present study based on fMRI analyses of tactile stimuli and the application of DCM to magnetoencephalographic (MEG) data collected during sustained (260 ms) tactile stimulation. Bayesian model comparisons were used to infer the processing stream. We demonstrated that the favored processing stream changes over time. We found that the neural activity elicited in the first 100 ms following somatosensory stimuli is best explained by models that support a parallel processing route, whereas a serial processing route is subsequently favored. These results suggest that the secondary somatosensory area (SII) receives information regarding a new stimulus in parallel with the primary somatosensory area (SI), whereas later processing in the SII is dominated by the preprocessed input from the SI.

  12. The Processing of Somatosensory Information Shifts from an Early Parallel into a Serial Processing Mode: A Combined fMRI/MEG Study.

    Science.gov (United States)

    Klingner, Carsten M; Brodoehl, Stefan; Huonker, Ralph; Witte, Otto W

    2016-01-01

    The question regarding whether somatosensory inputs are processed in parallel or in series has not been clearly answered. Several studies that have applied dynamic causal modeling (DCM) to fMRI data have arrived at seemingly divergent conclusions. However, these divergent results could be explained by the hypothesis that the processing route of somatosensory information changes with time. Specifically, we suggest that somatosensory stimuli are processed in parallel only during the early stage, whereas the processing is later dominated by serial processing. This hypothesis was revisited in the present study based on fMRI analyses of tactile stimuli and the application of DCM to magnetoencephalographic (MEG) data collected during sustained (260 ms) tactile stimulation. Bayesian model comparisons were used to infer the processing stream. We demonstrated that the favored processing stream changes over time. We found that the neural activity elicited in the first 100 ms following somatosensory stimuli is best explained by models that support a parallel processing route, whereas a serial processing route is subsequently favored. These results suggest that the secondary somatosensory area (SII) receives information regarding a new stimulus in parallel with the primary somatosensory area (SI), whereas later processing in the SII is dominated by the preprocessed input from the SI.

  13. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  14. Structured spatio-temporal shot-noise Cox point process models, with a view to modelling forest fires

    DEFF Research Database (Denmark)

    Møller, Jesper; Diaz-Avalos, Carlos

    Spatio-temporal Cox point process models with a multiplicative structure for the driving random intensity, incorporating covariate information into temporal and spatial components, and with a residual term modelled by a shot-noise process, are considered. Such models are flexible and tractable fo...... dataset consisting of 2796 days and 5834 spatial locations of fires. The model is compared with a spatio-temporal log-Gaussian Cox point process model, and likelihood-based methods are discussed to some extent....

  15. Structured Spatio-temporal shot-noise Cox point process models, with a view to modelling forest fires

    DEFF Research Database (Denmark)

    Møller, Jesper; Diaz-Avalos, Carlos

    2010-01-01

    Spatio-temporal Cox point process models with a multiplicative structure for the driving random intensity, incorporating covariate information into temporal and spatial components, and with a residual term modelled by a shot-noise process, are considered. Such models are flexible and tractable fo...... data set consisting of 2796 days and 5834 spatial locations of fires. The model is compared with a spatio-temporal log-Gaussian Cox point process model, and likelihood-based methods are discussed to some extent....
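For intuition, the shot-noise ingredient of such models can be simulated through its cluster-process representation: Poisson cluster centres each contribute a kernel to the random intensity, which is equivalent to each centre spawning a Poisson number of nearby points. The one-dimensional window and all parameters below are invented for illustration; the paper's model is spatio-temporal and includes covariate terms omitted here.

```python
# Toy simulation (invented parameters, 1-D window) of a shot-noise Cox process
# via its equivalent cluster representation: Poisson centres, Poisson numbers
# of offspring per centre, Gaussian displacement kernel.
import numpy as np

rng = np.random.default_rng(1)
kappa = 10     # expected number of cluster centres in [0, 1]
mu = 5         # mean number of points per centre
sigma = 0.02   # kernel bandwidth (cluster spread)

centres = rng.uniform(0.0, 1.0, rng.poisson(kappa))
offspring = [c + sigma * rng.normal(size=rng.poisson(mu)) for c in centres]
points = np.concatenate(offspring) if offspring else np.array([])
points = points[(points >= 0.0) & (points <= 1.0)]  # clip to the window
print(points.size)  # clustered pattern; expected count is about kappa * mu
```

The same construction extends to space-time by drawing centres in a spatio-temporal window and letting the kernel decay in both space and time, which is the residual clustering term the paper adds on top of its covariate-driven components.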

  16. Enhancing the performance of model-based elastography by incorporating additional a priori information in the modulus image reconstruction process

    International Nuclear Information System (INIS)

    Doyley, Marvin M; Srinivasan, Seshadri; Dimidenko, Eugene; Soni, Nirmal; Ophir, Jonathan

    2006-01-01

    Model-based elastography is fraught with problems owing to the ill-posed nature of the inverse elasticity problem. To overcome this limitation, we have recently developed a novel inversion scheme that incorporates a priori information concerning the mechanical properties of the underlying tissue structures, and the variance incurred during displacement estimation in the modulus image reconstruction process. The information was procured by employing standard strain imaging methodology, and introduced in the reconstruction process through the generalized Tikhonov approach. In this paper, we report the results of experiments conducted on gelatin phantoms to evaluate the performance of modulus elastograms computed with the generalized Tikhonov (GTK) estimation criterion relative to those computed by employing the un-weighted least-squares estimation criterion, the weighted least-squares estimation criterion and the standard Tikhonov method (i.e., the generalized Tikhonov method with no modulus prior). The results indicate that modulus elastograms computed with the generalized Tikhonov approach had superior elastographic contrast discrimination and contrast recovery. In addition, image reconstruction was more resilient to structural decorrelation noise when additional constraints were imposed on the reconstruction process through the GTK method
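The generalized Tikhonov estimate described here has a closed form; a minimal numerical sketch, with a toy linear forward operator and invented prior and weights standing in for the elastographic quantities, is:

```python
# Minimal sketch of generalized Tikhonov inversion: the data-misfit term is
# weighted by W (displacement-estimation variance) and the solution is pulled
# toward a prior x0 (modulus information from strain imaging). A, b, x0, W and
# lam are toy values, not the paper's elastographic model.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))                   # linear forward operator
x_true = np.array([1.0, 1.0, 5.0, 1.0, 1.0])   # stiff "inclusion" at index 2
b = A @ x_true + 0.1 * rng.normal(size=20)     # noisy observations

x0 = np.ones(5)     # prior modulus distribution
W = np.eye(20)      # inverse-variance weighting of the data
L = np.eye(5)       # regularization operator
lam = 0.5           # regularization strength

# argmin_x ||A x - b||_W^2 + lam * ||L (x - x0)||^2 has the normal equations:
lhs = A.T @ W @ A + lam * L.T @ L
rhs = A.T @ W @ b + lam * L.T @ L @ x0
x_hat = np.linalg.solve(lhs, rhs)
print(np.round(x_hat, 2))  # follows the data but is stabilized by the prior
```

Setting lam = 0 recovers weighted least squares, and replacing x0 with zeros gives the standard Tikhonov method that the paper compares against.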

  17. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors ... by ARL modelers. ... The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool.

  18. Modelling the ICE standard with a formal language for information commerce

    NARCIS (Netherlands)

    Wombacher, Andreas; Aberer, K.

Automating information commerce requires languages to represent the typical information commerce processes. Existing languages and standards either cover only very specific types of business models or are too general to capture in a concise way the specific properties of information commerce

  19. Physics as Information Processing

    International Nuclear Information System (INIS)

    D'Ariano, Giacomo Mauro

    2011-01-01

I review some recent advances in foundational research by the Pavia QUIT group. The general idea is that there is only Quantum Theory without quantization rules, and the whole of Physics - including space-time and relativity - is emergent from quantum-information processing. And since Quantum Theory itself is axiomatized solely on informational principles, the whole of Physics must be reformulated in information-theoretical terms: this is the "It from bit" of J. A. Wheeler. The review is divided into four parts: a) the informational axiomatization of Quantum Theory; b) how space-time and relativistic covariance emerge from quantum computation; c) what is the information-theoretical meaning of inertial mass and of ℏ, and how the quantum field emerges; d) an observational consequence of the new quantum field theory: a mass-dependent refraction index of vacuum. I will conclude with the research lines that will follow in the immediate future.

  20. Mindfulness training alters emotional memory recall compared to active controls: support for an emotional information processing model of mindfulness.

    Science.gov (United States)

    Roberts-Wolfe, Douglas; Sacchet, Matthew D; Hastings, Elizabeth; Roth, Harold; Britton, Willoughby

    2012-01-01

    While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e., memory) biases in relation to both clinical symptomatology and well-being in comparison to active control conditions. Fifty-eight university students (28 female, age = 20.1 ± 2.7 years) participated in either a 12-week course containing a "meditation laboratory" or an active control course with similar content or experiential practice laboratory format (music). Participants completed an emotional word recall task and self-report questionnaires of well-being and clinical symptoms before and after the 12-week course. Meditators showed greater increases in positive word recall compared to controls [F(1, 56) = 6.6, p = 0.02]. The meditation group increased significantly more on measures of well-being [F(1, 56) = 6.6, p = 0.01], with a marginal decrease in depression and anxiety [F(1, 56) = 3.0, p = 0.09] compared to controls. Increased positive word recall was associated with increased psychological well-being (r = 0.31, p = 0.02) and decreased clinical symptoms (r = -0.29, p = 0.03). Mindfulness training was associated with greater improvements in processing efficiency for positively valenced stimuli than active control conditions. This change in emotional information processing was associated with improvements in psychological well-being and less depression and anxiety. These data suggest that mindfulness training may improve well-being via changes in emotional information processing. Future research with a fully randomized design will be

  1. Capturing value from external NPD collaboration — the significant role of market information processing

    DEFF Research Database (Denmark)

    Tandrup, Thomas

... By including customers, suppliers, competitors, universities, and other external experts in the development process, firms gain access to information, knowledge, and ideas that otherwise would have been out of reach. Extensive previous research has documented the beneficial effects of collaborating with many...... sources. This study contributes to the existing knowledge of firms' use of external sources in new product development. A model is presented that tests the effectiveness of external collaboration when multiple external sources have to be managed simultaneously. Also, firms' ability to process information...... of determining whether it is any more difficult to collaborate with external sources and process information about products that are completely new to the market. This thesis presents a model that points out how difficult it is to collaborate with many external sources unless the firm has the right formal...

  2. A process framework for information security management

    Directory of Open Access Journals (Sweden)

    Knut Haufe

    2016-01-01

Full Text Available Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach for establishing, implementing, operating, monitoring, reviewing, maintaining and improving an organization's information security. Key elements of the operation of an ISMS are ISMS processes. However, and in spite of its importance, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analyses of information security investments, regarding both single measures protecting information and ISMS processes, are not the focus of current research, which concentrates mostly on economics. This article aims to fill this research gap by proposing such an ISMS process framework as its main contribution, based on a set of agreed-upon ISMS processes in existing standards like the ISO 27000 series, COBIT and ITIL. Within the framework, the identified processes are described and their interactions and interfaces are specified. This framework helps to focus on the operation of the ISMS, instead of focusing on measures and controls. By this, as a main finding, the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS are strengthened.

  3. Comparison on information-seeking behavior of postgraduated students in Isfahan University of Medical Sciences and University of Isfahan in writing dissertation based on Kuhlthau model of information search process.

    Science.gov (United States)

    Abedi, Mahnaz; Ashrafi-Rizi, Hasan; Zare-Farashbandi, Firoozeh; Nouri, Rasoul; Hassanzadeh, Akbar

    2014-01-01

    Information-seeking behavior has been one of the main focuses of researchers seeking to identify and solve the problems users face in information retrieval. The aim of this research is a comparison of the information-seeking behavior of postgraduate students at Isfahan University of Medical Sciences and the University of Isfahan when writing dissertations, based on the Kuhlthau model of the information search process, in 2012. The research method is a survey and the data collection tool is the Narmenji questionnaire. The statistical population was all postgraduate students at Isfahan University of Medical Sciences and the University of Isfahan. The sample size was 196 people and sampling was stratified random. The statistical analyses were descriptive (mean and frequency) and inferential (independent t test and Pearson's correlation), and the software used was SPSS 20. The findings showed that students at Isfahan University of Medical Sciences followed 20% of the ordered steps of this model and students at the University of Isfahan did not follow the model. In the first (Initiation) and sixth (Presentation) stages of the feelings aspect, and in actions (across all stages), a significant difference was found between students of the two universities. A significant relationship was found between gender and both the fourth stage (Formulation) and the total feelings score of the Kuhlthau model. There was also a significant inverse relationship between the third stage (Exploration) of feelings and the age of the students. The results showed that in writing dissertations there were major differences between students of the two universities in following the Kuhlthau model, with significant differences in some of the stages of the feelings and actions of their information-seeking behavior.
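
    The inferential statistics named in the record (an independent t test and Pearson's correlation, run in SPSS 20) are standard; as a rough illustration of what those two tests compute, here is a minimal pure-Python sketch (function names and data are ours, not the study's, and significance testing against the t distribution is omitted):

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation between two equal-length samples.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def independent_t(x, y):
    # Student's t statistic for two independent samples (pooled variance).
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((a - mx) ** 2 for a in x) / (nx - 1)
    vy = sum((b - my) ** 2 for b in y) / (ny - 1)
    sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)
    return (mx - my) / math.sqrt(sp2 * (1 / nx + 1 / ny))
```

    In practice a statistics package would also return the p-value; the sketch only shows the statistics the study's comparisons are based on.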

  4. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    When developing an information system, it is important to create clear models and to choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL, and OCL rule-specification languages and the UML, DFD, CPN, EPC, IDEF3, and BPMN business process modeling languages. It presents a theoretical comparison of business rules and business process modeling languages and, according to the selected modeling aspects, compares sets of business process modeling languages and business rule representation languages. Finally, the best-fitting set of languages is selected for a three-layer framework for business-rule-based software modeling.

  5. Interactivity, Information Processing, and Learning on the World Wide Web.

    Science.gov (United States)

    Tremayne, Mark; Dunwoody, Sharon

    2001-01-01

    Examines the role of interactivity in the presentation of science news on the World Wide Web. Proposes and tests a model of interactive information processing that suggests that characteristics of users and Web sites influence interactivity, which influences knowledge acquisition. Describes use of a think-aloud method to study participants' mental…

  6. MODELING INFORMATION SYSTEM AVAILABILITY BY USING BAYESIAN BELIEF NETWORK APPROACH

    Directory of Open Access Journals (Sweden)

    Semir Ibrahimović

    2016-03-01

    Modern information systems are expected to be always on, providing services to end-users regardless of time and location. This is particularly important for organizations and industries where information systems support real-time operations and mission-critical applications that need to be available on a 24/7/365 basis. Examples of such entities include process industries, telecommunications, healthcare, energy, banking, electronic commerce, and a variety of cloud services. This article presents a modified Bayesian Belief Network model for predicting information system availability, introduced initially by Franke, U. and Johnson, P. in the article “Availability of enterprise IT systems – an expert-based Bayesian model” (Software Quality Journal, 20(2), 369–394, 2012). Based on a thorough review of several dimensions of information system availability, we propose a modified set of determinants. The model is parameterized using a probability elicitation process with the participation of experts from the financial sector of Bosnia and Herzegovina. The model validation was performed using Monte Carlo simulation.
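
    The record does not reproduce the network's structure, so as an illustration of how a Bayesian Belief Network predicts availability, here is a hypothetical two-determinant sketch using inference by enumeration (the priors, conditional probability table, and variable names are invented, not the paper's elicited values):

```python
# Invented sketch: system availability depends on hardware and software health.
P_HW = 0.99   # P(hardware healthy) - illustrative prior
P_SW = 0.95   # P(software healthy) - illustrative prior
P_AVAIL = {   # conditional probability table: P(available | hw, sw)
    (True, True): 0.999,
    (True, False): 0.40,
    (False, True): 0.30,
    (False, False): 0.01,
}

def p_available():
    # Inference by enumeration: marginalize availability over parent states.
    total = 0.0
    for hw in (True, False):
        for sw in (True, False):
            p_hw = P_HW if hw else 1.0 - P_HW
            p_sw = P_SW if sw else 1.0 - P_SW
            total += p_hw * p_sw * P_AVAIL[(hw, sw)]
    return total
```

    Expert elicitation, as in the article, would replace the invented numbers above; the enumeration step itself is unchanged.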

  7. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). The generation of complex parametric models requires new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational Basis Splines (NURBS) and multiple levels of detail (Mixed and ReverseLoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to two case studies: the BIM of a modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Castel Masegra in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  8. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

    Many methods are utilized to extract and process query results in the deep Web, relying on the different structures of Web pages and the various design modes of databases. However, some semantic meanings and relations are ignored. In this paper, we therefore present an approach for post-processing deep Web query results based on a domain ontology, which can utilize those semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks relevant to a specific domain after reducing noisy nodes. A feature vector of domain books is obtained by a result set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds the domain ontology on books, which not only removes the limit of Web page structures when extracting data information, but also makes use of the semantic meanings of the domain ontology. After extracting basic information from Web pages, a ranking algorithm is adopted to offer an ordered list of data records to users. Experimental results show that BIM and RSEM extract data blocks and build the domain ontology accurately. In addition, relevant data records and basic information are extracted and ranked. Precision and recall results show that our proposed method is feasible and efficient.
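
    The approach builds on the vector space model, where a common similarity measure is the cosine of the angle between term-frequency vectors. A minimal sketch of that building block (this is not the paper's BIM/RSEM implementation, just the underlying VSM similarity):

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    # Represent each document as a term-frequency vector (VSM),
    # then compute the cosine of the angle between the two vectors.
    va, vb = Counter(doc_a.split()), Counter(doc_b.split())
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

    Identical documents score 1.0 and documents with no shared terms score 0.0; a production system would typically weight terms by TF-IDF rather than raw counts.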

  9. The Application of Use Case Modeling in Designing Medical Imaging Information Systems

    International Nuclear Information System (INIS)

    Safdari, Reza; Farzi, Jebraeil; Ghazisaeidi, Marjan; Mirzaee, Mahboobeh; Goodini, Azadeh

    2013-01-01

    Introduction. This essay is aimed at examining the application of use case modeling in analyzing and designing information systems to support Medical Imaging services. Methods. The application of use case modeling in analyzing and designing health information systems was examined using electronic database (PubMed, Google Scholar) resources, and the characteristics of the modeling approach and its effect on the development and design of health information systems were analyzed. Results. The analysis indicated that provident modeling of health information systems should provide quick access to many health data resources, in a way that patients' data can be used to expand distance services and comprehensive Medical Imaging advice. These experiences also show that progress through the infrastructure development stages via a gradual and repeated evolution of user requirements is more robust, and this can shorten the requirements engineering cycle in the design of Medical Imaging information systems. Conclusion. The use case modeling approach can be effective in directing the problems of health and Medical Imaging information systems towards understanding, focusing on the start and analysis, better planning, repetition, and control.

  10. Cerebro-cerebellar interactions underlying temporal information processing.

    Science.gov (United States)

    Aso, Kenji; Hanakawa, Takashi; Aso, Toshihiko; Fukuyama, Hidenao

    2010-12-01

    The neural basis of temporal information processing remains unclear, but it is proposed that the cerebellum plays an important role through its internal clock or feed-forward computation functions. In this study, fMRI was used to investigate the brain networks engaged in perceptual and motor aspects of subsecond temporal processing without accompanying coprocessing of spatial information. Direct comparison between perceptual and motor aspects of time processing was made with a categorical-design analysis. The right lateral cerebellum (lobule VI) was active during a time discrimination task, whereas the left cerebellar lobule VI was activated during a timed movement generation task. These findings were consistent with the idea that the cerebellum contributed to subsecond time processing in both perceptual and motor aspects. The feed-forward computational theory of the cerebellum predicted increased cerebro-cerebellar interactions during time information processing. In fact, a psychophysiological interaction analysis identified the supplementary motor and dorsal premotor areas, which had a significant functional connectivity with the right cerebellar region during a time discrimination task and with the left lateral cerebellum during a timed movement generation task. The involvement of cerebro-cerebellar interactions may provide supportive evidence that temporal information processing relies on the simulation of timing information through feed-forward computation in the cerebellum.

  11. Modelling a uranium ore bioleaching process

    International Nuclear Information System (INIS)

    Chien, D.C.H.; Douglas, P.L.; Herman, D.H.; Marchbank, A.

    1990-01-01

    A dynamic simulation model for the bioleaching of uranium ore in a stope leaching process has been developed. The model incorporates design and operating conditions, reaction kinetics enhanced by the Thiobacillus ferrooxidans present in the leaching solution, and transport properties. Model predictions agree well with experimental data, with an average deviation of about ±3%. The model is sensitive to small errors in the estimates of fragment size and ore grade. Because accurate estimates are difficult to obtain, a parameter estimation approach was developed to update the values of fragment size and ore grade using on-line plant information.
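
    The record does not reproduce the paper's kinetic equations, but the general shape of such a dynamic simulation can be caricatured with first-order leaching kinetics integrated by forward Euler (the rate law, parameters, and function name below are illustrative only, not the published model):

```python
def simulate_leach(x0, k, dt, steps):
    # Forward-Euler integration of first-order leaching kinetics dx/dt = -k*x,
    # where x stands in for the mass of unleached mineral. A deliberately
    # simplified toy version of a dynamic stope-leaching simulation.
    x = x0
    history = [x]
    for _ in range(steps):
        x += dt * (-k * x)  # one explicit Euler step
        history.append(x)
    return history
```

    A parameter-estimation loop of the kind the abstract describes would repeatedly re-fit k (and, in the real model, fragment size and ore grade) so that simulated histories track on-line plant measurements.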

  12. Success and failure factors in the regional health information system design process--results from a constructive evaluation study.

    Science.gov (United States)

    Nykänen, P; Karimaa, E

    2006-01-01

    To identify success and failure factors in the design process of a regional health information system. A constructive evaluation study including interviews, observations, a usability study, and document analysis. Modelling was found to be a key element for the successful implementation of a health information system. The developed service chain model helped to define use cases and to implement seamless service chains. User participation in the design process was a success factor, resulting in good user acceptance and signs of positive impacts on work practices. The evaluation study also helped system developers to guide the system's further development. An important failure factor identified was the lack of semantic interoperability of the system components. The results emphasize the socio-technical nature of health information systems. The starting point for development should be thorough insight into the health care work practices where the information systems are to be used. Successful system design should start from modelling of work processes, data and information flows, and the definition of concepts and their relations. Health informatics as a scientific discipline provides theories and models for the design and development process.

  13. Multiple health risk perception and information processing among African Americans and whites living in poverty.

    Science.gov (United States)

    Hovick, Shelly R; Freimuth, Vicki S; Johnson-Turbes, Ashani; Chervin, Doryn D

    2011-11-01

    We investigated the risk-information-processing behaviors of people living at or near the poverty line. Because significant gaps in health and communication exist among high- and low-income groups, increasing the information seeking and knowledge of poor individuals may help them better understand risks to their health and increase their engagement in health-protective behaviors. Most earlier studies assessed only a single health risk selected by the researcher, whereas we listed 10 health risks and allowed the respondents to identify the one that they worried about most but took little action to prevent. Using this risk, we tested one pathway inspired by the risk information seeking and processing model to examine predictors of information insufficiency and of systematic processing and extended this pathway to include health-protective action. A phone survey was conducted of African Americans and whites living in the southern United States with an annual income of ≤$35,000 (N = 431). The results supported the model pathway: worry partially mediated the relationship between perceived risk and information insufficiency, which, in turn, increased systematic processing. In addition, systematic processing increased health-protective action. Compared with whites and better educated respondents, African Americans and respondents with little education had significantly higher levels of information insufficiency but higher levels of systematic processing and health-protective action. That systematic processing and knowledge influenced health behavior suggests a potential strategy for reducing health disparities. © 2011 Society for Risk Analysis.

  14. PHYSICAL RESOURCES OF INFORMATION PROCESSES AND TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Mikhail O. Kolbanev

    2014-11-01

    Subject of study. The paper describes basic information technologies for automating the information processes of data storage, distribution, and processing in terms of the required physical resources. It is shown that studying these processes with only the traditional objectives of modern computer science, such as the ability to transfer knowledge, the degree of automation, information security, coding, and reliability, is not enough. The reasons are, on the one hand, the increase in the volume and intensity of information exchange in subject areas of human activity and, on the other hand, the approach of semiconductor-based information systems to their efficiency limit. The creation of technologies that not only support information interaction but also consume a rational amount of physical resources has become an actual problem of modern engineering development. Thus, basic information technologies for the storage, distribution, and processing of information to support interaction between people are the object of study, and the physical temporal, spatial, and energy resources required for the implementation of these technologies are the subject of study. Approaches. An attempt is made to enlarge the possibilities of the traditional cybernetics methodology, which replaces consideration of the material component of information by a search over the states of information objects, by explicitly taking into account the amount of physical resources required for changes in the states of information media. Purpose of study. The paper works out a common approach to the comparison and subsequent selection of basic information technologies for the storage, distribution, and processing of data, taking into account not only the requirements for the quality of information exchange in a particular subject area and the degree of technology application, but also the amounts of consumed physical resources. Main findings. Classification of resources ...

  15. THE MODEL OF DISTINCTION OF ACCESS RIGHTS TO INFORMATION OBJECTS OF THE SYSTEM OF CONTROLLING OF BUSINESS PROCESSES OF AN AVIATION ENTERPRISE

    Directory of Open Access Journals (Sweden)

    Andrey V. Degtyarev

    2014-01-01

    Based on an analysis of the business process controlling system of an aviation enterprise, an approach was formulated for setting up a hierarchical model of personal permissions to the information resources of an automated system for the controlling of projects and contracts (ASCPC) at the instrumental and procedural levels. On the basis of this model, the structure of a personalized key was developed. The model reflects the capabilities of every category of users when working with the ASCPC.
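
    A hierarchical permission model of the kind described can be sketched as roles that inherit every grant of the roles beneath them. The role names and permissions below are invented for illustration and are not taken from the ASCPC:

```python
# Invented three-level hierarchy: administrator inherits from manager,
# manager inherits from operator.
INHERITS_FROM = {"administrator": "manager", "manager": "operator", "operator": None}
ROLE_GRANTS = {
    "administrator": {"delete_contract"},
    "manager": {"approve_contract"},
    "operator": {"view_contract"},
}

def permissions(role):
    # Collect the role's own grants plus everything inherited down the chain.
    perms = set()
    while role is not None:
        perms |= ROLE_GRANTS[role]
        role = INHERITS_FROM[role]
    return perms
```

    A personalized key, as in the abstract, would then encode which role (and hence which permission set) a given user holds.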

  16. Common data model for natural language processing based on two existing standard information models: CDA+GrAF.

    Science.gov (United States)

    Meystre, Stéphane M; Lee, Sanghoon; Jung, Chai Young; Chevrier, Raphaël D

    2012-08-01

    An increasing need for collaboration and resources sharing in the Natural Language Processing (NLP) research and development community motivates efforts to create and share a common data model and a common terminology for all information annotated and extracted from clinical text. We have combined two existing standards: the HL7 Clinical Document Architecture (CDA), and the ISO Graph Annotation Format (GrAF; in development), to develop such a data model entitled "CDA+GrAF". We experimented with several methods to combine these existing standards, and eventually selected a method wrapping separate CDA and GrAF parts in a common standoff annotation (i.e., separate from the annotated text) XML document. Two use cases, clinical document sections, and the 2010 i2b2/VA NLP Challenge (i.e., problems, tests, and treatments, with their assertions and relations), were used to create examples of such standoff annotation documents, and were successfully validated with the XML schemata provided with both standards. We developed a tool to automatically translate annotation documents from the 2010 i2b2/VA NLP Challenge format to GrAF, and automatically generated 50 annotation documents using this tool, all successfully validated. Finally, we adapted the XSL stylesheet provided with HL7 CDA to allow viewing annotation XML documents in a web browser, and plan to adapt existing tools for translating annotation documents between CDA+GrAF and the UIMA and GATE frameworks. This common data model may ease directly comparing NLP tools and applications, combining their output, transforming and "translating" annotations between different NLP applications, and eventually "plug-and-play" of different modules in NLP applications. Copyright © 2011 Elsevier Inc. All rights reserved.
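
    The essence of the standoff annotation described above is that annotations reference character offsets into the clinical text rather than being embedded in it. A minimal sketch using Python's standard library (element and attribute names here are ours for illustration, not the CDA or GrAF schema names):

```python
import xml.etree.ElementTree as ET

def standoff_annotation(text, spans):
    # spans: list of (start, end, label) character offsets into `text`.
    # The annotations live in a separate part of the document and point
    # back at the text by offset, in the spirit of GrAF-style standoff.
    root = ET.Element("annotationDocument")
    ET.SubElement(root, "text").text = text
    annotations = ET.SubElement(root, "annotations")
    for start, end, label in spans:
        node = ET.SubElement(annotations, "annotation",
                             start=str(start), end=str(end), label=label)
        node.text = text[start:end]  # redundant copy, kept for readability
    return ET.tostring(root, encoding="unicode")
```

    The real CDA+GrAF documents are validated against the XML schemata shipped with both standards; the point of the sketch is only the offset-based separation of text and annotation.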

  17. Informational and Causal Architecture of Discrete-Time Renewal Processes

    Directory of Open Access Journals (Sweden)

    Sarah E. Marzen

    2015-07-01

    Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use the resulting formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model but with an infinite-state ϵ-machine presentation. All in all, the results lay the groundwork for analyzing more complex processes with infinite statistical complexity and infinite excess entropy.
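
    Quantities such as statistical complexity (the entropy over causal states) and excess entropy are built from Shannon entropies of distributions like the interevent count distribution. The basic building block, sketched here, is the discrete Shannon entropy in bits:

```python
import math

def shannon_entropy(pmf):
    # H(P) = -sum_i p_i * log2(p_i), in bits.
    # Zero-probability outcomes contribute nothing and are skipped.
    return -sum(p * math.log2(p) for p in pmf if p > 0)
```

    For example, a uniform distribution over four interevent counts carries exactly 2 bits, while a deterministic interevent time carries 0 bits; the paper's formulae extend such sums to the (possibly infinite) causal-state distribution.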

  18. A Typology for Modeling Processes in Clinical Guidelines and Protocols

    Science.gov (United States)

    Tu, Samson W.; Musen, Mark A.

    We analyzed the graphical representations that are used by various guideline-modeling methods to express process information embodied in clinical guidelines and protocols. From this analysis, we distilled four modeling formalisms and the processes they typically model: (1) flowcharts for capturing problem-solving processes, (2) disease-state maps that link decision points in managing patient problems over time, (3) plans that specify sequences of activities that contribute toward a goal, (4) workflow specifications that model care processes in an organization. We characterized the four approaches and showed that each captures some aspect of what a guideline may specify. We believe that a general guideline-modeling system must provide explicit representation for each type of process.

  19. Review of "Conceptual Structures: Information Processing in Mind and Machine."

    Science.gov (United States)

    Smoliar, Stephen W.

    This review of the book, "Conceptual Structures: Information Processing in Mind and Machine," by John F. Sowa, argues that anyone who plans to get involved with issues of knowledge representation should have at least a passing acquaintance with Sowa's conceptual graphs for a database interface. (Used to model the underlying semantics of…

  20. Introduction to spiking neural networks: Information processing, learning and applications.

    Science.gov (United States)

    Ponulak, Filip; Kasinski, Andrzej

    2011-01-01

    The concept that neural information is encoded in the firing rate of neurons has been the dominant paradigm in neurobiology for many years. This paradigm has also been adopted by the theory of artificial neural networks. Recent physiological experiments demonstrate, however, that in many parts of the nervous system, neural code is founded on the timing of individual action potentials. This finding has given rise to the emergence of a new class of neural models, called spiking neural networks. In this paper we summarize basic properties of spiking neurons and spiking networks. Our focus is, specifically, on models of spike-based information coding, synaptic plasticity and learning. We also survey real-life applications of spiking models. The paper is meant to be an introduction to spiking neural networks for scientists from various disciplines interested in spike-based neural processing.
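
    A common entry point to the spiking models surveyed above is the leaky integrate-and-fire (LIF) neuron, in which information is carried by spike timing rather than firing rate alone. A minimal sketch in dimensionless units (parameters are arbitrary choices, not values from the paper):

```python
def lif_spike_times(current, threshold=1.0, tau=10.0, dt=0.1, t_max=100.0):
    # Leaky integrate-and-fire neuron: dv/dt = (-v + I) / tau.
    # The membrane potential v integrates the input current, leaks toward
    # zero, and emits a spike (then resets) whenever it crosses threshold.
    v, t, spikes = 0.0, 0.0, []
    while t < t_max:
        v += dt * (-v + current) / tau  # Euler step of the membrane equation
        if v >= threshold:
            spikes.append(round(t, 1))
            v = 0.0                     # reset after a spike
        t += dt
    return spikes
```

    Stronger input currents reach threshold sooner and therefore produce shorter interspike intervals; a subthreshold current never spikes at all, which is exactly the kind of timing-based code the record discusses.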

  1. Conjoint Management of Business Processes and Information Technologies

    DEFF Research Database (Denmark)

    Siurdyban, Artur

    ... and improve business processes. As a consequence, there is a growing need to address managerial aspects of the relationships between information technologies and business processes. The aim of this PhD study is to investigate how the practice of conjoint management of business processes and information ... technologies can be supported and improved. The study is organized into five research papers and this summary. Each paper addresses a different aspect of conjoint management of business processes and information technologies, i.e. problem development and managerial practices on software ... and information technologies in a project environment. It states that both elements are intrinsically related and should be designed and considered together. The second case examines the relationships between information technology management and business process management. It discusses the multi-faceted role ...

  2. Modeling Dynamic Food Choice Processes to Understand Dietary Intervention Effects.

    Science.gov (United States)

    Marcum, Christopher Steven; Goldring, Megan R; McBride, Colleen M; Persky, Susan

    2018-02-17

    Meal construction is largely governed by nonconscious and habit-based processes that can be represented as a collection of individual, micro-level food choices that eventually give rise to a final plate. Despite this, dietary behavior intervention research rarely captures these micro-level food choice processes, instead measuring outcomes at aggregated levels. This is due in part to a dearth of analytic techniques for modeling these dynamic time-series events. The current article addresses this limitation by applying a generalization of the relational event framework to model micro-level food choice behavior following an educational intervention. Relational event modeling was used to model the food choices that 221 mothers made for their child following receipt of an information-based intervention. Participants were randomized to receive either (a) control information; (b) childhood obesity risk information; or (c) childhood obesity risk information plus a personalized family history-based risk estimate for their child. Participants then made food choices for their child in a virtual reality-based food buffet simulation. Micro-level aspects of the built environment, such as the ordering of each food in the buffet, were influential. Other dynamic processes such as choice inertia also influenced food selection. Among participants receiving the strongest intervention condition, choice inertia decreased and the overall rate of food selection increased. Modeling food selection processes can elucidate the points at which interventions exert their influence. Researchers can leverage these findings to gain insight into nonconscious and uncontrollable aspects of food selection that influence dietary outcomes, which can ultimately improve the design of dietary interventions.

  3. Synergistic Information Processing Encrypts Strategic Reasoning in Poker.

    Science.gov (United States)

    Frey, Seth; Albino, Dominic K; Williams, Paul L

    2018-06-14

    There is a tendency in decision-making research to treat uncertainty only as a problem to be overcome. But it is also a feature that can be leveraged, particularly in social interaction. Comparing the behavior of profitable and unprofitable poker players, we reveal a strategic use of information processing that keeps decision makers unpredictable. To win at poker, a player must exploit public signals from others. But using public inputs makes it easier for an observer to reconstruct that player's strategy and predict his or her behavior. How should players trade off between exploiting profitable opportunities and remaining unexploitable themselves? Using a recent multivariate approach to information theoretic data analysis and 1.75 million hands of online two-player No-Limit Texas Hold'em, we find that the important difference between winning and losing players is not in the amount of information they process, but how they process it. In particular, winning players are better at integrative information processing: creating new information from the interaction between their cards and their opponents' signals. We argue that integrative information processing does not just produce better decisions, it makes decision-making harder for others to reverse engineer, as an expert poker player's cards act like the private key in public-key cryptography. Poker players encrypt their reasoning with the way they process information. The encryption function of integrative information processing makes it possible for players to exploit others while remaining unexploitable. By recognizing the act of information processing as a strategic behavior in its own right, we offer a detailed account of how experts use endemic uncertainty to conceal their intentions in high-stakes competitive environments, and we highlight new opportunities between cognitive science, information theory, and game theory. Copyright © 2018 Cognitive Science Society, Inc.
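
    The multivariate information-theoretic quantities used in this line of work (such as synergy in a partial information decomposition) are built on top of mutual information. A minimal sketch of mutual information computed from a joint distribution (the toy distributions in the usage note are ours, not poker data):

```python
import math

def mutual_information(joint):
    # joint: dict mapping (x, y) -> probability. Returns I(X; Y) in bits.
    px, py = {}, {}
    for (x, y), p in joint.items():          # accumulate the marginals
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)
```

    Independent variables yield 0 bits and a perfect binary copy yields 1 bit; synergy measures then ask how much of the information about a target is available only from several sources jointly, not from any one alone.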

  4. Modelling information dissemination under privacy concerns in social media

    Science.gov (United States)

    Zhu, Hui; Huang, Cheng; Lu, Rongxing; Li, Hui

    2016-05-01

    Social media has recently become an important platform for users to share news, express views, and post messages. However, due to user privacy preservation in social media, many privacy setting tools are employed, which inevitably change the patterns and dynamics of information dissemination. In this study, a general stochastic model using dynamic evolution equations was introduced to illustrate how privacy concerns impact the process of information dissemination. Extensive simulations and analyses involving the privacy settings of general users, privileged users, and pure observers were conducted on real-world networks, and the results demonstrated that user privacy settings affect information dissemination differently. Finally, we also studied the process of information diffusion analytically and numerically with different privacy settings using two classic networks.
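
    The effect of privacy settings on spreading can be caricatured in mean-field form, with the privacy level scaling down the effective contact rate. The update rule and parameters below are illustrative only, not the paper's evolution equations:

```python
def informed_fraction(beta, privacy, i0=0.01, steps=50):
    # Mean-field spreading sketch: the informed fraction i grows through
    # contacts between informed and uninformed users, and privacy settings
    # in [0, 1] scale the effective contact rate by (1 - privacy).
    i = i0
    history = [i]
    for _ in range(steps):
        i = min(i + beta * (1 - privacy) * i * (1 - i), 1.0)
        history.append(i)
    return history
```

    With the same base rate beta, a population with strict privacy settings (privacy close to 1) reaches a much smaller informed fraction in the same time, which is the qualitative effect the study examines on real networks.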

  5. Processing of recognition information and additional cues: A model-based analysis of choice, confidence, and response time

    Directory of Open Access Journals (Sweden)

    Andreas Glockner

    2011-02-01

    Research on the processing of recognition information has focused on testing the recognition heuristic (RH). On the aggregate, the noncompensatory use of recognition information postulated by the RH was rejected in several studies, while the RH could still account for a considerable proportion of choices. These results can be explained if either (a) a part of the subjects used the RH or (b) nobody used it but its choice predictions were accidentally in line with the predictions of the strategy actually used. In the current study, which exemplifies a new approach to model testing, we determined individuals' decision strategies based on a maximum-likelihood classification method, taking into account choices, response times, and confidence ratings simultaneously. Unlike most previous studies of the RH, our study tested the RH under conditions in which we provided information about the cue values of unrecognized objects (which we argue is fairly common and thus of some interest). For 77.5% of the subjects, overall behavior was best explained by a compensatory parallel constraint satisfaction (PCS) strategy. The proportion of subjects using an enhanced RH heuristic (RHe) was negligible (up to 7.5%); 15% of the subjects seemed to use a take-the-best strategy (TTB). A more fine-grained analysis of the supplemental behavioral parameters conditional on strategy use supports PCS but calls into question the process assumptions for apparent users of RH, RHe, and TTB within our experimental context. Our results are consistent with previous literature highlighting the importance of individual strategy classification as compared to aggregated analyses.
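
    The core of a maximum-likelihood strategy classification is to score each candidate strategy's predicted choice probabilities against a subject's observed choices and assign the best-scoring strategy. The sketch below uses choices only, whereas the study also incorporated response times and confidence ratings; strategy names and numbers are invented:

```python
import math

def classify_strategy(choices, strategy_preds):
    # choices: observed choices coded 0/1 per trial.
    # strategy_preds: strategy name -> per-trial probability of choosing 1
    # (probabilities strictly inside (0, 1), so an application-error rate
    # is already folded in). Returns the maximum-likelihood strategy.
    best_name, best_ll = None, float("-inf")
    for name, preds in strategy_preds.items():
        ll = sum(math.log(p if choice == 1 else 1.0 - p)
                 for choice, p in zip(choices, preds))
        if ll > best_ll:
            best_name, best_ll = name, ll
    return best_name
```

    Extending the likelihood with response-time and confidence terms, as in the study, sharpens the classification when two strategies predict the same choices.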

  6. Quantum information processing beyond ten ion-qubits

    International Nuclear Information System (INIS)

    Monz, T.

    2011-01-01

    Successful processing of quantum information is, to a large degree, based on two aspects: (a) the implementation of high-fidelity quantum gates, as well as (b) avoiding or suppressing decoherence processes that destroy quantum information. The presented work shows our progress in the field of experimental quantum information processing over the last years: the implementation and characterisation of several quantum operations, amongst others the first realisation of the quantum Toffoli gate in an ion-trap-based quantum computer. The creation of entangled states with up to 14 qubits serves as a basis for investigations of decoherence processes. Based on the realised quantum operations as well as the knowledge about dominant noise processes in the employed apparatus, entanglement swapping as well as quantum operations within a decoherence-free subspace are demonstrated. (author)
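    As background for the record above, the logical action of the Toffoli (CCNOT) gate mentioned there can be checked on computational basis states. This sketch shows only the gate's truth-table permutation, not the trapped-ion implementation:

```python
# Toffoli (CCNOT) on 3 qubits: flips the target qubit exactly when
# both control qubits are 1. Basis states |c1 c2 t> are indexed as
# the integer c1*4 + c2*2 + t.
def toffoli(state_index):
    c1 = (state_index >> 2) & 1
    c2 = (state_index >> 1) & 1
    t = state_index & 1
    if c1 and c2:
        t ^= 1  # conditional NOT on the target
    return (c1 << 2) | (c2 << 1) | t

# As a permutation of the 8 basis states, only |110> and |111> swap.
perm = [toffoli(i) for i in range(8)]
```

Because the gate is its own inverse, applying it twice returns every basis state to itself, a quick sanity check on the permutation.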

  7. An experimental approach to estimate operator’s information processing capacity for diagnosing tasks in NPPs

    International Nuclear Information System (INIS)

    Kim, Ji Tae; Shin, Seung Ki; Kim, Jong Hyun; Seong, Poong Hyun

    2013-01-01

    Highlights: • Main control room operator’s information processing capacity is determined. • The relationship between the information processing capacity and human factors is described. • The information processing capacity results from the subjective and physiological measures are nearly identical. - Abstract: The objectives of this research are: (1) to determine the information processing capacity of an operator in a main control room and (2) to describe the relationship between the information processing capacity and human factors. This research centers on the relationship, as experimentally determined, between an operator’s mental workload and information flow during accident diagnosis tasks at nuclear power plants. Based on this relationship, the operator’s information processing capacity is established. In this paper, the information processing capacity is defined as the operator’s ability to process information, in bits per second, when diagnosing tasks or accidents. If the operator’s performance decreases rapidly as the information flow rate (bit/s) increases, it is possible to determine the operator’s information processing capacity. The cognitive information of a diagnosis task can be quantified using an information flow model, and the operator’s mental workload is measured by subjective and physiological measures. NASA-TLX (Task Load indeX) is selected as the subjective method and an eye tracking system is used as the physiological measure of workload. In addition, the information processing capacity as related to human factors is investigated. Once the information processing capacity of operators is known, it will be possible to apply it to predict the operators’ performance, design diagnosis tasks, and design the human–machine interface.
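    The record names NASA-TLX as its subjective workload measure. A minimal sketch of the standard NASA-TLX scoring, a weighted average of six subscale ratings with weights derived from 15 pairwise comparisons, follows; the ratings and tallies below are made-up illustrative values, not data from the study:

```python
def nasa_tlx(ratings, tally):
    """Overall NASA-TLX workload score: weighted average of the six
    subscale ratings (each 0-100). Weights come from 15 pairwise
    comparisons; a subscale's tally is the number of times it was
    judged the larger workload contributor, so tallies sum to 15."""
    assert set(ratings) == set(tally) and sum(tally.values()) == 15
    return sum(ratings[s] * tally[s] for s in ratings) / 15.0

# Hypothetical ratings and comparison tallies for one diagnosis task.
ratings = {"mental": 80, "physical": 20, "temporal": 60,
           "performance": 40, "effort": 70, "frustration": 30}
tally = {"mental": 5, "physical": 0, "temporal": 3,
         "performance": 2, "effort": 4, "frustration": 1}
score = nasa_tlx(ratings, tally)  # weighted toward the mental-demand rating
```

The resulting score stays on the 0-100 scale, which is what lets it be plotted against the information flow rate (bit/s) to locate the capacity knee described in the abstract.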

  8. Design and analysis of information model hotel complex

    Directory of Open Access Journals (Sweden)

    Garyaev Nikolai

    2016-01-01

    Full Text Available The article analyzes innovation in 3D modeling and the development of process design approaches based on visualization of information technology and computer-aided design systems. The problems arising in modern design, and approaches to addressing them, are also discussed.

  9. The relations among body consciousness, somatic symptom report, and information processing speed in chronic fatigue syndrome.

    NARCIS (Netherlands)

    Werf, S.P. van der; Vree, B.P.W. de; Meer, J.W.M. van der; Bleijenberg, G.

    2002-01-01

    OBJECTIVE: The aim of this study was to assess the potential influence of body consciousness and levels of somatic symptom report upon information processing speed in patients with chronic fatigue syndrome (CFS). BACKGROUND: According to a model of a fixed information processing capacity, it was

  10. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    Science.gov (United States)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in the common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics for Business Vocabularies and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.

  11. Bridging the Operational Divide: An Information-Processing Model of Internal Supply Chain Integration

    Science.gov (United States)

    Rosado Feger, Ana L.

    2009-01-01

    Supply Chain Management, the coordination of upstream and downstream flows of product, services, finances, and information from a source to a customer, has risen in prominence over the past fifteen years. The delivery of a product to the consumer is a complex process requiring action from several independent entities. An individual firm consists…

  12. Development of Energy Models for Production Systems and Processes to Inform Environmentally Benign Decision-Making

    Science.gov (United States)

    Diaz-Elsayed, Nancy

    Between 2008 and 2035, global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy intensive" sectors still consumed 4 × 10^9 GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but having a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the

  13. Introduction to information processing

    CERN Document Server

    Deitel, Harvey M

    2014-01-01

    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers for the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte

  14. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory, such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio, can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
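    Of the foundational concepts the record lists, Shannon entropy is the most compact to state concretely. A minimal sketch (the distributions below are illustrative, not from the article):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.
    Zero-probability outcomes contribute nothing and are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A uniform 8-way choice carries 3 bits; a near-deterministic
# distribution carries far less -- one way such metrics can
# quantify how informative a signal or neural response is.
h_uniform = entropy([1 / 8] * 8)          # 3.0 bits
h_skewed = entropy([0.97] + [0.005] * 6)  # well under 1 bit
```

Entropy is maximal for the uniform distribution and shrinks as the distribution concentrates, which is the property that makes it usable as an integrity metric for the coding schemes the article discusses.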

  15. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge volume and diversity of information and information assets produced in different systems, such as knowledge management systems (KMS), financial and accounting systems, official and industrial automation systems, and so on, and the protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. Several benefits of this model have given organizations a strong incentive to implement cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, first, following the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets, comprising 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan the corresponding security controls according to the importance of each group. Then, to guide an organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, the resulting proposed methodology, and the presented classification model were discussed and verified according to the Delphi method and experts' comments.

  16. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  17. An experimental approach to estimation of human information processing capacity for diagnosis tasks in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Tae

    2006-02-15

    root cause of the faults or accidents. In this research, using both an information flow model and Conant's model, the cognitive information of a diagnosis task can be quantified. Information flow models of eight accident diagnosis tasks are analyzed and quantified. The eight accident cases considered in this research are: Reactor Coolant Pump (RCP) trip, Pressurizer (PZR) spray failure, Main Steam Isolation Valve (MSIV) failure, PZR spray and heater failure, Feedwater Line Break (FLB), Loss Of Coolant Accident (LOCA), Steam Generator Tube Rupture (SGTR), and Steam Line Break (SLB). As mentioned above, to determine the information processing capacity, the workload and an information flow model are used. To measure the operator's workload, subjective and physiological (objective) measures are considered. Subjective methods are used in various fields including aviation, the automobile industry, and the nuclear industry. In this research, NASA-TLX (Task Load indeX) is selected as the subjective method. NASA-TLX is a multi-dimensional rating technique that provides an overall workload score based on the weighted average of ratings on six subscales. An eye tracking system is used as a physiological (objective) measure of workload in this research. The duration of eye closures and the rate of eye blinks decrease when the mental demand of a task increases. As visual input is disabled during eye closure, reduced blink rates help to maintain continuous visual input when high levels of visual attention are required. Through the experiments, the subjects were divided into two groups according to their gaze on areas of interest. The two groups were a skilled group and a less skilled group, and the information processing capacity of each group was found respectively. The information processing capacity results from the subjective method (NASA-TLX) and the physiological measure (the eye tracking system) were nearly identical. In addition, the information processing capacity related to human

  18. An experimental approach to estimation of human information processing capacity for diagnosis tasks in NPPs

    International Nuclear Information System (INIS)

    Kim, Ji Tae

    2006-02-01

    research, using both an information flow model and Conant's model, the cognitive information of a diagnosis task can be quantified. Information flow models of eight accident diagnosis tasks are analyzed and quantified. The eight accident cases considered in this research are: Reactor Coolant Pump (RCP) trip, Pressurizer (PZR) spray failure, Main Steam Isolation Valve (MSIV) failure, PZR spray and heater failure, Feedwater Line Break (FLB), Loss Of Coolant Accident (LOCA), Steam Generator Tube Rupture (SGTR), and Steam Line Break (SLB). As mentioned above, to determine the information processing capacity, the workload and an information flow model are used. To measure the operator's workload, subjective and physiological (objective) measures are considered. Subjective methods are used in various fields including aviation, the automobile industry, and the nuclear industry. In this research, NASA-TLX (Task Load indeX) is selected as the subjective method. NASA-TLX is a multi-dimensional rating technique that provides an overall workload score based on the weighted average of ratings on six subscales. An eye tracking system is used as a physiological (objective) measure of workload in this research. The duration of eye closures and the rate of eye blinks decrease when the mental demand of a task increases. As visual input is disabled during eye closure, reduced blink rates help to maintain continuous visual input when high levels of visual attention are required. Through the experiments, the subjects were divided into two groups according to their gaze on areas of interest. The two groups were a skilled group and a less skilled group, and the information processing capacity of each group was found respectively. The information processing capacity results from the subjective method (NASA-TLX) and the physiological measure (the eye tracking system) were nearly identical. In addition, the information processing capacity related to human factors was investigated

  19. Development and validation of the social information processing application: a Web-based measure of social information processing patterns in elementary school-age boys.

    Science.gov (United States)

    Kupersmidt, Janis B; Stelter, Rebecca; Dodge, Kenneth A

    2011-12-01

    The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and ethnically diverse sample of 244 boys ages 8 through 12 (M = 9.4) from public elementary schools in 3 states. The SIP-AP includes 8 videotaped vignettes, filmed from the first-person perspective, that depict common misunderstandings among boys. Each vignette shows a negative outcome for the victim and ambiguous intent on the part of the perpetrator. Boys responded to 16 Web-based questions representing the 5 social information processing mechanisms, after viewing each vignette. Parents and teachers completed measures assessing boys' antisocial behavior. Confirmatory factor analyses revealed that a model positing the original 5 cognitive mechanisms fit the data well when the items representing prosocial cognitions were included on their own factor, creating a 6th factor. The internal consistencies for each of the 16 individual cognitions as well as for the 6 cognitive mechanism scales were excellent. Boys with elevated scores on 5 of the 6 cognitive mechanisms exhibited more antisocial behavior than boys whose scores were not elevated. These findings highlight the need for further research on the measurement of prosocial cognitions or cognitive strengths in boys in addition to assessing cognitive deficits. Findings suggest that the SIP-AP is a reliable and valid tool for use in future research of social information processing skills in boys.

  20. Information Processing and Human Abilities

    Science.gov (United States)

    Kirby, John R.; Das, J. P.

    1978-01-01

    The simultaneous and successive processing model of cognitive abilities was compared to a traditional primary mental abilities model. Simultaneous processing was found to be primarily related to spatial ability and, to a lesser extent, to memory and inductive reasoning. Subjects were 104 fourth-grade urban males. (Author/GDC)

  1. Visual speech information: a help or hindrance in perceptual processing of dysarthric speech.

    Science.gov (United States)

    Borrie, Stephanie A

    2015-03-01

    This study investigated the influence of visual speech information on perceptual processing of neurologically degraded speech. Fifty listeners identified spastic dysarthric speech under both audio (A) and audiovisual (AV) conditions. Condition comparisons revealed that the addition of visual speech information enhanced processing of the neurologically degraded input in terms of (a) acuity (percent phonemes correct) of vowels and consonants and (b) recognition (percent words correct) of predictive and nonpredictive phrases. Listeners exploited stress-based segmentation strategies more readily in AV conditions, suggesting that the perceptual benefit associated with adding visual speech information to the auditory signal-the AV advantage-has both segmental and suprasegmental origins. Results also revealed that the magnitude of the AV advantage can be predicted, to some degree, by the extent to which an individual utilizes syllabic stress cues to inform word recognition in AV conditions. Findings inform the development of a listener-specific model of speech perception that applies to processing of dysarthric speech in everyday communication contexts.

  2. Language Learning Strategies and English Proficiency: Interpretations from Information-Processing Theory

    Science.gov (United States)

    Rao, Zhenhui

    2016-01-01

    The research reported here investigated the relationship between students' use of language learning strategies and their English proficiency, and then interpreted the data from two models in information-processing theory. Results showed that the students' English proficiency significantly affected their use of learning strategies, with high-level…

  3. Quantitative Model for Economic Analyses of Information Security Investment in an Enterprise Information System

    Directory of Open Access Journals (Sweden)

    Bojanc Rok

    2012-11-01

    Full Text Available The paper presents a mathematical model for the optimal security-technology investment evaluation and decision-making processes based on the quantitative analysis of security risks and digital asset assessments in an enterprise. The model makes use of the quantitative analysis of different security measures that counteract individual risks by identifying the information system processes in an enterprise and the potential threats. The model comprises the target security levels for all identified business processes and the probability of a security accident together with the possible loss the enterprise may suffer. The selection of security technology is based on the efficiency of selected security measures. Economic metrics are applied for the efficiency assessment and comparative analysis of different protection technologies. Unlike the existing models for evaluation of the security investment, the proposed model allows direct comparison and quantitative assessment of different security measures. The model allows deep analyses and computations providing quantitative assessments of different options for investments, which translate into recommendations facilitating the selection of the best solution and the decision-making thereof. The model was tested using empirical examples with data from real business environment.
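    The record does not reproduce the paper's formulas. Two economic metrics commonly used in this kind of security-investment analysis are Annualized Loss Expectancy (ALE) and Return on Security Investment (ROSI), sketched here with hypothetical numbers rather than the paper's model:

```python
def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """Annualized Loss Expectancy: expected yearly loss from a threat."""
    return single_loss_expectancy * annual_rate_of_occurrence

def rosi(ale_before, ale_after, annual_cost):
    """Return on Security Investment: net annual risk reduction
    achieved by a security measure, per unit of its annual cost."""
    return (ale_before - ale_after - annual_cost) / annual_cost

# Hypothetical scenario: a threat causing $50k per incident at
# 0.4 incidents/year, mitigated to 0.05/year by a control
# costing $8k/year.
before = ale(50_000, 0.4)    # 20000.0
after = ale(50_000, 0.05)    # 2500.0
roi = rosi(before, after, 8_000)  # 1.1875 -> positive, so worthwhile
```

Metrics like these allow the direct, quantitative comparison of alternative security measures that the abstract highlights as the model's advantage over earlier evaluation approaches.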

  4. Study on intelligent processing system of man-machine interactive garment frame model

    Science.gov (United States)

    Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian

    2018-05-01

    A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor devices, a voice processing module, mechanical parts, and a data acquisition device. The sensor devices collect information on the environmental changes produced by a body near the clothes frame model; the data acquisition device gathers the information sensed by the sensor devices; the voice processing module performs speaker-independent speech recognition to achieve human-machine interaction; and the mechanical moving parts make the corresponding mechanical responses to the information processed by the data acquisition device. The sensor devices have a one-way connection to the data acquisition device, the data acquisition device has a two-way connection with the voice processing module, and the data acquisition device has a one-way connection to the mechanical moving parts. The intelligent processing system can judge whether it needs to interact with the customer, realizing man-machine interaction instead of the current rigid frame model.

  5. Effect of warning placement on the information processing of college students reading an OTC drug facts panel.

    Science.gov (United States)

    Bhansali, Archita H; Sangani, Darshan S; Mhatre, Shivani K; Sansgiry, Sujit S

    2018-01-01

    To compare three over-the-counter (OTC) Drug Facts panel versions for information processing optimization among college students. University of Houston students (N = 210) participated in a cross-sectional survey from January to May 2010. A current FDA label was compared to two experimental labels, developed using CHREST theory, to test information processing by repositioning the warning information within the Drug Facts panel. Congruency was defined as placing like information together. Information processing was evaluated using the OTC medication Label Evaluation Process Model (LEPM): label comprehension, ease of use, attitude toward the product, product evaluation, and purchase intention. The experimental label with chunked congruent information (uses, directions, other information, warnings) was rated significantly higher than the current FDA label and had the best average scores among the LEPM information processing variables. If replications uphold these findings, the FDA label design might be revised to improve information processing.

  6. A New Technique For Information Processing of CLIC Technical Documentation

    CERN Document Server

    Tzermpinos, Konstantinos

    2013-01-01

    The scientific work presented in this paper could be described as a novel, systemic approach to the process of organizing CLIC documentation. The latter refers to the processing of various sets of archived data found on various CERN archiving services in a more friendly and organized way. From a physics perspective, this is equivalent to having an initial system characterized by high entropy which, after some transformation of energy and matter, will produce a final system of reduced entropy. However, this reduction in entropy can be considered valid only for open systems, which are sub-systems of grander isolated systems in which the total entropy will always increase. Thus, using basic elements from information theory, systems theory, and thermodynamics, the unorganized form of the data pending organization into a higher form is modeled as an initial open sub-system with increased entropy, which, after the processing of information, will produce a final system with decreased entropy. This systemic approach to the ...

  7. Variational estimation of process parameters in a simplified atmospheric general circulation model

    Science.gov (United States)

    Lv, Guokun; Koehl, Armin; Stammer, Detlef

    2016-04-01

    Parameterizations are used to simulate the effects of unresolved sub-grid-scale processes in current state-of-the-art climate models. The values of the process parameters, which determine the model's climatology, are usually adjusted manually to reduce the difference between the model mean state and the observed climatology. This process requires detailed knowledge of the model and its parameterizations. In this work, a variational method was used to estimate process parameters in the Planet Simulator (PlaSim). The adjoint code was generated using automatic differentiation of the source code. Some hydrological processes were switched off to remove the influence of zero-order discontinuities. In addition, the nonlinearity of the model limits the feasible assimilation window to about 1 day, which is too short to tune the model's climatology. To extend the feasible assimilation window, nudging terms for all state variables were added to the model's equations, which essentially suppress all unstable directions. In identical twin experiments, we found that the feasible assimilation window could be extended to over 1 year and accurate parameters could be retrieved. Although the nudging terms transform into a damping of the adjoint variables and therefore tend to erase the information of the data over time, assimilating climatological information is shown to provide sufficient information on the parameters. Moreover, the mechanism of this regularization is discussed.
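    A nudging term of the kind described in the record adds a relaxation toward the observed state, dx/dt = f(x) + (x_obs - x)/tau, which damps unstable directions. A minimal one-variable sketch (the dynamics, time step, and relaxation time below are made up for illustration, not taken from PlaSim):

```python
def step_nudged(x, f, x_obs, tau, dt):
    """One forward-Euler step of dx/dt = f(x) + (x_obs - x)/tau.
    The relaxation ("nudging") term pulls the trajectory toward the
    observed state, suppressing growth along unstable directions so
    that a longer assimilation window remains tractable."""
    return x + dt * (f(x) + (x_obs - x) / tau)

# Toy unstable dynamics f(x) = 0.5*x: without nudging the state grows
# exponentially; with a strong nudging term (tau = 0.5) it relaxes
# toward the observation x_obs = 0 instead.
f = lambda x: 0.5 * x
x = 1.0
for _ in range(200):
    x = step_nudged(x, f, x_obs=0.0, tau=0.5, dt=0.01)
```

Here the effective linear rate is 0.5 - 1/tau = -1.5, so the nudged trajectory decays toward the observation even though the underlying dynamics are unstable, which is the stabilizing mechanism the abstract relies on.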

  8. MarsSI: Martian surface data processing information system

    Science.gov (United States)

    Quantin-Nataf, C.; Lozac'h, L.; Thollot, P.; Loizeau, D.; Bultel, B.; Fernando, J.; Allemand, P.; Dubuffet, F.; Poulet, F.; Ody, A.; Clenet, H.; Leyrat, C.; Harrisson, S.

    2018-01-01

    MarsSI (acronym for Mars System of Information; https://emars.univ-lyon1.fr/MarsSI/) is a web Geographic Information System application which helps manage and process martian orbital data. The MarsSI facility is part of the web portal called PSUP (Planetary SUrface Portal) developed by the Observatories of Paris Sud (OSUPS) and Lyon (OSUL) to provide users with efficient and easy access to data products dedicated to the martian surface. The portal proposes (1) the management and processing of data thanks to MarsSI and (2) the visualization and merging of high-level (imagery, spectral, and topographic) products and catalogs via a web-based user interface (MarsVisu). The portal PSUP, as well as the facility MarsVisu, is detailed in a companion paper (Poulet et al., 2018). The purpose of this paper is to describe the MarsSI facility. From this application, users are able to easily and rapidly select observations, process raw data via automatic pipelines, and get back final products which can be visualized under Geographic Information Systems. Moreover, MarsSI also contains an automatic stereo-restitution pipeline in order to produce Digital Terrain Models (DTM) on demand from HiRISE (High Resolution Imaging Science Experiment) or CTX (Context Camera) pair-images. This application is funded by the European Union's Seventh Framework Programme (FP7/2007-2013) (ERC project eMars, No. 280168) and has been developed in the scope of Mars, but the design is applicable to any other planetary body of the solar system.

  9. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experience author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  10. Social Information Processing Mechanisms and Victimization: A Literature Review.

    Science.gov (United States)

    van Reemst, Lisa; Fischer, Tamar F C; Zwirs, Barbara W C

    2016-01-01

    The aim of the current literature review, which is based on 64 empirical studies, was to assess to what extent mechanisms of the Social Information Processing (SIP) model of Crick and Dodge (1994) are related to victimization. The reviewed studies have provided support for the relation between victimization and several social information processing mechanisms, especially the interpretation of cues and self-efficacy (as part of the response decision). The relationship between victimization and other mechanisms, such as the response generation, was only studied in a few articles. Until now research has often focused on just one step of the model, instead of attempting to measure the associations between multiple mechanisms and victimization in multivariate analyses. Such analyses would be interesting to gain more insight into the SIP model and its relationship with victimization. The few available longitudinal studies show that mechanisms both predict victimization (internal locus of control, negative self-evaluations and less assertive response selection) and are predicted by victimization (hostile attribution of intent and negative evaluations of others). Associations between victimization and SIP mechanisms vary across different types and severity of victimization (stronger in personal and severe victimization), and different populations (stronger among young victims). Practice could focus on these stronger associations and the interpretation of cues. More research is needed however, to investigate whether intervention programs that address SIP mechanisms are suitable for victimization and all relevant populations. © The Author(s) 2014.

  11. Testing an alternate informed consent process.

    Science.gov (United States)

    Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra

    2009-01-01

    One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.

  12. Occurrence reporting and processing of operations information

    International Nuclear Information System (INIS)

    1997-01-01

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  13. Occurrence reporting and processing of operations information

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-21

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  14. Information density converges in dialogue: Towards an information-theoretic model.

    Science.gov (United States)

    Xu, Yang; Reitter, David

    2018-01-01

    The principle of entropy rate constancy (ERC) states that language users distribute information such that words tend to be equally predictable given previous contexts. We examine the applicability of this principle to spoken dialogue, as previous findings primarily rest on written text. The study takes into account the joint-activity nature of dialogue and the topic shift mechanisms that are different from monologue. It examines how the information contributions from the two dialogue partners interactively evolve as the discourse develops. The increase of local sentence-level information density (predicted by ERC) is shown to apply to dialogue overall. However, when the different roles of interlocutors in introducing new topics are identified, their contribution in information content displays a new converging pattern. We draw explanations for this pattern from multiple perspectives: Casting dialogue as an information exchange system would mean that the pattern is the result of two interlocutors maintaining their own context rather than sharing one. Second, we present some empirical evidence that a model of Interactive Alignment may include information density to explain the effect. Third, we argue that building common ground is a process analogous to information convergence. Thus, we put forward an information-theoretic view of dialogue, under which some existing theories of human dialogue may eventually be unified. Copyright © 2017 Elsevier B.V. All rights reserved.
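
The entropy-rate analysis above rests on estimating per-word information content from a language model. A minimal sketch of that measurement, using a simple unigram model as a stand-in for the richer n-gram and sentence-level estimators such studies use (the corpus and numbers are purely illustrative):

```python
import math
from collections import Counter

def per_sentence_density(sentences):
    """Mean information content (bits per word) of each sentence under a
    unigram model estimated from the whole corpus -- a crude stand-in for
    the n-gram estimators used in entropy-rate studies."""
    counts = Counter(w for s in sentences for w in s.split())
    total = sum(counts.values())
    def bits(w):
        return -math.log2(counts[w] / total)   # surprisal of one word
    return [sum(bits(w) for w in s.split()) / len(s.split())
            for s in sentences]

corpus = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "quantum entanglement puzzled the dog",
]
densities = per_sentence_density(corpus)
print(densities)   # rare words make the last sentence the densest
```

Comparing these per-sentence densities across positions in a dialogue is the kind of measurement the ERC principle makes predictions about.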

  15. Building information modelling (BIM): now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2015-10-01

    Full Text Available Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with focus on its core concepts, applications in the project life cycle and benefits for project stakeholders with the help of case studies. The paper also elaborates risks and barriers to BIM implementation and future trends.

  16. Building information modelling (BIM): now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2012-12-01

    Full Text Available Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with focus on its core concepts, applications in the project life cycle and benefits for project stakeholders with the help of case studies. The paper also elaborates risks and barriers to BIM implementation and future trends.

  17. New Product Development (Npd) Process In Subsidiary: Information Perspectives

    OpenAIRE

    Firmanzah

    2008-01-01

    Information is an important resource for the new product development (NPD) process in subsidiaries. However, there is still little research analyzing the NPD process from an information perspective in the subsidiary context. This exploratory research examined 8 cases of NPD processes in consumer goods subsidiaries operating in the Indonesian market. Three types of information were identified and analyzed in the NPD process: global, regional and local information. The result of this research ...

  18. Information System Model as a Mobbing Prevention: A Case Study

    Directory of Open Access Journals (Sweden)

    Ersin Karaman

    2014-06-01

    Full Text Available In this study, the aim is to detect mobbing issues in the Economics and Administrative Sciences Faculty of Atatürk University and to provide an information system model to prevent mobbing and reduce its risk. The study consists of two parts: (i) detecting the mobbing situation via a questionnaire, and (ii) designing an information system based on the findings of the first part. The questionnaire was applied to research assistants in the faculty. Five factors were analyzed, and it is concluded that the research assistants have not been exposed to mobbing, except that they perceive mobbing in the task assignment process. Results show that task operational difficulty, task time and task period are the common mobbing issues. In order to develop an information system to cope with these issues, the process of assigning exam proctors is addressed. Exam time, instructor location, classroom location and exam duration are considered as decision variables in the developed linear programming (LP) model. Coefficients of these variables and constraints of the LP model are specified in accordance with the findings. It is recommended that the assignment process for research assistants be conducted using this method to prevent and reduce the risk of mobbing perception in the organization.
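
The proctor-assignment idea above can be illustrated as a small optimization problem. The sketch below brute-forces a minimum-burden one-to-one assignment in place of the paper's actual LP formulation; the cost matrix and its values are invented for illustration, not the questionnaire-derived coefficients:

```python
from itertools import permutations

# Hypothetical burden cost[i][j] of assigning research assistant i to exam
# slot j (imagine it combining exam duration and travel time -- illustrative
# numbers, not coefficients from the study).
cost = [
    [4, 2, 8],
    [4, 3, 7],
    [3, 1, 6],
]

def min_burden_assignment(cost):
    """Brute-force the one-to-one assignment minimizing total burden; for
    this tiny instance it stands in for solving the LP/assignment model."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best), sum(cost[i][best[i]] for i in range(n))

assignment, total = min_burden_assignment(cost)
print(assignment, total)
```

For realistic problem sizes an LP or Hungarian-algorithm solver would replace the brute-force search, but the objective, minimize total perceived burden subject to one task per person, is the same.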

  19. A comparative study of the proposed models for the components of the national health information system.

    Science.gov (United States)

    Ahmadi, Maryam; Damanabi, Shahla; Sadoughi, Farahnaz

    2014-04-01

    National Health Information System plays an important role in ensuring timely and reliable access to health information, which is essential for strategic and operational decisions that improve health, quality and effectiveness of health care. In other words, using the national health information system, one can improve the quality of health data, information and knowledge used to support decision making at all levels and areas of the health sector. Since full identification of the components of this system seems necessary for better planning and management of the factors influencing its performance, this study comparatively explores different perspectives on its components. This is a descriptive and comparative study. The study material comprised printed and electronic documents describing components of the national health information system in three parts: input, process and output. In this context, searches were conducted using library resources and the internet, and the analysis was expressed using comparative tables and qualitative data. The findings showed three different perspectives on the components of a national health information system: the Lippeveld, Sauerborn and Bodart model of 2000, the Health Metrics Network (HMN) model from the World Health Organization in 2008, and Gattini's 2009 model. In the input section (resources and structure), all three models require components of management and leadership, planning and design of programs, supply of staff, software and hardware facilities and equipment. In the "process" section, all three models point up the actions ensuring the quality of the health information system, and in the output section, except for the Lippeveld model, the two other models consider information products and the use and distribution of information as components of the national health information system.
The results showed that all three models have had only a brief discussion about the

  20. Information processing of sexual abuse in elders.

    Science.gov (United States)

    Burgess, Ann W; Clements, Paul T

    2006-01-01

    Sexual abuse is considered to be a pandemic contemporary public health issue, with significant physical and psychosocial consequences for its victims. However, the incidence of elder sexual assault is difficult to estimate with any degree of confidence. A convenience sample of 284 case records was reviewed for Post-Traumatic Stress Disorder (PTSD) symptoms. The purpose of this paper is to present the limited data noted on record review for four PTSD symptoms: startle, physiological upset, anger, and numbness. A treatment model for information processing of intrapsychic trauma is presented to describe domain disruption within a nursing diagnosis of rape trauma syndrome and to provide guidance for sensitive assessment and intervention.

  1. Spatial frequency information modulates response inhibition and decision-making processes.

    Directory of Open Access Journals (Sweden)

    Sara Jahfari

    Full Text Available We interact with the world through the assessment of available, but sometimes imperfect, sensory information. However, little is known about how variance in the quality of sensory information affects the regulation of controlled actions. In a series of three experiments, comprising a total of seven behavioral studies, we examined how different types of spatial frequency information affect underlying processes of response inhibition and selection. Participants underwent a stop-signal task, a two-choice speed/accuracy balance experiment, and a variant of both these tasks where prior information was given about the nature of stimuli. In all experiments, stimuli were either intact or contained only high or low spatial frequencies. Overall, drift diffusion model analysis showed a decreased rate of information processing when spatial frequencies were removed, whereas the criterion for information accumulation was lowered. When spatial frequency information was intact, the cost of response inhibition increased (longer SSRT), while a correct response was produced faster (shorter reaction times) and with more certainty (decreased errors). When we manipulated the motivation to respond with a deadline (i.e., be fast or accurate), removal of spatial frequency information slowed response times only when instructions emphasized accuracy. However, the slowing of response times did not improve error rates when compared to fast-instruction trials. These behavioral studies suggest that the removal of spatial frequency information differentially affects the speed of response initiation, inhibition, and the efficiency of balancing fast or accurate responses. More generally, the present results indicate a task-independent influence of basic sensory information on strategic adjustments in action control.
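
The drift diffusion account above predicts that degrading sensory input (a lower rate of information accrual) slows decisions even when the threshold is unchanged. A minimal simulation sketch of that mechanism; the parameter values are illustrative, not fitted to the reported data:

```python
import random

def ddm_trial(drift, threshold, rng, noise=1.0, dt=0.001):
    """One drift-diffusion trial: evidence x accumulates with the given
    drift plus Gaussian noise until it crosses +threshold or -threshold.
    Returns (choice, reaction time in seconds)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * rng.gauss(0.0, dt ** 0.5)
        t += dt
    return (1 if x > 0 else 0), t

def mean_rt(drift, threshold, n=200, seed=1):
    rng = random.Random(seed)
    return sum(ddm_trial(drift, threshold, rng)[1] for _ in range(n)) / n

# Removing spatial frequency content is modeled as a lower drift rate:
# the same threshold is reached more slowly on average.
rt_intact = mean_rt(drift=2.0, threshold=1.0)
rt_filtered = mean_rt(drift=0.5, threshold=1.0)
print(rt_intact, rt_filtered)
```

Lowering the threshold instead of the drift would trade speed against accuracy, which is how the model separates the two effects the abstract describes.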

  2. Information Processing in Nursing Information Systems: An Evaluation Study from a Developing Country.

    Science.gov (United States)

    Samadbeik, Mahnaz; Shahrokhi, Nafiseh; Saremian, Marzieh; Garavand, Ali; Birjandi, Mahdi

    2017-01-01

    In recent years, information technology has been introduced in the nursing departments of many hospitals to support their daily tasks. Nurses are the largest end user group in Hospital Information Systems (HISs). This study was designed to evaluate data processing in the Nursing Information Systems (NISs) utilized in many university hospitals in Iran. This was a cross-sectional study. The population comprised all nurse managers and NIS users of the five training hospitals in Khorramabad city (N = 71). The nursing subset of the HIS-Monitor questionnaire was used to collect the data. Data were analyzed by the descriptive-analytical method and inductive content analysis. The results indicated that the nurses participating in the study did not take desirable advantage of paper (2.02) and computerized (2.34) information processing tools to perform nursing tasks. Moreover, the less work experience nurses have, the more they utilize computer tools for processing patient discharge information. The "readability of patient information" and "repetitive and time-consuming documentation" were stated by the participating nurses as the most important expectations and problems regarding the HIS, respectively. The nurses participating in the present study utilized paper and computerized information processing tools together to perform nursing practices. Therefore, it is recommended that the nursing process redesign coincide with NIS implementation in health care centers.

  3. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Hukkerikar, Amol

    2011-01-01

    of a computer aided multilevel modeling network consisting of a collection of new and adopted models, methods and tools for the systematic design and analysis of processes employing lipid technology. This is achieved by decomposing the problem into four levels of modeling: 1. pure component properties; 2. mixtures... and phase behavior; 3. unit operations; and 4. process synthesis and design. The methods and tools in each level include: for the first level, a lipid database of collected experimental data from the open literature, confidential data from industry and generated data from validated predictive property... of these unit operations with respect to performance parameters such as minimum total cost, product yield improvement, operability etc., and process intensification for the retrofit of existing biofuel plants. In the fourth level the information and models developed are used as building blocks...

  4. Quantum teleportation for continuous variables and related quantum information processing

    International Nuclear Information System (INIS)

    Furusawa, Akira; Takei, Nobuyuki

    2007-01-01

    Quantum teleportation is one of the most important subjects in quantum information science. This is because quantum teleportation can be regarded as not only quantum information transfer but also a building block for universal quantum information processing. Furthermore, deterministic quantum information processing is very important for efficient processing and it can be realized with continuous-variable quantum information processing. In this review, quantum teleportation for continuous variables and related quantum information processing are reviewed from these points of view

  5. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  6. Modelling process integration and its management – case of a public housing delivery organization in United Arab Emirates

    Directory of Open Access Journals (Sweden)

    Venkatachalam Senthilkumar

    2017-01-01

    Full Text Available Huge volumes of project information are generated during the life cycle of AEC projects. This project information is categorized into technical and administrative information and managed through appropriate processes. There are many tools, such as Document Management Systems and Building Information Modeling (BIM), available to manage and integrate the technical information. However, the administrative information and its related processes, such as payment, status, authorization and approval, are not effectively managed. The current study aims to explore the administrative information management process of a local public housing delivery agency. This agency manages more than 2000 housing projects at any time of a year. The administrative processes are characterized by delivery inconsistencies among the various project participants. Though there are many commercially available process management systems, there exist limitations on the customization of the modules/systems. Hence there is a need to develop an information management system which can integrate and manage these housing project processes effectively. This requires the modeling of administrative processes and their interfaces among the various stakeholder processes. Hence this study aims to model the administrative processes and their related information during the life cycle of the project using IDEF0 and IDEF1X modeling. The captured processes and information interfaces are analyzed, and appropriate process integration is suggested to avoid delay in the project delivery processes. Further, the resultant model can be used for effectively managing housing delivery projects.

  7. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    Science.gov (United States)

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  8. Bayesian inference with information content model check for Langevin equations

    DEFF Research Database (Denmark)

    Krog, Jens F. C.; Lomholt, Michael Andersen

    2017-01-01

    The Bayesian data analysis framework has been proven to be a systematic and effective method of parameter inference and model selection for stochastic processes. In this work we introduce an information content model check which may serve as a goodness-of-fit, like the chi-square procedure...

  9. Communicating climate information: travelling through the decision-making process

    International Nuclear Information System (INIS)

    Stoverinck, F.; Dubois, G.; Amelung, B.

    2013-01-01

    Climate change forces society to adapt. Adaptation strategies are preferably based on the best available climate information. Climate projections, however, often inform adaptation strategies after being interpreted once or several times. This process affects the original message put forward by climate scientists when presenting the basic climate projections, in particular regarding uncertainties. The nature of this effect and its implications for decision-making are as yet poorly understood. This paper explores the nature and consequences of a) the communication tools used by scientists and experts, and b) changes in the communicated information as it travels through the decision-making process. It does so by analysing the interpretative steps taken in a sample of 25 documents, pertaining to the field of public policies for climate change impact assessment and adaptation strategies. Five phases in the provisioning of climate information are distinguished: pre-existing knowledge (i.e. climate models and data), climate-change projection, impact assessment, adaptation strategy, and adaptation plan. Between the phases, climate information is summarized and synthesised in order to be passed on. The results show that in the sample information on uncertainty is under-represented: e.g. studies focus on only one scenario, and/or disregard probability distributions. In addition, visualization tools are often used ineffectively, leading to confusion and unintended interpretations. Several recommendations are presented. A better training of climatologists in communication issues, but also a training in climatology for decision makers, are required, as well as more cautious and robust adaptation strategies, accounting for the uncertainty inherent to climate projections. (authors)

  10. A multiprofessional information model for Brazilian primary care: Defining a consensus model towards an interoperable electronic health record.

    Science.gov (United States)

    Braga, Renata Dutra

    2016-06-01

    To develop a multiprofessional information model to be used in the decision-making process in primary care in Brazil. This was an observational study with a descriptive and exploratory approach, using action research associated with the Delphi method. A group of 13 health professionals made up a panel of experts that, through individual and group meetings, drew up a preliminary health information records model. The questionnaire used to validate this model included four questions based on a Likert scale. These questions evaluated the completeness and relevance of information on each of the four pillars that composed the model. The changes suggested in each round of evaluation were included when accepted by the majority (≥ 50%). This process was repeated as many times as necessary to obtain the desirable and recommended consensus level (> 50%), and the final version became the consensus model. Multidisciplinary health training of the panel of experts allowed a consensus model to be obtained based on four categories of health information, called pillars: Data Collection, Diagnosis, Care Plan and Evaluation. The obtained consensus model was considered valid by the experts and can contribute to the collection and recording of multidisciplinary information in primary care, as well as the identification of relevant concepts for defining electronic health records at this level of complexity in health care. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Bayesian inference for Markov jump processes with informative observations.

    Science.gov (United States)

    Golightly, Andrew; Wilkinson, Darren J

    2015-04-01

    In this paper we consider the problem of parameter inference for Markov jump process (MJP) representations of stochastic kinetic models. Since transition probabilities are intractable for most processes of interest yet forward simulation is straightforward, Bayesian inference typically proceeds through computationally intensive methods such as (particle) MCMC. Such methods ostensibly require the ability to simulate trajectories from the conditioned jump process. When observations are highly informative, use of the forward simulator is likely to be inefficient and may even preclude an exact (simulation based) analysis. We therefore propose three methods for improving the efficiency of simulating conditioned jump processes. A conditioned hazard is derived based on an approximation to the jump process, and used to generate end-point conditioned trajectories for use inside an importance sampling algorithm. We also adapt a recently proposed sequential Monte Carlo scheme to our problem. Essentially, trajectories are reweighted at a set of intermediate time points, with more weight assigned to trajectories that are consistent with the next observation. We consider two implementations of this approach, based on two continuous approximations of the MJP. We compare these constructs for a simple tractable jump process before using them to perform inference for a Lotka-Volterra system. The best performing construct is used to infer the parameters governing a simple model of motility regulation in Bacillus subtilis.
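
The forward simulator that these inference schemes build on can be sketched with Gillespie's direct method for the Lotka-Volterra jump process mentioned above. The rate constants and initial counts below are illustrative choices, not values from the paper:

```python
import random

def gillespie_lv(x0, y0, c1, c2, c3, t_max, rng):
    """Gillespie's direct method for the Lotka-Volterra Markov jump process:
    prey birth at rate c1*x, predation at c2*x*y, predator death at c3*y."""
    x, y, t = x0, y0, 0.0
    path = [(t, x, y)]
    while t < t_max:
        hazards = [c1 * x, c2 * x * y, c3 * y]
        h0 = sum(hazards)
        if h0 == 0.0:                  # both species extinct: absorbed state
            break
        t += rng.expovariate(h0)       # time to the next reaction
        u = rng.random() * h0          # pick which reaction fired
        if u < hazards[0]:
            x += 1                     # prey birth
        elif u < hazards[0] + hazards[1]:
            x -= 1; y += 1             # predation
        else:
            y -= 1                     # predator death
        path.append((t, x, y))
    return path

rng = random.Random(42)
path = gillespie_lv(71, 79, c1=0.5, c2=0.0025, c3=0.3, t_max=10.0, rng=rng)
print(len(path), path[-1])
```

Particle MCMC and the importance-sampling constructs discussed in the abstract wrap a simulator like this one, reweighting or conditioning its trajectories so that they agree with the observations.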

  12. Cognitive Structures in Vocational Information Processing and Decision Making.

    Science.gov (United States)

    Nevill, Dorothy D.; And Others

    1986-01-01

    Tested the assumptions that the structural features of vocational schemas affect vocational information processing and career self-efficacy. Results indicated that effective vocational information processing was facilitated by well-integrated systems that processed information along fewer dimensions. The importance of schematic organization on the…

  13. A discrimination-association model for decomposing component processes of the implicit association test.

    Science.gov (United States)

    Stefanutti, Luca; Robusto, Egidio; Vianello, Michelangelo; Anselmi, Pasquale

    2013-06-01

    A formal model is proposed that decomposes the implicit association test (IAT) effect into three process components: stimuli discrimination, automatic association, and termination criterion. Both response accuracy and reaction time are considered. Four independent and parallel Poisson processes, one for each of the four label categories of the IAT, are assumed. The model parameters are the rate at which information accrues on the counter of each process and the amount of information that is needed before a response is given. The aim of this study is to present the model and an illustrative application in which the process components of a Coca-Pepsi IAT are decomposed.
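
The race described above can be sketched directly: each category counter accrues evidence as a Poisson process, so its finishing time is a sum of `criterion` exponential waits, and the fastest counter determines both response and latency. The sketch below reduces the four categories to two competing counters for clarity, and the rates are invented for illustration, not fitted IAT parameters:

```python
import random

def race_trial(rates, criterion, rng):
    """One trial of a parallel Poisson race: each counter accrues evidence
    at its own rate; the first to collect `criterion` counts wins.  A
    counter's finishing time is a sum of `criterion` exponential waits."""
    times = [sum(rng.expovariate(r) for _ in range(criterion)) for r in rates]
    winner = min(range(len(rates)), key=times.__getitem__)
    return winner, times[winner]

def accuracy_and_rt(correct_rate, distractor_rate, criterion, n, seed=0):
    rng = random.Random(seed)
    correct, total_rt = 0, 0.0
    for _ in range(n):
        winner, rt = race_trial([correct_rate, distractor_rate], criterion, rng)
        correct += (winner == 0)
        total_rt += rt
    return correct / n, total_rt / n

# A larger accrual-rate advantage for the correct category (as on a
# compatible block) yields faster and more accurate responses.
res_easy = accuracy_and_rt(8.0, 2.0, criterion=5, n=2000)
res_hard = accuracy_and_rt(4.0, 2.0, criterion=5, n=2000)
print(res_easy, res_hard)
```

This is how the model ties both reaction time and accuracy to the same two parameters: the accrual rate and the termination criterion.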

  14. Can Intrinsic Fluctuations Increase Efficiency in Neural Information Processing?

    Science.gov (United States)

    Liljenström, Hans

    2003-05-01

    All natural processes are accompanied by fluctuations, characterized as noise or chaos. Biological systems, which have evolved during billions of years, are likely to have adapted, not only to cope with such fluctuations, but also to make use of them. We investigate how the complex dynamics of the brain, including oscillations, chaos and noise, can affect the efficiency of neural information processing. In particular, we consider the amplification and functional role of internal fluctuations. Using computer simulations of a neural network model of the olfactory cortex and hippocampus, we demonstrate how microscopic fluctuations can result in global effects at the network level. We show that the rate of information processing in associative memory tasks can be maximized for optimal noise levels, analogous to stochastic resonance phenomena. Noise can also induce transitions between different dynamical states, which could be of significance for learning and memory. A chaotic-like behavior, induced by noise or by an increase in neuronal excitability, can enhance system performance if it is transient and converges to a limit cycle memory state. We speculate whether this dynamical behavior perhaps could be related to (creative) thinking.
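
The stochastic-resonance effect invoked above can be reproduced with a toy threshold unit: a subthreshold sinusoid plus Gaussian noise, where signal tracking peaks at an intermediate noise level. This is a simplification for illustration, not the olfactory cortex or hippocampus model from the study:

```python
import math, random

def detection_score(noise_sd, n=20000, seed=0):
    """Threshold unit driven by a subthreshold sinusoid plus Gaussian noise.
    Score: firing probability during positive half-cycles minus firing
    probability elsewhere -- how well firing tracks the hidden signal."""
    rng = random.Random(seed)
    threshold = 1.0
    on_fires = off_fires = on_n = off_n = 0
    for i in range(n):
        s = 0.9 * math.sin(2 * math.pi * i / 100)   # never crosses threshold alone
        fired = s + rng.gauss(0.0, noise_sd) > threshold
        if s > 0:
            on_n += 1; on_fires += fired
        else:
            off_n += 1; off_fires += fired
    return on_fires / on_n - off_fires / off_n

scores = {sd: detection_score(sd) for sd in (0.05, 0.4, 5.0)}
# Tracking is poor with too little noise (the unit never fires) and with
# too much (firing is indifferent to the signal); it peaks in between.
print(scores)
```

The non-monotonic dependence of the score on noise level is the signature of stochastic resonance that the abstract appeals to.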

  15. Mathematics of Information Processing and the Internet

    Science.gov (United States)

    Hart, Eric W.

    2010-01-01

    The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…
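
The efficiency theme (data compression) can be made concrete with a textbook construction not taken from the article: Huffman coding, which assigns shorter bit strings to more frequent symbols.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code: repeatedly merge the two least frequent
    subtrees, prefixing their codewords with 0 and 1.  The tie-breaking
    index keeps heap entries comparable."""
    heap = [(n, i, {ch: ""}) for i, (ch, n) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    nxt = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (n1 + n2, nxt, merged))
        nxt += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(text)
bits = sum(len(code[ch]) for ch in text)
print(code, bits)   # 23 bits vs 33 for a fixed 3-bit-per-symbol encoding
```

The total of 23 bits is optimal for these symbol frequencies regardless of how ties are broken, which is the sense in which Huffman coding achieves efficiency.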

  16. Developing the support business-processes information system for self-regulation institute

    Directory of Open Access Journals (Sweden)

    Kravchenko Anatoly Vasilevich

    2011-11-01

    Full Text Available Methods of structurization, optimization and automation of the business processes of a modern self-regulation institute are considered in the article. The aim is to present a strategy for modeling the self-regulation information system and a strategy for accounting of membership dues and agency compensation.

  17. Basic disturbances of information processing in psychosis prediction.

    Science.gov (United States)

    Bodatsch, Mitja; Klosterkötter, Joachim; Müller, Ralf; Ruhrmann, Stephan

    2013-01-01

    The basic symptoms (BS) approach provides a valid instrument for predicting psychosis onset and moreover represents a significant heuristic framework for research. The term "basic symptoms" denotes subtle changes of cognition and perception in the earliest and prodromal stages of psychosis development. BS are thought to correspond to disturbances of neural information processing. Following the heuristic implications of the BS approach, the present paper aims at exploring disturbances of information processing, revealed by functional magnetic resonance imaging (fMRI) and electroencephalography (EEG), as characteristics of the at-risk state of psychosis. Furthermore, since high-risk studies employing ultra-high-risk criteria revealed non-conversion rates commonly exceeding 50%, thus warranting approaches that increase specificity, the potential contribution of neural information processing disturbances to psychosis prediction is reviewed. In summary, the at-risk state seems to be associated with information processing disturbances. Moreover, fMRI investigations suggested that disturbances of language processing domains might be a characteristic of the prodromal state. Neurophysiological studies revealed that disturbances of sensory processing may assist psychosis prediction by allowing for a quantification of risk in terms of magnitude and time. The latter finding represents a significant advancement, since an estimation of the time to event has not yet been achieved by clinical approaches. Some evidence suggests a close relationship between self-experienced BS and neural information processing. With regard to future research, the relationship between neural information processing disturbances and different clinical risk concepts warrants further investigation. Thereby, a possible time sequence in the prodromal phase might be of particular interest.

  18. Multidimensional Models of Information Need

    OpenAIRE

    Yun-jie (Calvin) Xu; Kai Huang (Joseph) Tan

    2009-01-01

    User studies in information science have recognised relevance as a multidimensional construct. An implication of multidimensional relevance is that a user's information need should be modeled by multiple data structures to represent different relevance dimensions. While the extant literature has attempted to model multiple dimensions of a user's information need, the fundamental assumption that a multidimensional model is better than a uni-dimensional model has not been addressed. This study ...

  19. ORGANIZATION OF INFORMATION INTERACTION OF AIRPORT PRODUCTION PROCESSES

    Directory of Open Access Journals (Sweden)

    Yakov Mikhajlovich Dalinger

    2017-01-01

    Full Text Available The organization of service production in airport operations is analyzed. The importance and timeliness of solving the problem of information interaction between production processes, as a problem of organizing modern production, are shown. The possibilities and features of constructing an information interaction system as a multi-level hierarchical structure are demonstrated. The airport is considered as a service-producing enterprise in which a large amount of information must be analyzed within a limited time-frame. The production schedule changes frequently under the influence of many factors, which increases the role of computerization and informatization of production processes and calls for automation of production, creation of an information environment, and organization of the information interaction needed to carry out production processes. An integrated organizational form is proposed because it is oriented toward uniting different processes into a universal production system and allows the local goals of particular processes to be coordinated with the global goal of improving the effectiveness of airport activity. The main conditions needed for organizing information interaction between production processes and technological operations are considered, and a list of the corresponding problems is determined. Attention is paid to the need for compatibility between the structure and organization of the interaction system and the conditions of the airline, and for its reflection in the airline's information space. The usefulness of an integrated form of information interaction based on information exchange between processes and service customers according to a network structure is explained. The multi-level character of this structure confirms its advantage over other arrangements; however, it also has a series of features presented

  20. Quantum information processing in nanostructures

    International Nuclear Information System (INIS)

    Reina Estupinan, John-Henry

    2002-01-01

    Since information has been regarded as a physical entity, the field of quantum information theory has blossomed. This brings novel applications, such as quantum computation. This field has attracted the attention of numerous researchers with backgrounds ranging from computer science, mathematics and engineering, to the physical sciences. Thus, we now have an interdisciplinary field where great efforts are being made in order to build devices that should allow for the processing of information at a quantum level, and also in the understanding of the complex structure of some physical processes at a more basic level. This thesis is devoted to the theoretical study of structures at the nanometer-scale, 'nanostructures', through physical processes that mainly involve the solid-state and quantum optics, in order to propose reliable schemes for the processing of quantum information. Initially, the main results of quantum information theory and quantum computation are briefly reviewed. Next, the state-of-the-art of quantum dots technology is described. In so doing, the theoretical background and the practicalities required for this thesis are introduced. A discussion of the current quantum hardware used for quantum information processing is given. In particular, the solid-state proposals to date are emphasised. A detailed prescription is given, using an optically-driven coupled quantum dot system, to reliably prepare and manipulate exciton maximally entangled Bell and Greenberger-Horne-Zeilinger (GHZ) states. Manipulation of the strength and duration of selective light-pulses needed for producing these highly entangled states provides us with crucial elements for the processing of solid-state based quantum information. The all-optical generation of states of the so-called Bell basis for a system of two quantum dots (QDs) is exploited for performing the quantum teleportation of the excitonic state of a dot in an array of three coupled QDs. 
Theoretical predictions suggest

  1. ICT-Enabled Time-Critical Clinical Practices: Examining the Affordances of an Information Processing Solution

    Directory of Open Access Journals (Sweden)

    Leonard Hoon

    2015-11-01

    Full Text Available In this paper, we present a case study of a decision-support system deployment at The Alfred Hospital in Melbourne, Australia. This work outlines Information and Communications Technology (ICT) affordances and their actualisations in time-critical clinical practices to enable better information processing. From our study findings, we present a stage-wise model describing the role played by ICT in the context of the Trauma Centre practices. This addresses a knowledge gap surrounding the role and impact of ICT in the delivery of quality improvements to processes and culture in time-critical environments, amid increasing expenditure on ICT globally. Our model has implications for research and practice, such that we observe for the first time how information standards, synergy and renewal are developed between the system and its users in order to reduce error rates in the healthcare context. Through the study findings, we demonstrate that healthcare quality can be further refined as ICT allows for knowledge dissemination and informs existing practices.

  2. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  3. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to complexity of the healthcare environment. Currently, during the process of healthcare information system integration, people participating in integration project usually communicate by free-format document, which impairs the efficiency and adaptability of integration. A method utilizing business process model and notation (BPMN) to model integration requirement and automatically transforming it to executable integration configuration was proposed in this paper. Based on the method, a tool was developed to model integration requirement and transform it to integration configuration. In addition, an integration case in radiology scenario was used to verify the method.

  4. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  5. Advanced social features in a recommendation system for process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.; Abramowicz, W.

    2009-01-01

    Social software is known to stimulate the exchange and sharing of information among peers. This paper describes how an existing system that supports process builders in completing a business process can be enhanced with various social features. In that way, it is easier for process modelers to become

  6. Fake News, Conspiracy Theories, and Lies: An Information Laundering Model for Homeland Security

    Science.gov (United States)

    2018-03-01

    being distributed. B. INFORMATION LAUNDERING 2.0 MODEL Like Information Laundering 1.0, Information Laundering 2.0 is built on a metaphor of money… However, unlike the previous model, the new model takes the metaphor a step further, incorporating all three phases of money laundering: placement… we say “laundering.” Although there are many ways to describe money laundering, the simplest way is: a process by which one can turn “‘dirty’ money

  7. Spatio-temporal models of mental processes from fMRI.

    Science.gov (United States)

    Janoos, Firdaus; Machiraju, Raghu; Singh, Shantanu; Morocz, Istvan Ákos

    2011-07-15

    Understanding the highly complex, spatially distributed and temporally organized phenomena entailed by mental processes using functional MRI is an important research problem in cognitive and clinical neuroscience. Conventional analysis methods focus on the spatial dimension of the data discarding the information about brain function contained in the temporal dimension. This paper presents a fully spatio-temporal multivariate analysis method using a state-space model (SSM) for brain function that yields not only spatial maps of activity but also its temporal structure along with spatially varying estimates of the hemodynamic response. Efficient algorithms for estimating the parameters along with quantitative validations are given. A novel low-dimensional feature-space for representing the data, based on a formal definition of functional similarity, is derived. Quantitative validation of the model and the estimation algorithms is provided with a simulation study. Using a real fMRI study for mental arithmetic, the ability of this neurophysiologically inspired model to represent the spatio-temporal information corresponding to mental processes is demonstrated. Moreover, by comparing the models across multiple subjects, natural patterns in mental processes organized according to different mental abilities are revealed. Copyright © 2011 Elsevier Inc. All rights reserved.
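
    The state-space idea, latent dynamics for mental processes observed through a measurement equation, can be illustrated with the simplest possible case (a scalar linear-Gaussian filter; this is a sketch of the model class, not the paper's multivariate SSM or its estimation algorithms, and all parameters are assumed):

```python
def kalman_filter(ys, a=0.9, c=1.0, q=0.1, r=0.5, x0=0.0, p0=1.0):
    """Scalar linear-Gaussian state-space filter:
         state:       x_t = a * x_{t-1} + N(0, q)
         observation: y_t = c * x_t     + N(0, r)
    Returns the filtered posterior means of the latent state."""
    x, p, means = x0, p0, []
    for y in ys:
        # predict step: propagate the latent state one time point forward
        x, p = a * x, a * a * p + q
        # update step: correct the prediction with the new observation
        k = p * c / (c * c * p + r)
        x = x + k * (y - c * x)
        p = (1.0 - k * c) * p
        means.append(x)
    return means

# A constant observation stream pulls the state estimate toward its level.
estimates = kalman_filter([1.0] * 50)
```

    The paper's model generalizes this pattern to spatially distributed states and a spatially varying hemodynamic response, but the predict/update structure of the inference is the same.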

  8. Gaussian random bridges and a geometric model for information equilibrium

    Science.gov (United States)

    Mengütürk, Levent Ali

    2018-03-01

    The paper introduces a class of conditioned stochastic processes that we call Gaussian random bridges (GRBs) and proves some of their properties. Due to the anticipative representation of any GRB as the sum of a random variable and a Gaussian (T,0)-bridge, GRBs can model noisy information processes in partially observed systems. In this spirit, we propose an asset pricing model with respect to what we call information equilibrium in a market with multiple sources of information. The idea is to work on a topological manifold endowed with a metric that enables us to systematically determine an equilibrium point of a stochastic system that can be represented by multiple points on that manifold at each fixed time. In doing so, we formulate GRB-based information diversity over a Riemannian manifold and show that it is pinned to zero over the boundary determined by Dirac measures. We then define an influence factor that controls the dominance of an information source in determining the best estimate of a signal in the L2-sense. When there are two sources, this allows us to construct information equilibrium as a functional of a geodesic-valued stochastic process, which is driven by an equilibrium convergence rate representing the signal-to-noise ratio. This leads us to derive price dynamics under what can be considered as an equilibrium probability measure. We also provide a semimartingale representation of Markovian GRBs associated with Gaussian martingales and a non-anticipative representation of fractional Brownian random bridges that can incorporate degrees of information coupling in a given system via the Hurst exponent.
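
    The anticipative representation mentioned above, a random variable plus a Gaussian bridge that vanishes at the terminal time, can be sketched numerically (an illustrative discretization; the signal form sigma*t*X and all names and parameters are assumptions, not the paper's construction):

```python
import random

def brownian_bridge(T=1.0, n=500, seed=42):
    """Discrete (T, 0)-bridge: a Wiener path conditioned to end at zero."""
    rng = random.Random(seed)
    dt = T / n
    w = [0.0]
    for _ in range(n):
        w.append(w[-1] + rng.gauss(0.0, dt ** 0.5))
    # Subtract the straight line through the endpoint to pin the path at 0.
    return [w_i - (i / n) * w[-1] for i, w_i in enumerate(w)]

def information_process(x_signal, sigma=1.0, T=1.0, n=500, seed=42):
    """Noisy observation of a signal: xi_t = sigma * t * x + bridge_t.
    The bridge noise vanishes at t = T, so the signal is fully revealed."""
    bridge = brownian_bridge(T, n, seed)
    dt = T / n
    return [sigma * (i * dt) * x_signal + b for i, b in enumerate(bridge)]

xi = information_process(x_signal=2.0)
# xi[0] is exactly 0 (no information at t = 0);
# xi[-1] is approximately sigma * T * x_signal (signal revealed at t = T).
```

    Conditioning the estimate of `x_signal` on the path of `xi` up to time t is what makes such processes useful models of gradually revealed market information.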

  9. The Relevance of the Social Information Processing Model for Understanding Relational Aggression in Girls

    Science.gov (United States)

    Crain, Marcelle M.; Finch, Cambra L.; Foster, Sharon L.

    2005-01-01

    Two studies examined whether social information-processing variables predict relational aggression in girls. In Study 1, fourth- through sixth-grade girls reported their intent attributions, social goals, outcome expectancies for relational aggression, and the likelihood that they would choose a relationally aggressive response in response to…

  10. Process of making decisions on loan currency: Influence of representativeness on information processing and coherence with consumption motives

    Directory of Open Access Journals (Sweden)

    Anđelković Dragan

    2016-01-01

    Full Text Available The rationality of a decision maker is often reduced by heuristics and biases, as well as by various kinds of external stimuli. In decision making, individuals simplify the phases of information selection and information processing by using heuristics: simple rules that focus on one aspect of a complex problem, ignore the others, and thereby 'speed up' the process. Although efficient for simple decisions, this way of deciding can lead to errors in probability assessment, diminish the decision maker's rationality, and thus drastically affect the outcome of the transaction being decided on. The subject of this study is the influence of the representativeness heuristic on individuals' financial decisions, and the influence of consumption motives on stereotypical elements in the information-processing phase. The study was conducted by first determining respondents' attitudes toward currencies and then running experiments to analyze how decisions on loan currency are made. The aim was to determine whether, and to what extent, representativeness influences the choice of currency in loan decisions. The results of the behavioral experiments show that, contrary to the rational model, respondents do not assess probability by processing the available information in accordance with their preferences, but by comparing decision objects with other objects that share the same attributes, exhibiting a moderate positive correlation between stereotypical attitudes and the choice of loan currency. The experiments also showed that the instrumental motive significantly influences the representativeness heuristic: individuals process information with a diminished influence of stereotypical attitudes, triggered by external stimuli, in situations without so-called 'hedonistic decision making'. Respondents made more efficient decisions if they had a motive that does

  11. Interaction between Task Oriented and Affective Information Processing in Cognitive Robotics

    Science.gov (United States)

    Haazebroek, Pascal; van Dantzig, Saskia; Hommel, Bernhard

    There is an increasing interest in endowing robots with emotions. Robot control however is still often very task oriented. We present a cognitive architecture that allows the combination of and interaction between task representations and affective information processing. Our model is validated by comparing simulation results with empirical data from experimental psychology.

  12. The RiverFish Approach to Business Process Modeling: Linking Business Steps to Control-Flow Patterns

    Science.gov (United States)

    Zuliane, Devanir; Oikawa, Marcio K.; Malkowski, Simon; Alcazar, José Perez; Ferreira, João Eduardo

    Despite the recent advances in the area of Business Process Management (BPM), today’s business processes have largely been implemented without clearly defined conceptual modeling. This results in growing difficulties for identification, maintenance, and reuse of rules, processes, and control-flow patterns. To mitigate these problems in future implementations, we propose a new approach to business process modeling using conceptual schemas, which represent hierarchies of concepts for rules and processes shared among collaborating information systems. This methodology bridges the gap between conceptual model description and identification of actual control-flow patterns for workflow implementation. We identify modeling guidelines that are characterized by clear phase separation, step-by-step execution, and process building through diagrams and tables. The separation of business process modeling in seven mutually exclusive phases clearly delimits information technology from business expertise. The sequential execution of these phases leads to the step-by-step creation of complex control-flow graphs. The process model is refined through intuitive table and diagram generation in each phase. Not only does the rigorous application of our modeling framework minimize the impact of rule and process changes, but it also facilitates the identification and maintenance of control-flow patterns in BPM-based information system architectures.

  13. Spiral model of procedural cycle of educational process management

    Directory of Open Access Journals (Sweden)

    Bezrukov Valery I.

    2016-01-01

    Full Text Available The article analyzes the nature and characteristics of a spiral model of the procedural cycle of educational process management. The authors identify relationships between the development of information and communication technologies and the transformation of the education management process, characterize the concepts of "information literacy" and "media education", and consider the design function, determining its potential for changing the traditional educational paradigm to a new, information-based one.

  14. Information Systems to Support a Decision Process at Stanford.

    Science.gov (United States)

    Chaffee, Ellen Earle

    1982-01-01

    When a rational decision process is desired, information specialists can contribute information and also contribute to the process in which that information is used, thereby promoting rational decision-making. The contribution of Stanford's information specialists to rational decision-making is described. (MLW)

  15. Business process modelling in demand-driven agri-food supply chains : a reference framework

    NARCIS (Netherlands)

    Verdouw, C.N.

    2010-01-01

    Keywords: Business process models; Supply chain management; Information systems; Reference information models; Market orientation; Mass customisation; Configuration; Coordination; Control; SCOR; Pot plants; Fruit industry

    Abstract

    The increasing volatility and diversity of

  16. Information processing in illness representation: Implications from an associative-learning framework.

    Science.gov (United States)

    Lowe, Rob; Norman, Paul

    2017-03-01

    The common-sense model (Leventhal, Meyer, & Nerenz, 1980) outlines how illness representations are important for understanding adjustment to health threats. However, psychological processes giving rise to these representations are little understood. To address this, an associative-learning framework was used to model low-level process mechanics of illness representation and coping-related decision making. Associative learning was modeled within a connectionist network simulation. Two types of information were paired: Illness identities (indigestion, heart attack, cancer) were paired with illness-belief profiles (cause, timeline, consequences, control/cure), and specific illness beliefs were paired with coping procedures (family doctor, emergency services, self-treatment). To emulate past experience, the network was trained with these pairings. As an analogue of a current illness event, the trained network was exposed to partial information (illness identity or select representation beliefs) and its response recorded. The network (a) produced the appropriate representation profile (beliefs) for a given illness identity, (b) prioritized expected coping procedures, and (c) highlighted circumstances in which activated representation profiles could include self-generated or counterfactual beliefs. Encoding and activation of illness beliefs can occur spontaneously and automatically; conventional questionnaire measurement may be insensitive to these automatic representations. Furthermore, illness representations may comprise a coherent set of nonindependent beliefs (a schema) rather than a collective of independent beliefs. Incoming information may generate a "tipping point," dramatically changing the active schema as a new illness-knowledge set is invoked. Finally, automatic activation of well-learned information can lead to the erroneous interpretation of illness events, with implications for [inappropriate] coping efforts. (PsycINFO Database Record (c) 2017 APA, all
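
    The associative-learning mechanics described here, illness identities paired with belief profiles and beliefs paired with coping procedures, can be sketched with a minimal Hebbian hetero-associator (a toy stand-in for the paper's connectionist network; the illnesses, +/-1 belief codings and coping mappings below are invented for illustration):

```python
ILLNESSES  = ["indigestion", "heart attack", "cancer"]
PROCEDURES = ["self-treatment", "emergency services", "family doctor"]

# Belief profiles as +/-1 features:
# (serious cause, acute timeline, severe consequences, self-controllable)
PROFILES = {
    "indigestion":  [-1, +1, -1, +1],
    "heart attack": [+1, +1, +1, -1],
    "cancer":       [+1, -1, +1, -1],
}
COPING = {"indigestion": "self-treatment",
          "heart attack": "emergency services",
          "cancer": "family doctor"}

def one_hot(name, vocab):
    return [1.0 if v == name else 0.0 for v in vocab]

def train(pairs):
    """Hebbian outer-product learning: w[i][j] accumulates x[i] * y[j]."""
    nx, ny = len(pairs[0][0]), len(pairs[0][1])
    w = [[0.0] * ny for _ in range(nx)]
    for x, y in pairs:
        for i in range(nx):
            for j in range(ny):
                w[i][j] += x[i] * y[j]
    return w

def activate(w, x):
    """Propagate an input pattern through the weight matrix."""
    return [sum(x[i] * w[i][j] for i in range(len(x)))
            for j in range(len(w[0]))]

# Past experience: identity -> beliefs, and beliefs -> coping procedure.
w_beliefs = train([(one_hot(n, ILLNESSES), PROFILES[n]) for n in ILLNESSES])
w_coping  = train([(PROFILES[n], one_hot(COPING[n], PROCEDURES))
                   for n in ILLNESSES])

def appraise(identity):
    """Current illness event: a cue retrieves a belief profile as a whole,
    which in turn prioritizes a coping procedure."""
    beliefs = [1 if a >= 0 else -1
               for a in activate(w_beliefs, one_hot(identity, ILLNESSES))]
    scores = activate(w_coping, beliefs)
    return beliefs, PROCEDURES[scores.index(max(scores))]
```

    Because an identity cue retrieves the whole trained profile at once, the beliefs behave as a coherent schema rather than as independent items, which is one of the observations the abstract draws from its simulation.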

  17. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
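
    The measurement idea, train a discrete-time, discrete-space Markov chain on player actions and use information theory to score the error between model and interaction, can be sketched as follows (a minimal illustration; the action names and Laplace smoothing are assumptions):

```python
import math
import random
from collections import defaultdict

def train_chain(actions, alpha=1.0):
    """First-order Markov model of a play trace, with Laplace smoothing."""
    vocab = sorted(set(actions))
    counts = defaultdict(lambda: defaultdict(float))
    for a, b in zip(actions, actions[1:]):
        counts[a][b] += 1.0
    def prob(a, b):
        total = sum(counts[a].values()) + alpha * len(vocab)
        return (counts[a][b] + alpha) / total
    return prob

def mean_surprisal(prob, actions):
    """Average information content per transition, in bits. Low surprisal
    means the chain predicts the player well, i.e. behaviour is routinized."""
    pairs = list(zip(actions, actions[1:]))
    return -sum(math.log2(prob(a, b)) for a, b in pairs) / len(pairs)

routine = ["jump", "run", "shoot"] * 40                 # drilled action loop
rng = random.Random(0)
varied = [rng.choice(["jump", "run", "shoot"]) for _ in range(120)]

low  = mean_surprisal(train_chain(routine), routine)    # well under 1 bit/step
high = mean_surprisal(train_chain(varied), varied)      # near log2(3) bits/step
```

    A drilled loop is almost perfectly predictable, while varied play stays near the entropy ceiling; surprisal falling over the course of a session would indicate routinization setting in.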

  18. Risk perception and decision processes underlying informed consent to research participation.

    Science.gov (United States)

    Reynolds, William W; Nelson, Robert M

    2007-11-01

    According to the rational choice model, informed consent should consist of a systematic, step-by-step evaluation of all information pertinent to the treatment or research participation decision. Research shows that people frequently deviate from this normative model, however, employing decision-making shortcuts, or heuristics. In this paper we report findings from a qualitative study of 32 adolescents and (their) 31 parents who were recruited from two Northeastern US hospitals and asked to consider the risks of and make hypothetical decisions about research participation. The purpose of this study was to increase our understanding of how diabetic and at-risk adolescents (i.e., those who are obese and/or have a family history of diabetes) and their parents perceive risks and make decisions about research participation. Using data collected from adolescents and parents, we identify heuristic decision processes in which participant perceptions of risk magnitude, which are formed quickly and intuitively and appear to be based on affective responses to information, are far more prominent and central to the participation decision than are perceptions of probability. We discuss participants' use of decision-making heuristics in the context of recent research on affect and decision processes, and we consider the implications of these findings for researchers.

  19. Modelling of Processes of Logistics in Cyberspace Security

    Directory of Open Access Journals (Sweden)

    Konečný Jiří

    2017-01-01

    Full Text Available The goal of this contribution is to familiarize experts in various fields with the need for a new approach to system models and to the modelling of processes in engineering practice, and to show how certain state variables can be used in modelling real-world systems, given the highly dynamic development of the structures and behaviour of logistics systems. The contribution therefore stresses the necessity of making full use of cybernetics as a field for the management and communication of information, and of its much-needed environment, cyberspace, in which the steady state between cyber-attacks and cyber-defence must be maintained as a modern knowledge-based potential in general and in the logistics of cyber security in particular. Connected with this is the very important area of lifelong training of experts in the dynamic world of science and technology (that is, also in a social system), which is briefly discussed here together with cyber and information security; all of this falls under the cyberspace of promising new electronic learning (e-learning) using modern laboratories, with new effects also for future possibilities of process modelling of artificial intelligence (AI) and the prospect of mass use of UAVs in logistics.

  20. The Information Warfare Life Cycle Model

    Directory of Open Access Journals (Sweden)

    Brett van Niekerk

    2011-11-01

    Full Text Available Information warfare (IW) is a dynamic and developing concept, which constitutes a number of disciplines. This paper aims to develop a life cycle model for information warfare that is applicable to all of the constituent disciplines. The model aims to be scalable and applicable to civilian and military incidents where information warfare tactics are employed. Existing information warfare models are discussed, and a new model is developed from the common aspects of these existing models. The proposed model is then applied to a variety of incidents to test its applicability and scalability. The proposed model is shown to be applicable to multiple disciplines of information warfare and is scalable, thus meeting the objectives of the model.

  2. Cognition to Collaboration: User-Centric Approach and Information Behaviour Theories/Models

    Directory of Open Access Journals (Sweden)

    Alperen M Aydin

    2016-12-01

    Full Text Available Aim/Purpose: The objective of this paper is to review the vast literature of user-centric information science and report on emerging themes in information behaviour research. Background: The paradigmatic shift from system-centric to user-centric approaches has facilitated research on cognitive and individual information processing, and various information behaviour theories/models have emerged. Methodology: Recent information behaviour theories and models are presented. Features, strengths and weaknesses of the models are discussed through an analysis of the information behaviour literature. Contribution: This paper sheds light on the weaknesses of earlier information behaviour models and stresses (and advocates) the need for research on social information behaviour. Findings: Prominent information behaviour models deal with individual information behaviour. People live in a social world and sort out most of their daily or work problems in groups; however, only seven papers discuss social information behaviour (Scopus search). Recommendations for Practitioners: ICT tools used for inter-organisational sharing should be redesigned for effective information sharing during disaster/emergency times. Recommendation for Researchers: Sources on the social side of information behaviour are scarce, although most work tasks are carried out in groups/teams. Impact on Society: In dynamic work contexts such as disaster management and health-care settings, collaborative information sharing may reduce losses. Future Research: Fieldwork will be conducted in a disaster-management context investigating inter-organisational information sharing.

  3. Lévy Random Bridges and the Modelling of Financial Information

    OpenAIRE

    Edward Hoyle; Lane P. Hughston; Andrea Macrina

    2010-01-01

    The information-based asset-pricing framework of Brody, Hughston and Macrina (BHM) is extended to include a wider class of models for market information. In the BHM framework, each asset is associated with a collection of random cash flows. The price of the asset is the sum of the discounted conditional expectations of the cash flows. The conditional expectations are taken with respect to a filtration generated by a set of 'information processes'. The information processes carry imperfect...

  4. Consciousness: a unique way of processing information.

    Science.gov (United States)

    Marchetti, Giorgio

    2018-02-08

    In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three main basic cognitive processes contribute to realize this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the "single voice" of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever is focused by attention appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experiences, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention.

  5. Process control using modern systems of information processing

    International Nuclear Information System (INIS)

    Baldeweg, F.

    1984-01-01

    Modern digital automation techniques allow the application of demanding types of process control. These types of process control are characterized by their belonging to higher levels in a multilevel model. Functional and technical aspects of the performance of digital automation plants are presented and explained. A modern automation system is described, considering special procedures of process control (e.g. real-time diagnosis).

  6. Modelling of information diffusion on social networks with applications to WeChat

    Science.gov (United States)

    Liu, Liang; Qu, Bo; Chen, Bin; Hanjalic, Alan; Wang, Huijuan

    2018-04-01

    Traces of user activities recorded in online social networks open new possibilities to systematically understand the information diffusion process on social networks. From the online social network WeChat, we collected a large number of information cascade trees, each of which tells the spreading trajectory of a message/information, such as which user creates the information and which users view or forward the information shared by which neighbours. In this work, we propose two heterogeneous non-linear models, one for the topologies of the information cascade trees and the other for the stochastic process of information diffusion on a social network. Both models are validated by the WeChat data in reproducing and explaining key features of cascade trees. Specifically, we apply the Random Recursive Tree (RRT) to model the growth of cascade trees. The RRT model could capture key features, i.e. the average path length and degree variance of a cascade tree in relation to the number of nodes (size) of the tree. Its single identified parameter quantifies the relative depth or broadness of the cascade trees and indicates whether information propagates via star-like broadcasting or viral-like hop-by-hop spreading. The RRT model explains the appearance of hubs, and thus a possibly smaller average path length as the cascade size increases, as observed in WeChat. We further propose the stochastic Susceptible View Forward Removed (SVFR) model to depict the dynamic user behaviour, including creating, viewing, forwarding and ignoring a message on a given social network. Besides the average path length and degree variance of the cascade trees in relation to their sizes, the SVFR model could further explain the power-law cascade size distribution in WeChat and reveal that a user with a large number of friends may actually have a smaller probability of reading a message (s)he receives, due to limited attention.
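
    The uniform Random Recursive Tree growth rule mentioned above is simple to simulate. The sketch below is a hypothetical illustration, not the authors' parametrised model (their single parameter tunes the attachment rule to interpolate between star-like and viral spreading); it grows a uniform RRT and measures its average node depth:

```python
import random

def grow_rrt(n, seed=42):
    """Grow a uniform random recursive tree: each new node attaches
    to an existing node chosen uniformly at random."""
    rng = random.Random(seed)
    parent = {0: None}                       # node 0 is the root (message creator)
    for node in range(1, n):
        parent[node] = rng.randrange(node)   # uniform attachment
    return parent

def depth(parent, node):
    """Hop count from a node back to the root."""
    d = 0
    while parent[node] is not None:
        node = parent[node]
        d += 1
    return d

tree = grow_rrt(1000)
avg_depth = sum(depth(tree, v) for v in tree) / len(tree)
# For the uniform variant the average depth grows like ln(n),
# i.e. a shallow, broad cascade rather than a long viral chain.
```

    Skewing the attachment probability toward earlier (higher-degree) nodes would produce the star-like, hub-dominated cascades the abstract describes; skewing it toward the most recent node produces deep, viral-like chains.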

  7. Flotation process diagnostics and modelling by coal grain analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ofori, P.; O'Brien, G.; Firth, B.; Jenkins, B. [CSIRO Energy Technology, Brisbane, Qld. (Australia)

    2006-05-15

    In coal flotation, particles of different components of the coal such as maceral groups and mineral matter and their associations have different hydrophobicities and therefore different flotation responses. By using a new coal grain analysis method for characterising individual grains, more detailed flotation performance analysis and modelling approaches have been developed. The method involves the use of microscopic imaging techniques to obtain estimates of size, compositional and density information on individual grains of fine coal. The density and composition partitioning of coal processed through different flotation systems provides an avenue to pinpoint the actual cause of poor process performance so that corrective action may be initiated. The information on grain size, density and composition is being used as input data to develop more detailed flotation process models to provide better predictions of process performance for both mechanical and column flotation devices. A number of approaches may be taken to flotation modelling such as the probability approach and the kinetic model approach or a combination of the two. In the work reported here, a simple probability approach has been taken, which will be further refined in due course. The use of grain data to map the responses of different types of coal grains through various fine coal cleaning processes provided a more advanced diagnostic capability for fine coal cleaning circuits. This enabled flotation performance curves analogous to partition curves for density separators to be produced for flotation devices.

  8. Comprehension of Multiple Documents with Conflicting Information: A Two-Step Model of Validation

    Science.gov (United States)

    Richter, Tobias; Maier, Johanna

    2017-01-01

    In this article, we examine the cognitive processes that are involved when readers comprehend conflicting information in multiple texts. Starting from the notion of routine validation during comprehension, we argue that readers' prior beliefs may lead to a biased processing of conflicting information and a one-sided mental model of controversial…

  9. Photonic Architecture for Scalable Quantum Information Processing in Diamond

    Directory of Open Access Journals (Sweden)

    Kae Nemoto

    2014-08-01

    Full Text Available Physics and information are intimately connected, and the ultimate information processing devices will be those that harness the principles of quantum mechanics. Many physical systems have been identified as candidates for quantum information processing, but none of them are immune from errors. The challenge remains to find a path from the experiments of today to a reliable and scalable quantum computer. Here, we develop an architecture based on a simple module comprising an optical cavity containing a single negatively charged nitrogen vacancy center in diamond. Modules are connected by photons propagating in a fiber-optical network and collectively used to generate a topological cluster state, a robust substrate for quantum information processing. In principle, all processes in the architecture can be deterministic, but current limitations lead to processes that are probabilistic but heralded. We find that the architecture enables large-scale quantum information processing with existing technology.

  10. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    Science.gov (United States)

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

    The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary to tertiary health care center was conducted for model development and temporal validation. One hundred sixty participants were prospectively included for the model development (n = 80) and time-split validation (n = 80) stages. The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM and using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and vertex-marker distance (D_V-M) were used to predict the distance from the C7SP to the vertex (D_V-C7). Multivariate linear regression modeling, limits-of-agreement plots, histograms of residuals, receiver operating characteristic curves, and confusion tables were analyzed. The multivariate linear prediction model for D_V-C7 (in centimeters) was D_V-C7 = 0.986 D_V-M + 0.018(mass) + 0.014(age) - 1.008. Receiver operating characteristic curves had better discrimination for D_V-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than for D_V-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted D_V-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P information. Copyright © 2016. Published by Elsevier Inc.
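
    The published prediction equation can be applied directly. A minimal sketch follows; distances are in centimeters per the abstract, while the units of mass (kilograms assumed here) and age (years assumed here) are not stated explicitly in the abstract, so treat them as assumptions:

```python
def predict_dv_c7(dv_m_cm, mass_kg, age_years):
    """Predicted vertex-to-C7SP distance (cm) from the multivariate model:
    D_V-C7 = 0.986*D_V-M + 0.018*mass + 0.014*age - 1.008
    (coefficients as reported in the abstract)."""
    return 0.986 * dv_m_cm + 0.018 * mass_kg + 0.014 * age_years - 1.008

# Hypothetical example: marker placed 24.0 cm from the vertex, 70 kg, 40 years.
d = predict_dv_c7(24.0, 70.0, 40.0)   # ≈ 24.48 cm
```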

  11. Using Annotated Conceptual Models to Derive Information System Implementations

    Directory of Open Access Journals (Sweden)

    Anthony Berglas

    1994-05-01

    Full Text Available Producing production-quality information systems from conceptual descriptions is a time-consuming process that employs many of the world's programmers. Although most of this programming is fairly routine, the process has not been amenable to simple automation, because conceptual models do not provide sufficient parameters to make all the implementation decisions that are required, and numerous special cases arise in practice. Most commercial CASE tools address these problems by essentially implementing a waterfall model, in which development proceeds from analysis through design, layout and coding phases in a partially automated manner, but the analyst/programmer must heavily edit each intermediate stage. This paper demonstrates that, by recognising the nature of information systems, it is possible to specify applications completely using a conceptual model that has been annotated with additional parameters that guide automated implementation. More importantly, it is argued that a manageable number of annotations is sufficient to implement realistic applications, and techniques are described that enabled the author's commercial CASE tool, the Intelligent Developer, to automate implementation without requiring complex theorem-proving technology.

  12. The Logic Process Formalism of the Informational Domain

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The performance of present-day information technologies rests on two main properties: the universality of the structures used and the flexibility of the final user's interfaces. The first determines the potential coverage of the informational domain. The second determines the diversity and efficiency of the processing methods for the procedures being automated. These aspects are of great importance in agriculture and ecology, because complex processes and considerable volumes of information are involved. For example, meteorological processes are part of ecological ones, as existential conditions of habitats, and are known as a complex prognostic problem requiring considerable computational resources to solve the appropriate equations. Likewise, agriculture, as a controlled activity under strong impact from natural conditions, has the same high requirements for diverse structures and flexibility of information processing.

  13. Information processing theory in the early design stages

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2014-01-01

    One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and assess whether the new knowledge they have gained covers the previous knowledge gap. The new knowledge is shared within the design team to reduce ambiguity with regard to its meaning and to build a shared understanding, reducing perceived uncertainty. In engineering design, uncertainty plays a key role, particularly in the early design stages. Thus, we propose that Information Processing Theory is suitable for describing designer activity in the early design stages.

  14. Biologically inspired information theory: Adaptation through construction of external reality models by living systems.

    Science.gov (United States)

    Nakajima, Toshiyuki

    2015-12-01

    Higher animals act in the world using their external reality models to cope with the uncertain environment. Organisms that have not developed such information-processing organs may also have external reality models built in the form of their biochemical, physiological, and behavioral structures, acquired by natural selection through successful models constructed internally. Organisms subject to illusions would fail to survive in the material universe. How can organisms, or living systems in general, determine the external reality from within? This paper starts with a phenomenological model, in which the self constitutes a reality model developed through the mental processing of phenomena. Then, the it-from-bit concept is formalized using a simple mathematical model. For this formalization, my previous work on an algorithmic process is employed to constitute symbols referring to the external reality, called the inverse causality, with additional improvements to the previous work. Finally, as an extension of this model, the cognizers system model is employed to describe the self as one of many material entities in a world, each of which acts as a subject by responding to the surrounding entities. This model is used to propose a conceptual framework of information theory that can deal with both the qualitative (semantic) and quantitative aspects of the information involved in biological processes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Neural information processing in cognition: we start to understand the orchestra, but where is the conductor?

    Directory of Open Access Journals (Sweden)

    Guenther Palm

    2016-01-01

    Full Text Available Research in neural information processing has been successful in the past, providing useful approaches both to practical problems in computer science and to computational models in neuroscience. Recent developments in the area of cognitive neuroscience present new challenges for a computational or theoretical understanding asking for neural information processing models that fulfill criteria or constraints from cognitive psychology, neuroscience and computational efficiency. The most important of these criteria for the evaluation of present and future contributions to this new emerging field are listed at the end of this article.

  16. Neural Information Processing in Cognition: We Start to Understand the Orchestra, but Where is the Conductor?

    Science.gov (United States)

    Palm, Günther

    2016-01-01

    Research in neural information processing has been successful in the past, providing useful approaches both to practical problems in computer science and to computational models in neuroscience. Recent developments in the area of cognitive neuroscience present new challenges for a computational or theoretical understanding asking for neural information processing models that fulfill criteria or constraints from cognitive psychology, neuroscience and computational efficiency. The most important of these criteria for the evaluation of present and future contributions to this new emerging field are listed at the end of this article. PMID:26858632

  17. The informed consent process in randomised controlled trials: a nurse-led process.

    Science.gov (United States)

    Cresswell, Pip; Gilmour, Jean

    2014-03-01

    Clinical trials are carried out with human participants to answer questions about the best way to diagnose, treat and prevent illness. Participants must give informed consent to take part in clinical trials that requires understanding of how clinical trials work and their purpose. Randomised controlled trials provide strong evidence but their complex design is difficult for both clinicians and participants to understand. Increasingly, ensuring informed consent in randomised controlled trials has become part of the clinical research nurse role. The aim of this study was to explore in depth the clinical research nurse role in the informed consent process using a qualitative descriptive approach. Three clinical research nurses were interviewed and data analysed using a thematic analysis approach. Three themes were identified to describe the process of ensuring informed consent. The first theme, Preparatory partnerships, canvassed the relationships required prior to initiation of the informed consent process. The second theme, Partnering the participant, emphasises the need for ensuring voluntariness and understanding, along with patient advocacy. The third theme, Partnership with the project, highlights the clinical research nurse contribution to the capacity of the trial to answer the research question through appropriate recruiting and follow up of participants. Gaining informed consent in randomised controlled trials was complex and required multiple partnerships. A wide variety of skills was used to protect the safety of trial participants and promote quality research. The information from this study contributes to a greater understanding of the clinical research nurse role, and suggests the informed consent process in trials can be a nurse-led one. In order to gain collegial, employer and industry recognition it is important this aspect of the nursing role is acknowledged.

  18. Using BPMN to model Internet of Things behavior within business process

    OpenAIRE

    Dulce Domingos; Francisco Martins

    2017-01-01

    Whereas, traditionally, business processes use the Internet of Things (IoT) as a distributed source of information, the increase in computational capabilities of IoT devices provides them with the means to also execute parts of the business logic, reducing the amount of exchanged data and central processing. Current approaches based on Business Process Model and Notation (BPMN) already support modelers in defining both business processes and IoT device behavior at the same level of abstraction...

  19. Informal learning processes in support of clinical service delivery in a service-oriented community pharmacy.

    Science.gov (United States)

    Patterson, Brandon J; Bakken, Brianne K; Doucette, William R; Urmie, Julie M; McDonough, Randal P

    The evolving health care system necessitates pharmacy organizations' adjustments by delivering new services and establishing inter-organizational relationships. One approach supporting pharmacy organizations in making changes may be informal learning by technicians, pharmacists, and pharmacy owners. Informal learning is characterized by a four-step cycle including intent to learn, action, feedback, and reflection. This framework helps explain individual and organizational factors that influence learning processes within an organization as well as the individual and organizational outcomes of those learning processes. A case study of an Iowa independent community pharmacy with years of experience in offering patient care services was made. Nine semi-structured interviews with pharmacy personnel revealed initial evidence in support of the informal learning model in practice. Future research could investigate more fully the informal learning model in delivery of patient care services in community pharmacies. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Using Interaction Scenarios to Model Information Systems

    DEFF Research Database (Denmark)

    Bækgaard, Lars; Bøgh Andersen, Peter

    The purpose of this paper is to define and discuss a set of interaction primitives that can be used to model the dynamics of socio-technical activity systems, including information systems, in a way that emphasizes structural aspects of the interaction that occurs in such systems. The primitives are based on a unifying, conceptual definition of the disparate interaction types - a robust model of the types. The primitives can be combined and may thus represent mediated interaction. We present a set of visualizations that can be used to define multiple related interactions, and we present and discuss a number of case studies indicating that interaction primitives can be useful modeling tools for supplementing conventional flow-oriented modeling of business processes.

  1. The minimal work cost of information processing

    Science.gov (United States)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
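
    The stated bound - work at least proportional to the entropy of the discarded information conditioned on the output - can be made concrete with a small worked example. The example below (an illustration under stated assumptions, not taken from the paper) erases the two uniformly random input bits of an AND gate while keeping its output:

```python
from math import log, log2
from collections import defaultdict

def cond_entropy_bits(joint):
    """H(X|Y) in bits, given a joint distribution as a dict {(x, y): p}."""
    py = defaultdict(float)
    for (x, y), p in joint.items():
        py[y] += p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0.0:
            h -= p * log2(p / py[y])
    return h

# Discard the two input bits of an AND gate, keep only the output.
# Landauer-type bound: W_min = k_B * T * ln(2) * H(inputs | output).
joint = {((a, b), a & b): 0.25 for a in (0, 1) for b in (0, 1)}
h_bits = cond_entropy_bits(joint)     # = (3/4) * log2(3), about 1.19 bits
k_B, T = 1.380649e-23, 300.0          # Boltzmann constant (J/K), room temperature
w_min = k_B * T * log(2) * h_bits     # minimal average work in joules
```

    When the output is 1 the inputs are fully determined (no entropy is discarded); when it is 0, three equally likely input pairs remain, which is where the log2(3) term comes from.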

  2. Cutting force model for high speed machining process

    International Nuclear Information System (INIS)

    Haber, R. E.; Jimenez, J. E.; Jimenez, A.; Lopez-Coronado, J.

    2004-01-01

    This paper presents cutting-force-based models able to describe a high speed machining process. The models consider the cutting force as the output variable, essential for the physical processes that are taking place in high speed machining. Moreover, this paper shows the mathematical development used to derive the integral-differential equations, and the algorithms implemented in MATLAB to predict the cutting force in real time. MATLAB is a software tool for doing numerical computations with matrices and vectors. It can also display information graphically and includes many toolboxes for several research and application areas. Two end-mill shapes are considered (i.e. cylindrical and ball-end mill) for real-time implementation of the developed algorithms. The developed models are validated in slot milling operations. The results corroborate the importance of the cutting force variable for predicting tool wear in high speed machining operations. The developed models are the starting point for future work related to vibration analysis, process stability and dimensional surface finish in high speed machining processes. (Author) 19 refs

  3. Information processing and routing in wireless sensor networks

    CERN Document Server

    Yu, Yang; Krishnamachari, Bhaskar

    2006-01-01

    This book presents state-of-the-art cross-layer optimization techniques for energy-efficient information processing and routing in wireless sensor networks. Besides providing a survey on this important research area, three specific topics are discussed in detail - information processing in a collocated cluster, information transport over a tree substrate, and information routing for computationally intensive applications. The book covers several important system knobs for cross-layer optimization, including voltage scaling, rate adaptation, and tunable compression. By exploring tradeoffs of en

  4. Development of hydrological models and surface process modelization. Study case in High Mountain slopes

    International Nuclear Information System (INIS)

    Loaiza, Juan Carlos; Pauwels, Valentijn R

    2011-01-01

    Hydrological models are useful because they allow fluxes in hydrological systems to be predicted, which helps in forecasting floods and other violent phenomena associated with water fluxes, especially in materials with a high degree of weathering. The combination of these models with meteorological predictions, especially with rainfall models, allows water behavior in the soil to be modelled. In most cases, this type of model is highly sensitive to evapotranspiration. In climatic studies, the surface processes have to be represented adequately. Calibration and validation of these models are necessary to obtain reliable results. This paper is a practical exercise in applying complete hydrological information at a detailed scale in a high mountain catchment, considering the most representative soil uses and types. Information on soil moisture, infiltration, runoff and rainfall is used to calibrate and validate the TOPLATS hydrological model to simulate the behavior of soil moisture. The findings show that it is possible to implement a hydrological model using soil moisture information and a calibration equation based on an Extended Kalman Filter (EKF).
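
    The EKF-based calibration step can be sketched in miniature. The scalar predict/update cycle below is an illustration only, not TOPLATS itself: the drydown model, its parameters, and all numbers are hypothetical stand-ins for a soil-moisture state assimilated against an observation:

```python
def ekf_update(x, P, z, H, R, f, F, Q):
    """One predict/update cycle of a scalar extended Kalman filter.
    x, P: state estimate and its variance; z: observation;
    f, F: nonlinear model step and its derivative (Jacobian);
    H: observation Jacobian; R, Q: observation and process noise variances."""
    # Predict: propagate the state and its uncertainty through the model
    x_pred = f(x)
    P_pred = F(x) * P * F(x) + Q
    # Update: blend prediction and observation via the Kalman gain
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Toy soil-moisture drydown: volumetric moisture relaxes toward a residual value.
f = lambda theta: theta + 0.1 * (0.05 - theta)   # hypothetical model step
F = lambda theta: 0.9                            # its derivative d f / d theta
theta, P = 0.30, 0.01                            # prior state and variance
theta, P = ekf_update(theta, P, z=0.25, H=1.0, R=0.001, f=f, F=F, Q=1e-4)
# The posterior moves toward the observed moisture and its variance shrinks.
```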

  5. Knowledge acquisition process as an issue in information sciences

    Directory of Open Access Journals (Sweden)

    Boris Bosančić

    2016-07-01

    Full Text Available The paper presents an overview of some problems of information science which are explicitly portrayed in the literature. It covers the following issues: the information explosion, information flood and data deluge; information retrieval and the relevance of information; and, finally, the problem of scientific communication. The purpose of this paper is to explain why knowledge acquisition can be considered an issue in the information sciences. The existing theoretical foundation within the information sciences, i.e. the DIKW hierarchy and its key concepts - data, information, knowledge and wisdom - is recognized as a symbolic representation as well as the theoretical foundation of the knowledge acquisition process. Moreover, it seems that the relationship between the DIKW hierarchy and the knowledge acquisition process is essential for a stronger foundation of the information sciences in the 'body' of overall human knowledge. In addition, the history of both human and machine knowledge acquisition is considered, along with a proposal that the DIKW hierarchy serve as a symbol of the general knowledge acquisition process, relating equally to both human and machine knowledge acquisition. To achieve this goal, it is necessary to modify the existing concept of the DIKW hierarchy. An appropriate modification of the DIKW hierarchy (one of which is presented in this paper) could result in a much more solid theoretical foundation of the knowledge acquisition process and of the information sciences as a whole. The theoretical assumptions on which the knowledge acquisition process may be established as a problem of information science are presented at the end of the paper. The knowledge acquisition process does not necessarily have to be the subject of epistemology.
    It may establish a stronger link between the concepts of data and knowledge; furthermore, it can be used in the context of scientific research, but on a more primitive level than conducting

  6. A Companion Model Approach to Modelling and Simulation of Industrial Processes

    International Nuclear Information System (INIS)

    Juslin, K.

    2005-09-01

    Modelling and simulation provide huge possibilities if broadly taken up by engineers as a working method. However, when considering the launch of modelling and simulation tools in an engineering design project, they must be easy to learn and use. There is no time to write equations, to consult suppliers' experts, or to manually transfer data from one tool to another. The answer seems to lie in the integration of easy-to-use and dependable simulation software with engineering tools. Accordingly, the modelling and simulation software shall accept as input such structured design information on industrial unit processes and their connections as is provided by e.g. CAD software and product databases. The software technology, including the required specification and communication standards, is already available. Internet-based service repositories make it possible for equipment manufacturers to supply 'extended products', including such design data as needed by engineers engaged in process and automation integration. There is a market niche evolving for simulation service centres, operating in co-operation with project consultants, equipment manufacturers, process integrators, automation designers, plant operating personnel, and maintenance centres. The companion model approach for the specification and solution of process simulation models, as presented herein, is developed from the above premises. The focus is on how to tackle real-world processes, which from the modelling point of view are heterogeneous, dynamic, very stiff, very nonlinear and only piecewise continuous, without extensive manual interventions by human experts. An additional challenge, to solve the arising equations fast and reliably, is dealt with as well. (orig.)

  7. Abnormal presynaptic short-term plasticity and information processing in a mouse model of fragile X syndrome.

    Science.gov (United States)

    Deng, Pan-Yue; Sojka, David; Klyachko, Vitaly A

    2011-07-27

    Fragile X syndrome (FXS) is the most common inherited form of intellectual disability and the leading genetic cause of autism. It is associated with the lack of fragile X mental retardation protein (FMRP), a regulator of protein synthesis in axons and dendrites. Studies on FXS have extensively focused on the postsynaptic changes underlying dysfunctions in long-term plasticity. In contrast, the presynaptic mechanisms of FXS have garnered relatively little attention and are poorly understood. Activity-dependent presynaptic processes give rise to several forms of short-term plasticity (STP), which is believed to control some essential neural functions, including information processing, working memory, and decision making. The extent of STP defects and their contributions to the pathophysiology of FXS remain essentially unknown, however. Here we report marked presynaptic abnormalities at excitatory hippocampal synapses in Fmr1 knock-out (KO) mice leading to defects in STP and information processing. Loss of FMRP led to enhanced responses to high-frequency stimulation. Fmr1 KO mice also exhibited abnormal synaptic processing of natural stimulus trains, specifically excessive enhancement during the high-frequency spike discharges associated with hippocampal place fields. Analysis of individual STP components revealed strongly increased augmentation and reduced short-term depression attributable to loss of FMRP. These changes were associated with exaggerated calcium influx in presynaptic neurons during high-frequency stimulation, enhanced synaptic vesicle recycling, and enlarged readily releasable and reserve vesicle pools. These data suggest that loss of FMRP causes abnormal STP and information processing, which may represent a novel mechanism contributing to cognitive impairments in FXS.
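    The STP components analyzed in this record (facilitation, augmentation, depression) are commonly captured by phenomenological models such as the Tsodyks-Markram formalism. The sketch below is a generic illustration of that formalism with invented parameters, not the analysis used in the paper: a facilitation variable u and a resource variable x jointly shape the response to a high-frequency train.

```python
import math

def tm_synapse(spike_times, U=0.2, tau_rec=0.5, tau_facil=0.3):
    """Tsodyks-Markram short-term plasticity: u facilitates, x depresses."""
    u, x, last_t = 0.0, 1.0, None
    responses = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            u *= math.exp(-dt / tau_facil)                 # facilitation decays
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # resources recover
        u += U * (1.0 - u)        # each spike increments release probability
        responses.append(u * x)   # released fraction ~ synaptic response
        x -= u * x                # resources consumed by the release
        last_t = t
    return responses

train = [i * 0.02 for i in range(10)]   # a 50 Hz spike train
r = tm_synapse(train)                   # facilitation dominates early responses
```

    Shifting the balance between tau_facil and tau_rec reproduces, qualitatively, the kind of excessive high-frequency enhancement the abstract attributes to Fmr1 KO synapses.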

  8. A preliminary geodetic data model for geographic information systems

    Science.gov (United States)

    Kelly, K. M.

    2009-12-01

    Our ability to gather and assimilate integrated data collections from multiple disciplines is important for earth system studies. Moreover, geosciences data collection has increased dramatically, with pervasive networks of observational stations on the ground, in the oceans, in the atmosphere and in space. Contemporary geodetic observations from several space and terrestrial technologies contribute to our knowledge of earth system processes and thus are a valuable source of high accuracy information for many global change studies. Assimilation of these geodetic observations and numerical models into models of weather, climate, oceans, hydrology, ice, and solid Earth processes is an important contribution geodesists can make to the earth science community. Clearly, the geodetic observations and models are fundamental to these contributions. ESRI wishes to provide leadership in the geodetic community to collaboratively build an open, freely available content specification that can be used by anyone to structure and manage geodetic data. This Geodetic Data Model will provide important context for all geographic information. The production of a task-specific geodetic data model involves several steps. The goal of the data model is to provide useful data structures and best practices for each step, making it easier for geodesists to organize their data and metadata in a way that will be useful in their data analyses and to their customers. Built on concepts from the successful Arc Marine data model, we introduce common geodetic data types and summarize the main thematic layers of the Geodetic Data Model. These provide a general framework for envisioning the core feature classes required to represent geodetic data in a geographic information system. Like Arc Marine, the framework is generic to allow users to build workflow or product specific geodetic data models tailored to the specific task(s) at hand. This approach allows integration of the data with other existing

  9. Classicality of quantum information processing

    International Nuclear Information System (INIS)

    Poulin, David

    2002-01-01

    The ultimate goal of the classicality program is to quantify the amount of quantumness of certain processes. Here, classicality is studied for a restricted type of process: quantum information processing (QIP). Under special conditions, one can force some qubits of a quantum computer into a classical state without affecting the outcome of the computation. The minimal set of conditions is described and its structure is studied. Some implications of this formalism are the increase of noise robustness, a proof of the quantumness of mixed state quantum computing, and a step forward in understanding the very foundation of QIP

  10. The Effects of Cassady and Justin's Functional Model for Emotional Information Processing on Improving Social Competence of First Grade Children with ADHD

    Science.gov (United States)

    Eissa, Mourad Ali

    2017-01-01

    This study explores whether an Emotional Information Processing (EIP) model intervention has positive effects on the social competence of first grade children with ADHD. Ten primary-school first graders who had been identified as having ADHD using the Attention-Deficit Hyperactivity Disorder Test (ADHDT) (Jeong, 2005) and were experiencing social problems…

  11. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified as developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  12. Data assimilation in modeling ocean processes: A bibliographic study

    Digital Repository Service at National Institute of Oceanography (India)

    Mahadevan, R.; Fernandes, A.A.; Saran, A.K.

    An annotated bibliography on studies related to data assimilation in modeling ocean processes has been prepared. The bibliography listed here is not comprehensive and is not prepared from the original references. Information obtainable from...
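    Data assimilation of the kind surveyed in this bibliography typically blends a model forecast with an observation, weighting each by its uncertainty. A minimal scalar analysis step (the temperatures and variances below are illustrative, not from any listed study) looks like:

```python
def kalman_analysis(forecast, var_f, obs, var_o):
    """Scalar analysis step: blend forecast and observation by their variances."""
    gain = var_f / (var_f + var_o)        # Kalman gain: trust the less uncertain
    analysis = forecast + gain * (obs - forecast)
    var_a = (1.0 - gain) * var_f          # analysis variance shrinks
    return analysis, var_a

# e.g. sea-surface temperature: forecast 19.0 C (var 1.0), observation 20.0 C (var 0.25)
a, var_a = kalman_analysis(19.0, 1.0, 20.0, 0.25)   # a = 19.8, var_a = 0.2
```

    Operational ocean assimilation generalizes this scalar update to high-dimensional state vectors (ensemble Kalman filters, variational methods), but the forecast-observation blend is the same in principle.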

  13. Stereotype Strength and Attentional Bias: Preference for Confirming versus Disconfirming Information Depends on Processing Capacity

    Science.gov (United States)

    Allen, Thomas J.; Sherman, Jeffrey W.; Conrey, Frederica R.; Stroessner, Steven J.

    2009-01-01

    In two experiments, we investigated the relationships among stereotype strength, processing capacity, and the allocation of attention to stereotype-consistent versus stereotype-inconsistent information describing a target person. The results of both experiments showed that, with full capacity, greater stereotype strength was associated with increased attention toward stereotype-consistent versus stereotype-inconsistent information. However, when capacity was diminished, greater stereotype strength was associated with increased attention toward inconsistent versus consistent information. Thus, strong stereotypes may act as self-confirming filters when processing capacity is plentiful, but as efficient information gathering devices that maximize the acquisition of novel (disconfirming) information when capacity is depleted. Implications for models of stereotyping and stereotype change are discussed. PMID:20161043

  14. Theoretical aspects of cellular decision-making and information-processing.

    Science.gov (United States)

    Kobayashi, Tetsuya J; Kamimura, Atsushi

    2012-01-01

    Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial, enabling us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since many cellular behaviors can be regarded as processes for taking specific actions in response to external stimuli, decision-making can cover, and has been used to explain, a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing, because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. The efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept that is crucial for understanding biological processes other than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing by focusing on the relation between these two concepts.
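    The link this review draws between decision-making and information-processing is often made quantitative via mutual information: the stimulus information a noisy signaling pathway transmits bounds the quality of downstream decisions. A small sketch (a binary stimulus through a symmetric noisy channel; the setup is generic and illustrative, not from the review):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(p_stim, p_flip):
    """I(S;R) in bits for a binary stimulus through a binary symmetric channel."""
    p_s = (1.0 - p_stim, p_stim)
    # marginal response distribution P(R=r) = sum_s P(s) P(r|s)
    p_r = [sum(p_s[s] * ((1 - p_flip) if r == s else p_flip) for s in (0, 1))
           for r in (0, 1)]
    # I(S;R) = H(R) - H(R|S); for this channel H(R|S) = H(p_flip)
    return entropy(p_r) - entropy((1.0 - p_flip, p_flip))

perfect = mutual_information(0.5, 0.0)   # noiseless pathway: 1 bit per stimulus
useless = mutual_information(0.5, 0.5)   # pure noise: 0 bits, no basis to decide
```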

  15. Career information processing strategies of secondary school ...

    African Journals Online (AJOL)

    This study examined the strategies commonly adopted by Osun state secondary school students in processing career information. It specifically examined the sources of career information available to the students, the uses to which the students put the information collected and how their career decision making skills can be ...

  16. Getting the Client Into the Loop in Information Systems Modelling Courses

    NARCIS (Netherlands)

    Sikkel, Nicolaas; Daneva, Maia

    2011-01-01

    Information system modelling is more than a translation of requirements from one notation into another; it is part of the requirements analysis process. Adding explicitness and making design choices should provide critical feedback to the requirements document being modelled. We want our students to

  17. Framework model and principles for trusted information sharing in pervasive health.

    Science.gov (United States)

    Ruotsalainen, Pekka; Blobel, Bernd; Nykänen, Pirkko; Seppälä, Antto; Sorvari, Hannu

    2011-01-01

    Trustfulness (i.e. health and wellness information is processed ethically, and privacy is guaranteed) is one of the cornerstones of future Personal Health Systems, ubiquitous healthcare and pervasive health. Trust in today's healthcare is organizational, static and predefined. Pervasive health takes place in an open and untrusted information space where a person's lifelong health and wellness information, together with contextual data, is dynamically collected and used by many stakeholders. This generates new threats that do not exist in today's eHealth systems. Our analysis shows that the way security and trust are implemented in today's healthcare cannot guarantee information autonomy and trustfulness in pervasive health. Based on a framework model of pervasive health and a risk analysis of the ubiquitous information space, we have formulated principles which enable trusted information sharing in pervasive health. The principles imply that the data subject should have the right to dynamically verify trust and to control the use of her health information, as well as the right to set situation-based, context-aware personal policies. Data collectors and processors have responsibilities including transparency of information processing, and openness of interests, policies and environmental features. Our principles create a base for the successful management of privacy and information autonomy in pervasive health. They also imply that it is necessary to create new data models for personal health information and new architectures which support situation-dependent trust and privacy management.
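    The principle that the data subject sets situation-based, context-aware policies can be illustrated with a toy policy check. All field names and values below are hypothetical, invented for the sketch rather than taken from the paper's architecture:

```python
def is_use_allowed(policies, request):
    """Allow a use of health data only if a subject-defined policy matches it."""
    return any(p["purpose"] == request["purpose"]
               and p["context"] == request["context"]
               and request["recipient"] in p["recipients"]
               for p in policies)

# The data subject permits emergency treatment use by a hospital, nothing else.
policies = [{"purpose": "treatment", "context": "emergency",
             "recipients": {"hospital"}}]

ok = is_use_allowed(policies, {"purpose": "treatment", "context": "emergency",
                               "recipient": "hospital"})        # True
denied = is_use_allowed(policies, {"purpose": "marketing",
                                   "context": "emergency",
                                   "recipient": "hospital"})    # False
```

    A real pervasive-health architecture would add the transparency obligations the abstract mentions, e.g. logging every evaluated request so the subject can audit how her data was used.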

  18. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  19. Process system of radiometric and magnetometric aerial information

    International Nuclear Information System (INIS)

    Bazua Rueda, L.F.

    1985-01-01

    The author worked first at the National Institute of Nuclear Energy (Mexico) and then at URAMEX (Uranio Mexicano) from 1975 to 1983, assigned to the computerized information-processing aspects of aerial radiometric and magnetometric prospecting projects. During this period the author participated in developing computing systems, processing information, and defining mathematical procedures for the geophysical reduction of the calibration equipment data. Drawing on this accumulated experience, this thesis presents aspects of the management and operation of computerized information-processing systems. Operating handbooks for most of the modules are presented. Program listings are not included. (Author)

  20. Process optimization of friction stir welding based on thermal models

    DEFF Research Database (Denmark)

    Larsen, Anders Astrup

    2010-01-01

    This thesis investigates how to apply optimization methods to numerical models of a friction stir welding process. The work is intended as a proof-of-concept using different methods that are applicable to models of high complexity, possibly with high computational cost, and without the possibility...... information of the high-fidelity model. The optimization schemes are applied to stationary thermal models of differing complexity of the friction stir welding process. The optimization problems considered are based on optimizing the temperature field in the workpiece by finding optimal translational speed....... Also an optimization problem based on a microstructure model is solved, allowing the hardness distribution in the plate to be optimized. The use of purely thermal models represents a simplification of the real process; nonetheless, it shows the applicability of the optimization methods considered...
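    The kind of optimization this thesis describes, tuning translational speed against a thermal model, can be sketched with a toy surrogate. The one-line "thermal model" below is invented for illustration, not the thesis's model; golden-section search then finds the speed whose predicted peak temperature hits a target:

```python
def peak_temperature(v):
    """Invented surrogate: peak weld temperature (C) falls as speed v rises."""
    return 25.0 + 2000.0 / (50.0 * v)

def objective(v, target=450.0):
    """Squared deviation of predicted peak temperature from the target."""
    return (peak_temperature(v) - target) ** 2

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by interval shrinking."""
    phi = (5 ** 0.5 - 1) / 2
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - phi * (b - a)
        else:
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2

v_opt = golden_section(objective, 0.01, 1.0)   # speed whose peak temp hits 450 C
```

    With an expensive finite-element thermal model in place of the surrogate, each evaluation of f is a full simulation, which is why the thesis emphasizes methods suited to high computational cost.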