WorldWideScience

Sample records for information processing model

  1. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  2. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  3. Information in general medical practices: the information processing model.

    Science.gov (United States)

    Crowe, Sarah; Tully, Mary P; Cantrill, Judith A

    2010-04-01

    The need for effective communication and handling of secondary care information in general practices is paramount. To explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of practice staff both clinical and administrative. Qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using N-Vivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.

  4. Animal models for information processing during sleep

    NARCIS (Netherlands)

    Coenen, A.M.L.; Drinkenburg, W.H.I.M.

    2002-01-01

    Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced during sleep compared to wakefulness. The process controlling this reduction is called `sensory' gating and evidence exists that the underlying neurophysiological processes take

  5. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

    Background: Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results: Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion: Narrator is a

  6. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is

  7. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model of the information processing of NPP operators and ii) quantifying that model. To resolve the problems of previous approaches based on information theory, i.e. the problems of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, an approach based on information theory
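    The record above does not reproduce Conant's formulation, so the sketch below only illustrates the elementary information-theoretic quantities (Shannon entropy and the mutual information between displayed plant states and operator actions) on which such a quantification of "amount of information processed" is typically built; the toy indication/action data and function names are invented for illustration.

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy H(X) in bits of a sequence of discrete samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): information transmitted from X to Y."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy example: indicated alarm states vs. the operator's selected actions.
indications = ["high", "high", "normal", "low", "normal", "high", "low", "normal"]
actions     = ["trip", "trip", "monitor", "boost", "monitor", "trip", "boost", "monitor"]

print(f"H(indications)          = {entropy(indications):.3f} bits")
print(f"I(indications; actions) = {mutual_information(indications, actions):.3f} bits")
```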

  8. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; an intra-modular network is developed with the modeling algorithm by mapping nodes and edges, and the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model accords with memory phenomena from an information processing view. PMID:27876847

  9. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; an intra-modular network is developed with the modeling algorithm by mapping nodes and edges, and the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model accords with memory phenomena from an information processing view.

  10. Lecturing and Loving It: Applying the Information-Processing Model.

    Science.gov (United States)

    Parker, Jonathan K.

    1993-01-01

    Discusses the benefits of lecturing, when done properly, in high schools. Describes the positive attributes of effective lecturers. Provides a human information-processing model applicable to the task of lecturing to students. (HB)

  11. Motivation within the Information Processing Model of Foreign Language Learning

    Science.gov (United States)

    Manolopoulou-Sergi, Eleni

    2004-01-01

    The present article highlights the importance of the motivational construct for the foreign language learning (FLL) process. More specifically, in the present article it is argued that motivation is likely to play a significant role at all three stages of the FLL process as they are discussed within the information processing model of FLL, namely,…

  12. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, an approach based on information theory. We also investigate the applicability of this approach to quantifying the reduction of information processed by operators under input information overload

  13. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Оксана Николаевна Ромашкова

    2014-12-01

    This work concerns the information model of an educational complex which includes several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is considered and a matrix management structure is suggested. The basic management information processes of the educational complex were conceptualized.

  14. A Social Information Processing Model of Media Use in Organizations.

    Science.gov (United States)

    Fulk, Janet; And Others

    1987-01-01

    Presents a model to examine how social influence processes affect individuals' attitudes toward communication media and media use behavior, integrating two research areas: media use patterns as the outcome of objectively rational choices and social information processing theory. Asserts (in a synthesis) that media characteristics and attitudes are…

  15. A Process Model for Goal-Based Information Retrieval

    Directory of Open Access Journals (Sweden)

    Harvey Hyman

    2014-12-01

    In this paper we examine the domain of information search and propose a "goal-based" approach to study search strategy. We describe "goal-based information search" using a framework of Knowledge Discovery. We identify two Information Retrieval (IR) goals using the constructs of Knowledge Acquisition (KA) and Knowledge Explanation (KE). We classify these constructs into two specific information problems: an exploration-exploitation problem and an implicit-explicit problem. Our proposed framework is an extension of prior work in this domain, applying an IR Process Model originally developed for Legal-IR and adapted to Medical-IR. The approach in this paper is guided by the recent ACM-SIG Medical Information Retrieval (MedIR) Workshop definition: "methodologies and technologies that seek to improve access to medical information archives via a process of information retrieval."

  16. Thermodynamics of information processing based on enzyme kinetics: An exactly solvable model of an information pump.

    Science.gov (United States)

    Cao, Yuansheng; Gong, Zongping; Quan, H T

    2015-06-01

    Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012)] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013)], we propose a minimal model of the information pump and the information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information to be encoded in the bit stream or (partially) erase the information initially encoded in the bit stream by consuming the Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and the power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.

  17. Thermodynamics of information processing based on enzyme kinetics: An exactly solvable model of an information pump

    Science.gov (United States)

    Cao, Yuansheng; Gong, Zongping; Quan, H. T.

    2015-06-01

    Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012), 10.1073/pnas.1204263109] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013), 10.1103/PhysRevLett.111.030602], we propose a minimal model of the information pump and the information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information to be encoded in the bit stream or (partially) erase the information initially encoded in the bit stream by consuming the Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and the power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.
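    The paper's exact solution and phase diagram are not reproduced in the record, so the following is only a sketch of the generic second-law bound that autonomous information pumps and erasers of this kind are generally argued to obey, written in terms of the probability p that an incoming or outgoing bit is 1; the notation is an assumption, not the paper's own result.

```latex
% Sketch of the standard per-bit bound (not the exact solution of the paper):
% the chemical work extracted against the potential gradient is limited by the
% Shannon-entropy increase that the device writes into the bit stream.
\[
  W_{\mathrm{chem}} \;\le\; k_{B} T \,\bigl[ H(p_{\mathrm{out}}) - H(p_{\mathrm{in}}) \bigr],
  \qquad H(p) = -p\ln p - (1-p)\ln(1-p).
\]
% Pumping (W_chem > 0) therefore requires H(p_out) > H(p_in), i.e. writing entropy
% to the bits; erasing information (H(p_out) < H(p_in)) must be paid for with
% Gibbs free energy, W_chem < 0.
```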

  18. Advanced modeling of management processes in information technology

    CERN Document Server

    Kowalczuk, Zdzislaw

    2014-01-01

    This book deals with the issues of modelling the management processes of information technology and IT projects. Its core is the model of information technology management and its component models (contextual, local), which describe initial processing and the maturity capsule, as well as a decision-making system represented by a multi-level sequential model of IT technology selection that acquires a fuzzy rule-based implementation in this work. In terms of applicability, this work may also be useful for diagnosing the applicability of IT standards in the evaluation of IT organizations. The results of this diagnosis might prove valid for those preparing new standards so that – apart from their own visions – they could, to an even greater extent, take into account the capabilities and needs of the leaders of project and manufacturing teams. The book is intended for IT professionals using the ITIL, COBIT and TOGAF standards in their work. Students of computer science and management who are interested in the issue of IT...

  19. Metadata and their impact on processes in Building Information Modeling

    Directory of Open Access Journals (Sweden)

    Vladimir Nyvlt

    2014-04-01

    Building Information Modeling (BIM) itself contains huge potential to increase the effectiveness of every project over its whole life cycle, from the initial investment plan through design and construction activities to long-term usage, property maintenance and, finally, demolition. Knowledge Management, or better Knowledge Sharing, covers two sets of tools, managerial and technological. Managers' needs are the real expectations and desires of final users in terms of how they could benefit from managing long-term projects covering the whole life cycle, in terms of sparing investment money and other resources. The technology employed can help BIM processes support and deliver these benefits to users: how to use this technology for data and metadata collection, storage and sharing, and which processes these new technologies may deploy. We also touch on a proposal for optimized processes for better and smoother support of knowledge sharing within the project time-scale and across the whole life cycle.

  20. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
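    The record names Petri nets and XML nets as the underlying formalism without giving details, so the sketch below shows only the basic place/transition firing semantics that such process models build on; the PetriNet class and the order-handling example are illustrative inventions, not the authors' XML-net toolset.

```python
class PetriNet:
    """Minimal place/transition net: places hold tokens, a transition fires
    when every one of its input places holds at least one token."""

    def __init__(self, marking, transitions):
        # marking: dict place -> token count
        # transitions: dict name -> (list of input places, list of output places)
        self.marking = dict(marking)
        self.transitions = transitions

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:                       # consume one token per input place
            self.marking[p] -= 1
        for p in outputs:                      # produce one token per output place
            self.marking[p] = self.marking.get(p, 0) + 1

# Illustrative order-handling fragment of a process-oriented information system.
net = PetriNet(
    marking={"order_received": 1, "clerk_free": 1},
    transitions={
        "check_order":   (["order_received", "clerk_free"], ["order_checked"]),
        "approve_order": (["order_checked"], ["order_approved", "clerk_free"]),
    },
)
net.fire("check_order")
net.fire("approve_order")
print(net.marking)  # {'order_received': 0, 'clerk_free': 1, 'order_checked': 0, 'order_approved': 1}
```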

  1. Approaching the Affective Factors of Information Seeking: The Viewpoint of the Information Search Process Model

    Science.gov (United States)

    Savolainen, Reijo

    2015-01-01

    Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows-up a previous interview study…

  2. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  3. Temporal Expectation and Information Processing: A Model-Based Analysis

    Science.gov (United States)

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  4. Temporal expectation and information processing: A model-based analysis

    NARCIS (Netherlands)

    Jepma, M.; Wagenmakers, E.-J.; Nieuwenhuis, S.

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information

  5. Computational spectrotemporal auditory model with applications to acoustical information processing

    Science.gov (United States)

    Chi, Tai-Shih

    A computational spectrotemporal auditory model based on neurophysiological findings in the early auditory and cortical stages is described. The model provides a unified multiresolution representation of the spectral and temporal features of sound likely critical in the perception of timbre. Several types of complex stimuli are used to demonstrate the spectrotemporal information preserved by the model. As shown by these examples, this two-stage model reflects the apparent progressive loss of temporal dynamics along the auditory pathway, from rapid phase-locking (several kHz in the auditory nerve), to moderate rates of synchrony (several hundred Hz in the midbrain), to much lower rates of modulation in the cortex (around 30 Hz). To complete this model, several projection-based reconstruction algorithms are implemented to resynthesize the sound from the representations with reduced dynamics. One particular application of this model is to assess speech intelligibility. The spectro-temporal modulation transfer functions (MTFs) of this model are investigated and shown to be consistent with the salient trends in the human MTFs (derived from human detection thresholds), which exhibit a lowpass function with respect to both spectral and temporal dimensions, with 50% bandwidths of about 16 Hz and 2 cycles/octave. Therefore, the model is used to demonstrate the potential relevance of these MTFs to the assessment of speech intelligibility in noise and reverberant conditions. Another useful feature is the phase singularity that emerges in the scale space generated by this multiscale auditory model. The singularity is shown to have certain robust properties and to carry crucial information about the spectral profile. Such a claim is justified by perceptually tolerable resynthesized sounds from the nonconvex singularity set. In addition, the singularity set is demonstrated to encode the pitch and formants at different scales. These properties make the singularity set very suitable for traditional

  6. Constructing topic models of Internet of Things for information processing.

    Science.gov (United States)

    Xin, Jie; Cui, Zhiming; Zhang, Shukui; He, Tianxu; Li, Chunhua; Huang, Haojing

    2014-01-01

    The Internet of Things (IoT) is regarded as a remarkable development of modern information technology. There is abundant digital product data on the IoT, linked with multiple types of objects/entities. These associated entities carry rich information, usually in the form of query records. Therefore, constructing high-quality topic hierarchies that can capture the term distribution of each product record enables us to better understand users' search intent and benefits tasks such as taxonomy construction, recommendation systems, and other communications solutions for the future IoT. In this paper, we propose a novel record entity topic model (RETM) for the IoT environment that is associated with a set of entities and records, and a Gibbs sampling-based algorithm is proposed to learn the model. We conduct extensive experiments on real-world datasets and compare our approach with existing methods to demonstrate the advantage of our approach.
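    The RETM sampler itself is not described in the record, so the sketch below shows a plain collapsed Gibbs sampler for a standard LDA-style topic model over short "records", i.e. the generic technique that a Gibbs sampling-based topic learner builds on; the toy corpus, hyperparameters and function names are invented.

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Collapsed Gibbs sampling for a plain LDA-style topic model.
    docs: list of token lists. Returns per-topic word counts."""
    rng = random.Random(seed)
    vocab = {w for d in docs for w in d}
    V = len(vocab)

    # Count tables (doc-topic, topic-word, topic totals) plus current assignments.
    ndk = [[0] * n_topics for _ in docs]
    nkw = [defaultdict(int) for _ in range(n_topics)]
    nk = [0] * n_topics
    z = []
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            k = rng.randrange(n_topics)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zd)

    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                       # remove current assignment
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # Full conditional p(z = k | rest), up to a constant.
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                r, k = rng.random() * sum(weights), 0
                while r > weights[k]:             # roulette-wheel sampling
                    r -= weights[k]; k += 1
                z[d][i] = k                       # add new assignment back
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return nkw

records = [["sensor", "temperature", "reading"],
           ["camera", "image", "stream"],
           ["sensor", "humidity", "reading"],
           ["camera", "video", "stream"]]
for k, counts in enumerate(lda_gibbs(records, n_topics=2)):
    print(k, sorted(counts.items(), key=lambda kv: -kv[1])[:3])
```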

  7. Constructing Topic Models of Internet of Things for Information Processing

    Science.gov (United States)

    Xin, Jie; Cui, Zhiming; Zhang, Shukui; He, Tianxu; Li, Chunhua; Huang, Haojing

    2014-01-01

    The Internet of Things (IoT) is regarded as a remarkable development of modern information technology. There is abundant digital product data on the IoT, linked with multiple types of objects/entities. These associated entities carry rich information, usually in the form of query records. Therefore, constructing high-quality topic hierarchies that can capture the term distribution of each product record enables us to better understand users' search intent and benefits tasks such as taxonomy construction, recommendation systems, and other communications solutions for the future IoT. In this paper, we propose a novel record entity topic model (RETM) for the IoT environment that is associated with a set of entities and records, and a Gibbs sampling-based algorithm is proposed to learn the model. We conduct extensive experiments on real-world datasets and compare our approach with existing methods to demonstrate the advantage of our approach. PMID:25110737

  8. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    Science.gov (United States)

    Tute, Erik; Steiner, Jochen

    2018-01-01

    The literature describes substantial potential for the reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support the management and maintenance of processes extracting, transforming and loading (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were used to find requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing modeling techniques; it was evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, and six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated. Seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.

  9. Process and building information modelling in the construction industry by using information delivery manuals and model view definitions

    DEFF Research Database (Denmark)

    Karlshøj, Jan

    2012-01-01

    The construction industry is gradually increasing its use of structured information and building information modelling. To date, the industry has suffered from the disadvantages of a project-based organizational structure and ad hoc solutions. Furthermore, it is not used to formalizing the flow of information and specifying exactly which objects and properties are needed for each process and which information is produced by the processes. The present study is based on reviewing the existing methodology of Information Delivery Manuals (IDM) from Buildingsmart, which is also ISO standard 29481 Part 1, and the Model View Definition (MVD) methodology developed by Buildingsmart and BLIS. The research also includes a review of concrete IDM development projects that have been developed over the last five years. Although the study has identified interest in the IDM methodology in a number...

  10. Testing the Causal Mediation Component of Dodge's Social Information Processing Model of Social Competence and Depression

    Science.gov (United States)

    Possel, Patrick; Seemann, Simone; Ahrens, Stefanie; Hautzinger, Martin

    2006-01-01

    In Dodge's model of "social information processing" depression is the result of a linear sequence of five stages of information processing ("Annu Rev Psychol" 44: 559-584, 1993). These stages follow a person's reaction to situational stimuli, such that each stage of information processing mediates the relationship between earlier and later stages.…

  11. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

    Traditionally, natural language processing techniques for information retrieval have always been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.
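    As a concrete illustration of the language-modelling approach to retrieval referred to above, the sketch below ranks documents by query likelihood with Jelinek-Mercer smoothing, a standard formulation of such models; it is not the specific model of the article, and the toy corpus and smoothing parameter are invented.

```python
import math
from collections import Counter

def rank_by_query_likelihood(query, docs, lam=0.5):
    """Rank documents by the query-likelihood language model with
    Jelinek-Mercer smoothing:
        P(q | D) = prod_t [ (1 - lam) * P(t | D) + lam * P(t | collection) ]
    Scores are log-probabilities; higher is better."""
    collection = Counter()
    doc_models = []
    for doc in docs:
        counts = Counter(doc)
        doc_models.append((counts, len(doc)))
        collection.update(counts)
    coll_len = sum(collection.values())

    scores = []
    for i, (counts, dlen) in enumerate(doc_models):
        score = 0.0
        for t in query:
            if collection[t] == 0:
                continue  # term unseen anywhere in the collection; skipped in this sketch
            p_doc = counts[t] / dlen if dlen else 0.0
            p_coll = collection[t] / coll_len
            score += math.log((1 - lam) * p_doc + lam * p_coll)
        scores.append((score, i))
    return sorted(scores, reverse=True)

docs = [["information", "retrieval", "language", "model"],
        ["natural", "language", "processing", "syntax"],
        ["retrieval", "evaluation", "precision", "recall"]]
print(rank_by_query_likelihood(["language", "retrieval"], docs))
```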

  12. The importance of information goods abstraction levels for information commerce process models

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.

    2002-01-01

    A process model, in the context of e-commerce, is an organized set of activities for the creation, (re-)production, trade and delivery of goods. Electronic commerce studies have created important process models for the trade of physical goods via Internet. These models are not easily suitable for

  13. Towards a structured process modeling method: Building the prescriptive modeling theory information on submission

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2017-01-01

    In their effort to control and manage processes, organizations often create process models. The quality of such models is not always optimal, because it is challenging for a modeler to translate her mental image of the process into a formal process description. In order to support this complex human

  14. Applying Catastrophe Theory to an Information-Processing Model of Problem Solving in Science Education

    Science.gov (United States)

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2012-01-01

    In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…
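    The record truncates before naming the second control factor, so the following is only a sketch of the standard deterministic cusp form used in such analyses, with working memory capacity mapped to the asymmetry factor as stated above; the role assigned to the bifurcation factor is an assumption.

```latex
% Standard cusp catastrophe form (a sketch; the paper's exact operationalisation
% of the second control variable is truncated in the record above).
% Achievement y is an equilibrium of the potential
\[
  V(y; a, b) \;=\; \tfrac{1}{4}y^{4} \;-\; \tfrac{1}{2}\,b\,y^{2} \;-\; a\,y ,
\]
% so equilibria satisfy the cusp surface
\[
  y^{3} \;-\; b\,y \;-\; a \;=\; 0 ,
\]
% with a the asymmetry factor (here: working memory capacity) and b the
% bifurcation (splitting) factor; for b > 0 two stable sheets coexist, and small
% changes in a can produce discontinuous jumps in achievement.
```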

  15. A model of designing as the intersection between uncertainty perception, information processing, and coevolution

    DEFF Research Database (Denmark)

    Lasso, Sarah Venturim; Cash, Philip; Daalhuizen, Jaap

    2016-01-01

    …the designer's perceived uncertainty is the motivation to start a process of collecting, exchanging, and integrating knowledge. This has been formalised in Information-Processing Theory and more generally described by authors such as Aurisicchio et al. (2013), who describe design as an information… takes the first steps towards linking these disparate perspectives in a model of designing that synthesises coevolution and information processing. How designers act has been shown to play an important role in the process of New Product Development (NPD) (see e.g. Badke-Schaub and Frankenberger, 2012… transformation process. Here the aim of the activity is to reduce the perceived uncertainty through identifying and integrating external information and knowledge within the design team. For example, when perceiving uncertainty the designer might seek new information online, process this information, and share…

  16. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    Science.gov (United States)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.

  17. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    Directory of Open Access Journals (Sweden)

    O. V. Fedorova

    2018-01-01

    The article is devoted to the application of the SADT and ARIS methodologies for the modeling and management of business processes of information systems. The relevance of this article is beyond doubt, because the design of the architecture of information systems, based on a thorough system analysis of the subject area, is of paramount importance for the development of information systems in general. The authors conducted a serious analysis of the application of the SADT and ARIS methodologies for modeling and managing business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and application of the CASE-tool) and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system. The second point of view is the interaction of the business analyst and the customer. The basis of many modern methodologies for modeling business processes is the SADT methodology. Using the methodology of the IDEF family, it is possible to efficiently display and analyze the activity models of a wide range of complex information systems in various aspects. The CASE-tool ARIS is a complex of tools for the analysis and modeling of an organization's activities. The methodical basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling business processes of information systems. In addition, the article provides orientation for work on the constituent elements of curricula for students specializing in information specialties and management, and provides an update of the content and structure of disciplines on modeling the architecture of information systems and organization management using models.

  18. Application of a model of social information processing to nursing theory: how nurses respond to patients.

    Science.gov (United States)

    Sheldon, Lisa Kennedy; Ellington, Lee

    2008-11-01

    This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of how nurses respond to patients and help develop nursing theories further. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions with implications for nursing care and patient outcomes.

  19. Information Processing and Risk Perception: An Adaptation of the Heuristic-Systematic Model.

    Science.gov (United States)

    Trumbo, Craig W.

    2002-01-01

    Describes heuristic-systematic information-processing model and risk perception--the two major conceptual areas of the analysis. Discusses the proposed model, describing the context of the data collections (public health communication involving cancer epidemiology) and providing the results of a set of three replications using the proposed model.…

  20. Clinical, information and business process modeling to promote development of safe and flexible software.

    Science.gov (United States)

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  1. Conservation Process Model (cpm): a Twofold Scientific Research Scope in the Information Modelling for Cultural Heritage

    Science.gov (United States)

    Fiorani, D.; Acierno, M.

    2017-05-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, applications developed so far have adapted BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine the achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible approach, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed to decay analysis and the surfaces conservation project.

  2. CONSERVATION PROCESS MODEL (CPM): A TWOFOLD SCIENTIFIC RESEARCH SCOPE IN THE INFORMATION MODELLING FOR CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    D. Fiorani

    2017-05-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, applications developed so far have adapted BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine the achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible approach, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed to decay analysis and the surfaces conservation project.

  3. Implications of Building Information Modeling on Interior Design Education: The Impact on Teaching Design Processes

    Directory of Open Access Journals (Sweden)

    Amy Roehl, MFA

    2013-06-01

    Currently, major shifts are occurring in design processes, affecting business practices for industries involved in designing and delivering the built environment. These changing conditions are a direct result of industry adoption of relatively new technologies called BIM or Building Information Modeling. This review of literature examines the implications of these changing processes for interior design education.

  4. A system model of the processing of heterogeneous sensory information in a robotized complex

    Science.gov (United States)

    Nikolaev, V.; Titov, V.; Syryamkin, V.

    2018-05-01

    The scope and types of robotic systems consisting of subsystems of the form "heterogeneous sensor data processing subsystem" are analyzed. On the basis of queuing theory, a model is developed that takes into account the uneven intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is given to assess the relationship between subsystem performance and the unevenness of the flows. The obtained solution is studied over the range of parameter values of practical interest.
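    The analytical solution itself is not given in the record, so the sketch below only evaluates textbook M/M/1 queue measures piecewise over intervals of differing arrival intensity, to illustrate how uneven sensor flows load the processing subsystem; the rates and interval durations are invented.

```python
def mm1_metrics(lam, mu):
    """Textbook M/M/1 measures for arrival rate lam and service rate mu (lam < mu):
    utilisation rho, mean number in system L, mean time in system W."""
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = lam / mu
    L = rho / (1 - rho)          # mean number of messages in the subsystem
    W = 1 / (mu - lam)           # mean sojourn time of a message
    return rho, L, W

# Uneven sensor traffic: (duration in s, arrival rate in msg/s) per interval;
# a single processing subsystem serving mu messages per second.
profile = [(10.0, 20.0), (5.0, 45.0), (10.0, 10.0)]
mu = 50.0

weighted_W = 0.0
for duration, lam in profile:
    rho, L, W = mm1_metrics(lam, mu)
    print(f"lam={lam:5.1f}  rho={rho:.2f}  L={L:5.2f}  W={W*1000:6.1f} ms")
    weighted_W += W * (lam * duration)          # weight by arrivals in the interval

avg_W = weighted_W / sum(lam * d for d, lam in profile)
print(f"arrival-weighted mean time in system: {avg_W*1000:.1f} ms")
```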

  5. Models of neural dynamics in brain information processing - the developments of 'the decade'

    International Nuclear Information System (INIS)

    Borisyuk, G N; Borisyuk, R M; Kazanovich, Yakov B; Ivanitskii, Genrikh R

    2002-01-01

    Neural network models are discussed that have been developed during the last decade with the purpose of reproducing spatio-temporal patterns of neural activity in different brain structures. The main goal of the modeling was to test hypotheses of synchronization, temporal and phase relations in brain information processing. The models being considered are those of temporal structure of spike sequences, of neural activity dynamics, and oscillatory models of attention and feature integration. (reviews of topical problems)

  6. Extending the 4I Organizational Learning Model: Information Sources, Foraging Processes and Tools

    Directory of Open Access Journals (Sweden)

    Tracy A. Jenkin

    2013-08-01

    The continued importance of organizational learning has recently led to several calls for further developing the theory. This article addresses these calls by extending Crossan, Lane and White's (1999) 4I model to include a fifth process, information foraging, and a fourth level, the tool. The resulting 5I organizational learning model can be generalized to a number of learning contexts, especially those that involve understanding and making sense of data and information. Given the need for organizations to both innovate and increase productivity, and the volumes of data and information that are available to support both, the 5I model addresses an important organizational issue.

  7. Work and information processing in a solvable model of Maxwell's demon.

    Science.gov (United States)

    Mandal, Dibyendu; Jarzynski, Christopher

    2012-07-17

    We describe a minimal model of an autonomous Maxwell demon, a device that delivers work by rectifying thermal fluctuations while simultaneously writing information to a memory register. We solve exactly for the steady-state behavior of our model, and we construct its phase diagram. We find that our device can also act as a "Landauer eraser", using externally supplied work to remove information from the memory register. By exposing an explicit, transparent mechanism of operation, our model offers a simple paradigm for investigating the thermodynamics of information processing by small systems.

  8. [DESCRIPTION AND PRESENTATION OF THE RESULTS OF ELECTROENCEPHALOGRAM PROCESSING USING AN INFORMATION MODEL].

    Science.gov (United States)

    Myznikov, I L; Nabokov, N L; Rogovanov, D Yu; Khankevich, Yu R

    2016-01-01

    The paper proposes applying the informational modeling of correlation matrices developed by I.L. Myznikov in the early 1990s to neurophysiological investigations, such as electroencephalogram recording and analysis and the coherence description of signals from electrodes on the head surface. The authors demonstrate information models built using data from studies of inert gas inhalation by healthy human subjects. In the opinion of the authors, information models provide an opportunity to describe physiological processes with a high level of generalization. The procedure for presenting EEG results holds great promise for broad application.

  9. A neuromathematical model of human information processing and its application to science content acquisition

    Science.gov (United States)

    Anderson, O. Roger

    The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.

  10. A stream-based mathematical model for distributed information processing systems - SysLab system model

    OpenAIRE

    Klein, Cornel; Rumpe, Bernhard; Broy, Manfred

    2014-01-01

    In the SysLab project we develop a software engineering method based on a mathematical foundation. The SysLab system model serves as an abstract mathematical model for information systems and their components. It is used to formalize the semantics of all used description techniques such as object diagrams, state automata, sequence charts or data-flow diagrams. Based on the requirements for such a reference model, we define the system model including its different views and their relationships.

  11. Temporal Information Processing and Stability Analysis of the MHSN Neuron Model in DDF

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-12-01

    The implementation of a neuron-like information processing structure at the hardware level is a pressing research problem. In this article, we analyze the modified hybrid spiking neuron model (the MHSN model) in the distributed delay framework (DDF) from the point of view of hardware-level implementation. We investigate its temporal information processing capability in terms of the inter-spike-interval (ISI) distribution. We also perform a stability analysis of the MHSN model, in which we compute the nullclines, the steady-state solution and the eigenvalues corresponding to the MHSN model. During phase plane analysis, we observe that the MHSN model generates limit cycle oscillations, which are an important phenomenon in many biological processes. The qualitative behavior of these limit cycles does not change with variation in the applied input stimulus; however, delay affects the spiking activity and alters the duration of the cycle.
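    The MHSN equations are not reproduced in the record, so the sketch below carries out the same stability workflow (nullclines, fixed point, Jacobian eigenvalues, limit-cycle prediction) on the classic FitzHugh-Nagumo neuron as a stand-in; parameter values are illustrative and NumPy/SciPy are assumed to be available.

```python
import numpy as np
from scipy.optimize import brentq

# FitzHugh-Nagumo stand-in:  dv/dt = v - v**3/3 - w + I,   dw/dt = eps*(v + a - b*w)
a, b, eps, I = 0.7, 0.8, 0.08, 0.5

def v_nullcline(v):            # w as a function of v where dv/dt = 0
    return v - v**3 / 3 + I

def w_nullcline(v):            # w as a function of v where dw/dt = 0
    return (v + a) / b

# Fixed point: intersection of the two nullclines.
v_star = brentq(lambda v: v_nullcline(v) - w_nullcline(v), -3.0, 3.0)
w_star = w_nullcline(v_star)

# Jacobian of the vector field at the fixed point and its eigenvalues.
J = np.array([[1 - v_star**2, -1.0],
              [eps,           -eps * b]])
eigvals = np.linalg.eigvals(J)

print(f"fixed point: v*={v_star:.3f}, w*={w_star:.3f}")
print(f"eigenvalues: {eigvals}")
print("unstable fixed point -> limit cycle expected" if eigvals.real.max() > 0
      else "stable fixed point")
```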

  12. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    Science.gov (United States)

    Lyu, Zhe; Whitman, William B

    2017-01-01

    Current evolutionary models suggest that Eukaryotes originated from within Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be one of kind and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies on the complex information processing system. Therefore, Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  13. Neurobiological correlates of cognitions in fear and anxiety: a cognitive-neurobiological information-processing model.

    Science.gov (United States)

    Hofmann, Stefan G; Ellard, Kristen K; Siegle, Greg J

    2012-01-01

    We review likely neurobiological substrates of cognitions related to fear and anxiety. Cognitive processes are linked to abnormal early activity reflecting hypervigilance in subcortical networks involving the amygdala, hippocampus, and insular cortex, and later recruitment of cortical regulatory resources, including activation of the anterior cingulate cortex and prefrontal cortex to implement avoidant response strategies. Based on this evidence, we present a cognitive-neurobiological information-processing model of fear and anxiety, linking distinct brain structures to specific stages of information processing of perceived threat.

  14. Parametric models to relate spike train and LFP dynamics with neural information processing.

    Science.gov (United States)

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial
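    The paper's unified spike-field model is not specified in the record, so the sketch below simulates and decodes only a minimal point-process analogue: a background firing rate plus a stimulus-driven rate that switches on at an onset latency, with the latency recovered by a Poisson likelihood scan. The rates, bin size and onset value are invented, and the background and driven rates are treated as known so that only the latency is estimated.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, T = 0.001, 2.0                               # 1 ms bins, 2 s trial
t = np.arange(0.0, T, dt)

# Conditional intensity = ongoing background rate + stimulus-driven response
# that switches on at an onset latency unknown to the decoder.
background, driven, true_onset = 5.0, 40.0, 0.8  # spikes/s, spikes/s, s
rate = background + driven * (t >= true_onset)
spikes = rng.random(t.size) < rate * dt          # Bernoulli approximation per bin

# Maximum-likelihood latency estimate by scanning candidate onsets
# (Poisson log-likelihood of the binned spike train under each onset).
def log_likelihood(onset):
    lam = (background + driven * (t >= onset)) * dt
    return np.sum(spikes * np.log(lam) - lam)

candidates = np.arange(0.1, 1.9, 0.01)
estimate = candidates[np.argmax([log_likelihood(c) for c in candidates])]
print(f"true onset {true_onset:.2f} s, estimated onset {estimate:.2f} s")
```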

  15. Information Processing of Trauma.

    Science.gov (United States)

    Hartman, Carol R.; Burgess, Ann W.

    1993-01-01

    This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

  16. Quantum-like model of processing of information in the brain based on classical electromagnetic field.

    Science.gov (United States)

    Khrennikov, Andrei

    2011-09-01

    We propose a model of quantum-like (QL) processing of mental information. This model is based on quantum information theory. However, in contrast to models of "quantum physical brain" reducing mental activity (at least at the highest level) to quantum physical phenomena in the brain, our model matches well with the basic neuronal paradigm of the cognitive science. QL information processing is based (surprisingly) on classical electromagnetic signals induced by joint activity of neurons. This novel approach to quantum information is based on a representation of quantum mechanics as a version of classical signal theory which was recently elaborated by the author. The brain uses the QL representation (QLR) for working with abstract concepts; concrete images are described by classical information theory. Two processes, classical and QL, are performed in parallel. Moreover, information is actively transmitted from one representation to another. A QL concept given in our model by a density operator can generate a variety of concrete images given by temporal realizations of the corresponding (Gaussian) random signal. This signal has the covariance operator coinciding with the density operator encoding the abstract concept under consideration. The presence of various temporal scales in the brain plays the crucial role in creation of QLR in the brain. Moreover, in our model electromagnetic noise produced by neurons is a source of superstrong QL correlations between processes in different spatial domains in the brain; the binding problem is solved on the QL level, but with the aid of the classical background fluctuations. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
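
    As a simplified, real-valued illustration of the paper's central construction (concrete "images" as realizations of a Gaussian random signal whose covariance operator coincides with the density operator encoding an abstract concept), the following sketch uses a hypothetical 4-dimensional symmetric, unit-trace matrix in place of a true quantum density operator:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 4-dimensional "density operator": symmetric, positive
# semi-definite, unit trace (a real-valued stand-in for a density matrix).
A = rng.normal(size=(4, 4))
rho = A @ A.T
rho /= np.trace(rho)

# Concrete "images" as realizations of a zero-mean Gaussian random signal
# whose covariance operator coincides with the density operator rho.
realizations = rng.multivariate_normal(mean=np.zeros(4), cov=rho, size=5000)

# The empirical covariance of the realizations recovers rho up to sampling noise.
print(np.round(rho, 3))
print(np.round(np.cov(realizations, rowvar=False), 3))
```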

  17. Information Processing: A Review of Implications of Johnstone's Model for Science Education

    Science.gov (United States)

    St Clair-Thompson, Helen; Overton, Tina; Botton, Chris

    2010-01-01

    The current review is concerned with an information processing model used in science education. The purpose is to summarise the current theoretical understanding, in published research, of a number of factors that are known to influence learning and achievement. These include field independence, working memory, long-term memory, and the use of…

  18. Modeling and Security Threat Assessments of Data Processed in Cloud Based Information Systems

    Directory of Open Access Journals (Sweden)

    Darya Sergeevna Simonenkova

    2016-03-01

    Full Text Available The subject of the research is modeling and security threat assessments of data processed in cloud based information systems (CBIS). This method allows one to determine the current security threats to CBIS, the states of the system in which vulnerabilities exist, the levels of possible violators, and the relevant security properties, and to generate recommendations for neutralizing security threats to CBIS.

  19. Sustainable Manufacturing via Multi-Scale, Physics-Based Process Modeling and Manufacturing-Informed Design

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-04-01

    This factsheet describes a project that developed and demonstrated a new manufacturing-informed design framework that utilizes advanced multi-scale, physics-based process modeling to dramatically improve manufacturing productivity and quality in machining operations while reducing the cost of machined components.

  20. A Multidirectional Model for Assessing Learning Disabled Students' Intelligence: An Information-Processing Framework.

    Science.gov (United States)

    Swanson, H. Lee

    1982-01-01

    An information processing approach to the assessment of learning disabled students' intellectual performance is presented. The model is based on the assumption that intelligent behavior is comprised of a variety of problem-solving strategies. An account of child problem solving is explained and illustrated with a "thinking aloud" protocol.…

  1. The experiential health information processing model: supporting collaborative web-based patient education.

    Science.gov (United States)

    O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine

    2008-12-16

    First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes.

  2. The experiential health information processing model: supporting collaborative web-based patient education

    Science.gov (United States)

    O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine

    2008-01-01

    Background First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. Results In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. Conclusion An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes. PMID:19087353

  3. The experiential health information processing model: supporting collaborative web-based patient education

    Directory of Open Access Journals (Sweden)

    Wathen C Nadine

    2008-12-01

    Full Text Available Abstract Background First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. Results In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. Conclusion An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes.

  4. PROCESSING THE INFORMATION CONTENT ON THE BASIS OF FUZZY NEURAL MODEL OF DECISION MAKING

    Directory of Open Access Journals (Sweden)

    Nina V. Komleva

    2013-01-01

    Full Text Available The article is devoted to the mathematical modeling of the decision-making process in information content processing, based on the TSK fuzzy neural network. The integral rating assessment of the content, which is necessary for deciding on its further use, is made dependent on varying characteristics. A mechanism for building an individual trajectory and forming individual competences is provided to support intelligent content search.

  5. Dynamic information processing states revealed through neurocognitive models of object semantics

    Science.gov (United States)

    Clarke, Alex

    2015-01-01

    Recognising objects relies on highly dynamic, interactive brain networks to process multiple aspects of object information. To fully understand how different forms of information about objects are represented and processed in the brain requires a neurocognitive account of visual object recognition that combines a detailed cognitive model of semantic knowledge with a neurobiological model of visual object processing. Here we ask how specific cognitive factors are instantiated in our mental processes and how they dynamically evolve over time. We suggest that coarse semantic information, based on generic shared semantic knowledge, is rapidly extracted from visual inputs and is sufficient to drive rapid category decisions. Subsequent recurrent neural activity between the anterior temporal lobe and posterior fusiform supports the formation of object-specific semantic representations – a conjunctive process primarily driven by the perirhinal cortex. These object-specific representations require the integration of shared and distinguishing object properties and support the unique recognition of objects. We conclude that a valuable way of understanding the cognitive activity of the brain is through testing the relationship between specific cognitive measures and dynamic neural activity. This kind of approach allows us to move towards uncovering the information processing states of the brain and how they evolve over time. PMID:25745632

  6. A cascade model of information processing and encoding for retinal prosthesis.

    Science.gov (United States)

    Pei, Zhi-Jun; Gao, Guan-Xin; Hao, Bo; Qiao, Qing-Li; Ai, Hui-Jian

    2016-04-01

    Retinal prosthesis offers a potential treatment for individuals suffering from photoreceptor degeneration diseases. Establishing biological retinal models and simulating how the biological retina converts the incoming light signal into spike trains that can be properly decoded by the brain is a key issue. Some retinal models have been presented, ranging from structural models inspired by the layered architecture to functional models originating from a set of specific physiological phenomena. However, most of these focus on stimulus image compression, edge detection and reconstruction, and do not generate spike trains corresponding to the visual image. In this study, based on state-of-the-art retinal physiological mechanisms, including effective visual information extraction, static nonlinear rectification of biological systems and Poisson coding by neurons, a cascade model of the retina including the outer plexiform layer for information processing and the inner plexiform layer for information encoding was brought forward, which integrates both anatomic connections and functional computations of the retina. Using MATLAB software, spike trains corresponding to the stimulus image were numerically computed in four steps: linear spatiotemporal filtering, static nonlinear rectification, radial sampling and then Poisson spike generation. The simulated results suggested that such a cascade model could recreate the visual information processing and encoding functionalities of the retina, which is helpful in developing an artificial retina for the retinally blind.
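
    The four steps named above (linear spatiotemporal filtering, static nonlinear rectification, radial sampling, Poisson spike generation) can be illustrated with a minimal, spatial-only sketch; the filter, nonlinearity and all parameter values below are hypothetical stand-ins, not the published model:

```python
import numpy as np

rng = np.random.default_rng(2)

def gaussian_blur(img, sigma):
    """Separable 2-D Gaussian blur built from two 1-D convolutions."""
    radius = int(3 * sigma) + 1
    xs = np.arange(-radius, radius + 1)
    k = np.exp(-xs**2 / (2.0 * sigma**2))
    k /= k.sum()
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

stimulus = rng.random((64, 64))                     # toy input image

# 1) Linear filtering: centre-surround (difference-of-Gaussians) receptive field.
filtered = gaussian_blur(stimulus, 1.0) - gaussian_blur(stimulus, 3.0)

# 2) Static nonlinear rectification: sigmoidal mapping to firing rates (Hz).
r_max, gain, theta = 100.0, 10.0, 0.0
rates = r_max / (1.0 + np.exp(-gain * (filtered - theta)))

# 3) Radial sampling: read out the rates at cell positions on a polar grid.
cy, cx = 32, 32
radii = np.arange(4, 30, 4)
angles = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
iy = (cy + np.outer(radii, np.sin(angles))).astype(int)
ix = (cx + np.outer(radii, np.cos(angles))).astype(int)
sampled_rates = rates[iy, ix]                       # shape (n_radii, n_angles)

# 4) Poisson encoding: spike counts in 1 ms bins over a 200 ms presentation.
dt, n_bins = 0.001, 200
lam = np.broadcast_to(sampled_rates[..., None] * dt,
                      sampled_rates.shape + (n_bins,))
spike_trains = rng.poisson(lam)
print(spike_trains.shape, "total spikes:", spike_trains.sum())
```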

  7. Information transfer with rate-modulated Poisson processes: a simple model for nonstationary stochastic resonance.

    Science.gov (United States)

    Goychuk, I

    2001-08-01

    Stochastic resonance in a simple model of information transfer is studied for sensory neurons and ensembles of ion channels. An exact expression for the information gain is obtained for the Poisson process with the signal-modulated spiking rate. This result allows one to generalize the conventional stochastic resonance (SR) problem (with periodic input signal) to the arbitrary signals of finite duration (nonstationary SR). Moreover, in the case of a periodic signal, the rate of information gain is compared with the conventional signal-to-noise ratio. The paper establishes the general nonequivalence between both measures notwithstanding their apparent similarity in the limit of weak signals.
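
    A small numerical illustration, under assumed parameters, of the information carried by a Poisson spike count whose rate is modulated by a binary signal; this is a plug-in computation for a toy case, not the exact expression derived in the paper:

```python
from math import exp, lgamma, log, log2

def poisson_pmf(k, lam):
    """P(N = k) for a Poisson count with mean lam."""
    return exp(k * log(lam) - lam - lgamma(k + 1))

def mutual_information(rates, priors, k_max=200):
    """I(S; N) in bits for a discrete signal S (with the given priors)
    that modulates the mean of a Poisson spike count N."""
    p_n = [sum(p * poisson_pmf(k, lam) for p, lam in zip(priors, rates))
           for k in range(k_max)]
    mi = 0.0
    for p, lam in zip(priors, rates):
        for k in range(k_max):
            pk = poisson_pmf(k, lam)
            if pk > 0.0 and p_n[k] > 0.0:
                mi += p * pk * log2(pk / p_n[k])
    return mi

# Hypothetical example: an equiprobable off/on signal shifts the expected
# spike count in the observation window from 5 to 15.
print(f"information gain: {mutual_information([5.0, 15.0], [0.5, 0.5]):.3f} bits")
```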

  8. ADHD performance reflects inefficient but not impulsive information processing: a diffusion model analysis.

    Science.gov (United States)

    Metin, Baris; Roeyers, Herbert; Wiersema, Jan R; van der Meere, Jaap J; Thompson, Margaret; Sonuga-Barke, Edmund

    2013-03-01

    Attention-deficit/hyperactivity disorder (ADHD) is associated with performance deficits across a broad range of tasks. Although individual tasks are designed to tap specific cognitive functions (e.g., memory, inhibition, planning, etc.), these deficits could also reflect general effects related to either inefficient or impulsive information processing or both. These two components cannot be isolated from each other on the basis of classical analysis in which mean reaction time (RT) and mean accuracy are handled separately. Seventy children with a diagnosis of combined type ADHD and 50 healthy controls (between 6 and 17 years) performed two tasks: a simple two-choice RT (2-CRT) task and a conflict control task (CCT) that required higher levels of executive control. RT and errors were analyzed using the Ratcliff diffusion model, which divides decisional time into separate estimates of information processing efficiency (called "drift rate") and speed-accuracy tradeoff (SATO, called "boundary"). The model also provides an estimate of general nondecisional time. Results were the same for both tasks independent of executive load. ADHD was associated with lower drift rate and less nondecisional time. The groups did not differ in terms of boundary parameter estimates. RT and accuracy performance in ADHD appears to reflect inefficient rather than impulsive information processing, an effect independent of executive function load. The results are consistent with models in which basic information processing deficits make an important contribution to the ADHD cognitive phenotype. PsycINFO Database Record (c) 2013 APA, all rights reserved.
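
    A minimal simulation of the drift-diffusion decomposition used above (drift rate for processing efficiency, boundary separation for the speed-accuracy tradeoff, plus nondecision time); the parameter values are hypothetical and the sketch does not reproduce Ratcliff's fitting procedure:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_ddm(drift, boundary, nondecision, n_trials=2000,
                 noise=1.0, dt=0.001):
    """Simulate choices and reaction times from a simple drift-diffusion model.

    Evidence starts midway between 0 and `boundary` and accumulates with the
    given drift until it crosses either bound; nondecision time is then added.
    """
    rts, correct = [], []
    for _ in range(n_trials):
        x, t = boundary / 2.0, 0.0
        while 0.0 < x < boundary:
            x += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        rts.append(t + nondecision)
        correct.append(x >= boundary)
    return np.array(rts), np.array(correct)

# Lowering the drift rate (less efficient processing) while keeping the same
# boundary slows responses and lowers accuracy, mimicking the "inefficient
# but not impulsive" pattern; all parameter values here are hypothetical.
for label, drift in [("control-like", 2.0), ("ADHD-like", 1.0)]:
    rts, correct = simulate_ddm(drift=drift, boundary=1.0, nondecision=0.3)
    print(f"{label}: mean RT = {rts.mean():.3f} s, accuracy = {correct.mean():.2f}")
```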

  9. An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency.

    Science.gov (United States)

    Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, André; Wörgötter, Florentin; Tetzlaff, Christian

    2013-12-20

    Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show the expected linear relation between the antennal lobe-to-mushroom body connectivity rate and the Kenyon cell firing threshold required to obtain maximum mutual information, for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentrations. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which system efficiency would be substantially reduced.

  10. Using stochastic language models (SLM) to map lexical, syntactic, and phonological information processing in the brain.

    Science.gov (United States)

    Lopopolo, Alessandro; Frank, Stefan L; van den Bosch, Antal; Willems, Roel M

    2017-01-01

    Language comprehension involves the simultaneous processing of information at the phonological, syntactic, and lexical level. We track these three distinct streams of information in the brain by using stochastic measures derived from computational language models to detect neural correlates of phoneme, part-of-speech, and word processing in an fMRI experiment. Probabilistic language models have proven to be useful tools for studying how language is processed as a sequence of symbols unfolding in time. Conditional probabilities between sequences of words are at the basis of probabilistic measures such as surprisal and perplexity which have been successfully used as predictors of several behavioural and neural correlates of sentence processing. Here we computed perplexity from sequences of words and their parts of speech, and their phonemic transcriptions. Brain activity time-locked to each word is regressed on the three model-derived measures. We observe that the brain keeps track of the statistical structure of lexical, syntactic and phonological information in distinct areas.
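
    The word-level measures mentioned above can be illustrated with a toy bigram model; the corpus, smoothing choice and sentence below are hypothetical and much simpler than the stochastic language models used in the study:

```python
import math
from collections import Counter

# Toy corpus and sentence (hypothetical); the study used far richer models
# over words, parts of speech and phoneme sequences.
corpus = "the dog chased the cat and the cat chased the mouse".split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus[:-1], corpus[1:]))
vocab = len(unigrams)

def bigram_prob(prev, word):
    # Add-one smoothing so that unseen transitions keep non-zero probability.
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab)

sentence = "the dog chased the mouse".split()

# Surprisal of each word given its predecessor: -log2 P(w_t | w_{t-1}).
surprisals = [-math.log2(bigram_prob(prev, word))
              for prev, word in zip(sentence[:-1], sentence[1:])]
for word, s in zip(sentence[1:], surprisals):
    print(f"{word:>6s}: {s:.2f} bits")

# Perplexity of the sentence under the model: 2 ** (mean surprisal).
print("perplexity:", round(2 ** (sum(surprisals) / len(surprisals)), 2))
```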

  11. Formalize clinical processes into electronic health information systems: Modelling a screening service for diabetic retinopathy.

    Science.gov (United States)

    Eguzkiza, Aitor; Trigo, Jesús Daniel; Martínez-Espronceda, Miguel; Serrano, Luis; Andonegui, José

    2015-08-01

    Most healthcare services use information and communication technologies to reduce and redistribute the workload associated with follow-up of chronic conditions. However, the lack of normalization of the information handled in and exchanged between such services hinders the scalability and extendibility. The use of medical standards for modelling and exchanging information, especially dual-model based approaches, can enhance the features of screening services. Hence, the approach of this paper is twofold. First, this article presents a generic methodology to model patient-centered clinical processes. Second, a proof of concept of the proposed methodology was conducted within the diabetic retinopathy (DR) screening service of the Health Service of Navarre (Spain) in compliance with a specific dual-model norm (openEHR). As a result, a set of elements required for deploying a model-driven DR screening service has been established, namely: clinical concepts, archetypes, termsets, templates, guideline definition rules, and user interface definitions. This model fosters reusability, because those elements are available to be downloaded and integrated in any healthcare service, and interoperability, since from then on such services can share information seamlessly. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    Science.gov (United States)

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

    Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
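
    A sketch, under assumed cue validities and error probabilities, of a deterministic take-the-best rule and a probabilistic variant in which lower-ranked cues carry higher execution-error probabilities; this is one possible reading of such a strategy, not the authors' formal model:

```python
import random
from collections import Counter

random.seed(4)

# Cues ordered by validity, most valid first (hypothetical names and order).
cue_order = ["cue_a", "cue_b", "cue_c", "cue_d"]

def ttb(option_x, option_y):
    """Deterministic take-the-best: decide on the first cue that discriminates."""
    for cue in cue_order:
        if option_x[cue] != option_y[cue]:
            return "x" if option_x[cue] > option_y[cue] else "y"
    return random.choice(["x", "y"])          # guess if no cue discriminates

def ttb_probabilistic(option_x, option_y, error_probs=(0.05, 0.10, 0.20, 0.30)):
    """Probabilistic variant: each discriminating cue may fail to be used, with
    an error probability that grows with the cue's rank; on failure the search
    simply continues with the next cue (one possible interpretation)."""
    for cue, err in zip(cue_order, error_probs):
        if option_x[cue] != option_y[cue] and random.random() > err:
            return "x" if option_x[cue] > option_y[cue] else "y"
    return random.choice(["x", "y"])

x = {"cue_a": 1, "cue_b": 0, "cue_c": 1, "cue_d": 0}
y = {"cue_a": 1, "cue_b": 1, "cue_c": 0, "cue_d": 0}
print("deterministic choice:", ttb(x, y))
print("probabilistic choices:", Counter(ttb_probabilistic(x, y) for _ in range(1000)))
```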

  13. Model-free stochastic processes studied with q-wavelet-based informational tools

    International Nuclear Information System (INIS)

    Perez, D.G.; Zunino, L.; Martin, M.T.; Garavaglia, M.; Plastino, A.; Rosso, O.A.

    2007-01-01

    We undertake a model-free investigation of stochastic processes employing q-wavelet-based quantifiers that constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in this way, (ii) for special q values the quantifiers are more sensitive than the Shannon ones, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework

  14. Dendritic excitability modulates dendritic information processing in a purkinje cell model.

    Science.gov (United States)

    Coop, Allan D; Cornelis, Hugo; Santamaria, Fidel

    2010-01-01

    Using an electrophysiological compartmental model of a Purkinje cell we quantified the contribution of individual active dendritic currents to processing of synaptic activity from granule cells. We used mutual information as a measure to quantify the information from the total excitatory input current (I(Glu)) encoded in each dendritic current. In this context, each active current was considered an information channel. Our analyses showed that most of the information was encoded by the calcium (I(CaP)) and calcium activated potassium (I(Kc)) currents. Mutual information between I(Glu) and I(CaP) and I(Kc) was sensitive to different levels of excitatory and inhibitory synaptic activity that, at the same time, resulted in the same firing rate at the soma. Since dendritic excitability could be a mechanism to regulate information processing in neurons we quantified the changes in mutual information between I(Glu) and all Purkinje cell currents as a function of the density of dendritic Ca (g(CaP)) and Kca (g(Kc)) conductances. We extended our analysis to determine the window of temporal integration of I(Glu) by I(CaP) and I(Kc) as a function of channel density and synaptic activity. The window of information integration has a stronger dependence on increasing values of g(Kc) than on g(CaP), but at high levels of synaptic stimulation information integration is reduced to a few milliseconds. Overall, our results show that different dendritic conductances differentially encode synaptic activity and that dendritic excitability and the level of synaptic activity regulate the flow of information in dendrites.

  15. Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence

    Directory of Open Access Journals (Sweden)

    Massimo Materassi

    2014-02-01

    Full Text Available The use of transfer entropy has proven to be helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the information transfer direction. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer–Ohkitani–Yamada shell model. Indeed, this is a well-known model able to reproduce fully developed turbulence in Fourier space, which is characterized by an energy cascade towards the small scales (large wavenumbers k), so that the application of the information-theory analysis to its outcome tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in the space of wavenumbers come out as expected, indicating the validity of this data analysis tool. In this context, the use of a normalized version of transfer entropy, able to account for the difference in the intrinsic randomness of the interacting processes, appears to perform better, being able to avoid the erroneous conclusions towards which the "traditional" transfer entropy would drive.
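
    A plug-in (histogram-based) estimate of transfer entropy with history length one, applied to two coupled binary series, illustrates how the measure detects the direction of driving; the normalization discussed in the paper is not reproduced here, and all data are synthetic:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(5)

# Two coupled binary series: y is partly driven by the past of x.
n = 20000
x = rng.integers(0, 2, size=n)
y = np.empty(n, dtype=int)
y[0] = 0
for t in range(1, n):
    y[t] = x[t - 1] if rng.random() < 0.7 else rng.integers(0, 2)

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target) in bits, history length 1."""
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    pairs_yy = Counter(zip(target[1:], target[:-1]))
    pairs_ys = Counter(zip(target[:-1], source[:-1]))
    hist_y = Counter(target[:-1])
    n_obs = len(target) - 1
    te = 0.0
    for (y1, y0, x0), count in triples.items():
        p_joint = count / n_obs                         # p(y_{t+1}, y_t, x_t)
        p_cond_full = count / pairs_ys[(y0, x0)]        # p(y_{t+1} | y_t, x_t)
        p_cond_self = pairs_yy[(y1, y0)] / hist_y[y0]   # p(y_{t+1} | y_t)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

print(f"TE(x -> y) = {transfer_entropy(x, y):.3f} bits")
print(f"TE(y -> x) = {transfer_entropy(y, x):.3f} bits")
```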

  16. Processing of angular motion and gravity information through an internal model.

    Science.gov (United States)

    Laurens, Jean; Straumann, Dominik; Hess, Bernhard J M

    2010-09-01

    The vestibular organs in the base of the skull provide important information about head orientation and motion in space. Previous studies have suggested that both angular velocity information from the semicircular canals and information about head orientation and translation from the otolith organs are centrally processed in an internal model of head motion, using the principles of optimal estimation. This concept has been successfully applied to model behavioral responses to classical vestibular motion paradigms. This study measured the dynamics of the vestibulo-ocular reflex (VOR) during postrotatory tilt, tilt during the optokinetic afternystagmus, and off-vertical axis rotation. The influence of the otolith signal on the VOR was systematically varied by using a series of tilt angles. We found that the time constants of responses varied almost identically as a function of gravity in these paradigms. We show that Bayesian modeling could predict the experimental results in an accurate and consistent manner. In contrast to other approaches, the Bayesian model also provides a plausible explanation of why these vestibulo-oculomotor responses occur as a consequence of an internal process of optimal motion estimation.

  17. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    Science.gov (United States)

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the preferred reporting items for systematic reviews and meta-analyses systematic review methodology, the authors reviewed published papers between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. The IPOO-model of creative learning and the students' information processing characteristics

    Directory of Open Access Journals (Sweden)

    Katalin Mező

    2015-03-01

    Full Text Available The present study was designed to examine secondary school students' information processing characteristics during learning and their relationship with the students' academic averages, internal motivation for learning and cognitive abilities, such as intelligence and creativity. Although many studies have previously focused on this issue, we are now studying this topic from the perspective of the IPOO-model, which is a new theoretical approach to school learning (note: IPOO is an acronym of Input, Process, Output, Organizing). This study featured 815 participants (secondary school students) who completed the following tests and questionnaires: Raven's Advanced Progressive Matrices (APM) intelligence test, the "Unusual Uses" creativity test (UUT), the 2nd version of the Jupiterbolha-próba (Jupiter Flea test – JB2) to test the information processing method of learning, and the Learning Attitude Questionnaire (LAQ). In our analysis we took the gender, school grade and academic average of participants into account. According to our results, the quality of information-processing methods of learning is at a low level, and there are no significant strong correlational relationships among the tests and questionnaire results (except in the cases of fluency, originality, and flexibility). There were no significant differences between genders or classes. These findings are consistent with the findings of previous studies.

  19. Studies on Manfred Eigen's model for the self-organization of information processing.

    Science.gov (United States)

    Ebeling, W; Feistel, R

    2018-05-01

    In 1971, Manfred Eigen extended the principles of Darwinian evolution to chemical processes, from catalytic networks to the emergence of information processing at the molecular level, leading to the emergence of life. In this paper, we investigate some very general characteristics of this scenario, such as the valuation process of phenotypic traits in a high-dimensional fitness landscape, the effect of spatial compartmentation on the valuation, and the self-organized transition from structural to symbolic genetic information of replicating chain molecules. In the first part, we perform an analysis of typical dynamical properties of continuous dynamical models of evolutionary processes. In particular, we study the mapping of genotype to continuous phenotype spaces following the ideas of Wright and Conrad. We investigate typical features of a Schrödinger-like dynamics, the consequences of the high dimensionality, the leading role of saddle points, and Conrad's extra-dimensional bypass. In the last part, we discuss in brief the valuation of compartment models and the self-organized emergence of molecular symbols at the beginning of life.
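
    A minimal quasispecies-style simulation (selection proportional to fitness plus per-bit mutation on binary sequences) illustrates the valuation of traits on a fitness landscape discussed above; the landscape and all parameters are hypothetical, and the sketch does not reproduce the authors' continuous dynamical models:

```python
import numpy as np

rng = np.random.default_rng(6)

# Minimal quasispecies-style dynamics: binary sequences replicate in
# proportion to fitness and mutate with a per-bit error rate.  All values
# below (sequence length, population size, error rate, landscape) are
# hypothetical.
L, pop_size, mu, generations = 12, 2000, 0.01, 200
master = np.ones(L, dtype=int)                  # fitness peak ("master" sequence)

def fitness(seqs):
    matches = (seqs == master).sum(axis=1)
    return 1.0 + 4.0 * (matches / L) ** 4       # sharply peaked landscape

population = rng.integers(0, 2, size=(pop_size, L))
for _ in range(generations):
    w = fitness(population)
    # Selection: offspring are drawn in proportion to parental fitness.
    parents = rng.choice(pop_size, size=pop_size, p=w / w.sum())
    offspring = population[parents].copy()
    # Mutation: every bit flips independently with probability mu.
    flips = rng.random(offspring.shape) < mu
    offspring[flips] = 1 - offspring[flips]
    population = offspring

mean_matches = (population == master).sum(axis=1).mean()
print(f"mean similarity to the master sequence: {mean_matches:.1f} / {L}")
```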

  20. Logical reasoning versus information processing in the dual-strategy model of reasoning.

    Science.gov (United States)

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2017-01-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies such as mental model theory and statistical strategies underlying probabilistic models. The dual-strategy model, proposed by Verschueren, Schaeken, & d'Ydewalle (2005a, 2005b), which suggests that people might have access to both kinds of strategy, has been supported by several recent studies. These have shown that statistical reasoners make inferences based on using information about premises in order to generate a likelihood estimate of conclusion probability. However, while results concerning counterexample reasoners are consistent with a counterexample detection model, these results could equally be interpreted as indicating a greater sensitivity to logical form. In order to distinguish these 2 interpretations, in Studies 1 and 2 we presented reasoners with Modus ponens (MP) inferences with statistical information about premise strength and, in Studies 3 and 4, naturalistic MP inferences with premises having many disabling conditions. Statistical reasoners accepted the MP inference more often than counterexample reasoners in Studies 1 and 2, while the opposite pattern was observed in Studies 3 and 4. Results show that these strategies must be defined in terms of information processing, with no clear relations to "logical" reasoning. These results have additional implications for the underlying debate about the nature of human reasoning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. An information-processing model of three cortical regions: evidence in episodic memory retrieval.

    Science.gov (United States)

    Sohn, Myeong-Ho; Goode, Adam; Stenger, V Andrew; Jung, Kwan-Jin; Carter, Cameron S; Anderson, John R

    2005-03-01

    ACT-R (Anderson, J.R., et al., 2003. An information-processing model of the BOLD response in symbol manipulation tasks. Psychon. Bull. Rev. 10, 241-261) relates the inferior dorso-lateral prefrontal cortex to a retrieval buffer that holds information retrieved from memory and the posterior parietal cortex to an imaginal buffer that holds problem representations. Because the number of changes in a problem representation is not necessarily correlated with retrieval difficulties, it is possible to dissociate prefrontal-parietal activations. In two fMRI experiments, we examined this dissociation using the fan effect paradigm. Experiment 1 compared a recognition task, in which representation requirement remains the same regardless of retrieval difficulty, with a recall task, in which both representation and retrieval loads increase with retrieval difficulty. In the recognition task, the prefrontal activation revealed a fan effect but not the parietal activation. In the recall task, both regions revealed fan effects. In Experiment 2, we compared visually presented stimuli and aurally presented stimuli using the recognition task. While only the prefrontal region revealed the fan effect, the activation patterns in the prefrontal and the parietal region did not differ by stimulus presentation modality. In general, these results provide support for the prefrontal-parietal dissociation in terms of retrieval and representation and the modality-independent nature of the information processed by these regions. Using ACT-R, we also provide computational models that explain patterns of fMRI responses in these two areas during recognition and recall.

  2. Modeling Deficits From Early Auditory Information Processing to Psychosocial Functioning in Schizophrenia.

    Science.gov (United States)

    Thomas, Michael L; Green, Michael F; Hellemann, Gerhard; Sugar, Catherine A; Tarasenko, Melissa; Calkins, Monica E; Greenwood, Tiffany A; Gur, Raquel E; Gur, Ruben C; Lazzeroni, Laura C; Nuechterlein, Keith H; Radant, Allen D; Seidman, Larry J; Shiluk, Alexandra L; Siever, Larry J; Silverman, Jeremy M; Sprock, Joyce; Stone, William S; Swerdlow, Neal R; Tsuang, Debby W; Tsuang, Ming T; Turetsky, Bruce I; Braff, David L; Light, Gregory A

    2017-01-01

    Neurophysiologic measures of early auditory information processing (EAP) are used as endophenotypes in genomic studies and biomarkers in clinical intervention studies. Research in schizophrenia has established correlations among measures of EAP, cognition, clinical symptoms, and functional outcome. Clarifying these associations by determining the pathways through which deficits in EAP affect functioning would suggest when and where to therapeutically intervene. To characterize the pathways from EAP to outcome and to estimate the extent to which enhancement of basic information processing might improve cognition and psychosocial functioning in schizophrenia. Cross-sectional data were analyzed using structural equation modeling to examine the associations among EAP, cognition, negative symptoms, and functional outcome. Participants were recruited from the community at 5 geographically distributed laboratories as part of the Consortium on the Genetics of Schizophrenia 2 from July 1, 2010, through January 31, 2014. This well-characterized cohort of 1415 patients with schizophrenia underwent EAP, cognitive, and thorough clinical and functional assessment. Mismatch negativity, P3a, and reorienting negativity were used to measure EAP. Cognition was measured by the Letter Number Span test and scales from the California Verbal Learning Test-Second Edition, the Wechsler Memory Scale-Third Edition, and the Penn Computerized Neurocognitive Battery. Negative symptoms were measured by the Scale for the Assessment of Negative Symptoms. Functional outcome was measured by the Role Functioning Scale. Participants included 1415 unrelated outpatients diagnosed with schizophrenia or schizoaffective disorder (mean [SD] age, 46 [11] years; 979 males [69.2%] and 619 white [43.7%]). Early auditory information processing had a direct effect on cognition (β = 0.37, P < .001); the results were consistent with a model in which EAP deficits lead to poor functional outcome via impaired cognition and increased negative symptoms.

  3. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations using methods of processing symbolic information and information contained in a training sample (ranking of objectives by promise, a classification algorithm for tones and noise). The book will be of interest to specialists in the automation of processing textual information, programming, and pattern recognition.

  4. Emotional voices in context: a neurobiological model of multimodal affective information processing.

    Science.gov (United States)

    Brück, Carolin; Kreifelts, Benjamin; Wildgruber, Dirk

    2011-12-01

    Just as eyes are often considered a gateway to the soul, the human voice offers a window through which we gain access to our fellow human beings' minds - their attitudes, intentions and feelings. Whether in talking or singing, crying or laughing, sighing or screaming, the sheer sound of a voice communicates a wealth of information that, in turn, may serve the observant listener as a valuable guidepost in social interaction. But how do human beings extract information from the tone of a voice? In an attempt to answer this question, the present article reviews empirical evidence detailing the cerebral processes that underlie our ability to decode emotional information from vocal signals. The review will focus primarily on two prominent classes of vocal emotion cues: laughter and speech prosody (i.e. the tone of voice while speaking). Following a brief introduction, behavioral as well as neuroimaging data will be summarized that allow us to outline cerebral mechanisms associated with the decoding of emotional voice cues, as well as the influence of various context variables (e.g. co-occurring facial and verbal emotional signals, attention focus, person-specific parameters such as gender and personality) on the respective processes. Building on the presented evidence, a cerebral network model will be introduced that proposes a differential contribution of various cortical and subcortical brain structures to the processing of emotional voice signals both in isolation and in the context of accompanying (facial and verbal) emotional cues. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. Emotional voices in context: A neurobiological model of multimodal affective information processing

    Science.gov (United States)

    Brück, Carolin; Kreifelts, Benjamin; Wildgruber, Dirk

    2011-12-01

    Just as eyes are often considered a gateway to the soul, the human voice offers a window through which we gain access to our fellow human beings' minds - their attitudes, intentions and feelings. Whether in talking or singing, crying or laughing, sighing or screaming, the sheer sound of a voice communicates a wealth of information that, in turn, may serve the observant listener as a valuable guidepost in social interaction. But how do human beings extract information from the tone of a voice? In an attempt to answer this question, the present article reviews empirical evidence detailing the cerebral processes that underlie our ability to decode emotional information from vocal signals. The review will focus primarily on two prominent classes of vocal emotion cues: laughter and speech prosody (i.e. the tone of voice while speaking). Following a brief introduction, behavioral as well as neuroimaging data will be summarized that allow us to outline cerebral mechanisms associated with the decoding of emotional voice cues, as well as the influence of various context variables (e.g. co-occurring facial and verbal emotional signals, attention focus, person-specific parameters such as gender and personality) on the respective processes. Building on the presented evidence, a cerebral network model will be introduced that proposes a differential contribution of various cortical and subcortical brain structures to the processing of emotional voice signals both in isolation and in the context of accompanying (facial and verbal) emotional cues.

  6. Algorithm-structured computer arrays and networks architectures and processes for images, percepts, models, information

    CERN Document Server

    Uhr, Leonard

    1984-01-01

    Computer Science and Applied Mathematics: Algorithm-Structured Computer Arrays and Networks: Architectures and Processes for Images, Percepts, Models, Information examines the parallel-array, pipeline, and other network multi-computers.This book describes and explores arrays and networks, those built, being designed, or proposed. The problems of developing higher-level languages for systems and designing algorithm, program, data flow, and computer structure are also discussed. This text likewise describes several sequences of successively more general attempts to combine the power of arrays wi

  7. Automatic Generation of Object Models for Process Planning and Control Purposes using an International standard for Information Exchange

    Directory of Open Access Journals (Sweden)

    Petter Falkman

    2003-10-01

    Full Text Available In this paper a formal mapping between static information models and dynamic models is presented. The static information models are given according to an international standard for product, process and resource information exchange (ISO 10303-214). The dynamic models are described as Discrete Event Systems. The product, process and resource information is automatically converted into product routes and used for simulation, controller synthesis and verification. A high-level language, combining Petri nets and process algebra, is presented and used for specification of desired routes. A main implication of the presented method is that it enables the reuse of process information when creating dynamic models for process control. This method also enables simulation and verification to be conducted early in the development chain.

  8. The role of categorization and scale endpoint comparisons in numerical information processing: A two-process model.

    Science.gov (United States)

    Tao, Tao; Wyer, Robert S; Zheng, Yuhuang

    2017-03-01

    We propose a two-process conceptualization of numerical information processing to describe how people form impressions of a score that is described along a bounded scale. According to the model, people spontaneously categorize a score as high or low. Furthermore, they compare the numerical discrepancy between the score and the endpoint of the scale to which it is closer, if they are not confident of their categorization, and use implications of this comparison as a basis for judgment. As a result, their evaluation of the score is less extreme when the range of numbers along the scale is large (e.g., from 0 to 100) than when it is small (from 0 to 10). Six experiments support this two-process model and demonstrate its generalizability. Specifically, the magnitude of numbers composing the scale has less impact on judgments (a) when the score being evaluated is extreme, (b) when individuals are unmotivated to engage in endpoint comparison processes (i.e., they are low in need for cognition), and (c) when they are unable to do so (i.e., they are under cognitive load). Moreover, the endpoint to which individuals compare the score can depend on their regulatory focus. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. Pyphant – A Python Framework for Modelling Reusable Information Processing Tasks

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available We are presenting the Python framework “Pyphant” for the creation and application of information flow models. The central idea of this approach is to encapsulate each data processing step in one unit which we call a worker. A worker receives input via sockets and provides the results of its data processing via plugs. These can be connected to other workers' sockets. The resulting directed graph is called a recipe. Classes for these objects comprise the Pyphant core. To implement the actual processing steps, Pyphant relies on third-party plug-ins which extend the basic worker class and can be distributed as Python eggs. On top of the core, Pyphant offers an information exchange layer which facilitates the interoperability of the workers, using Numpy objects. A third layer comprises textual and graphical user interfaces. The former allows for the batch processing of data and the latter allows for the interactive construction of recipes.

    This paper discusses the Pyphant framework and presents an example recipe for determining the length scale of aggregated polymeric phases building an amphiphilic conetwork, from an Atomic Force Microscopy (AFM) phase-mode image.
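
    The worker/plug/socket idea described above can be illustrated with a few lines of Python; the class and method names below are hypothetical and are not the actual Pyphant API:

```python
# A minimal, hypothetical illustration of the worker/plug/socket idea described
# above; the class and method names are not the actual Pyphant API.
class Worker:
    """Encapsulates one data-processing step: inputs arrive via sockets,
    results leave via a plug that other workers can connect to."""

    def __init__(self, func, *upstream):
        self.func = func            # the processing step itself
        self.upstream = upstream    # workers connected to this worker's sockets

    def plug(self):
        """Pull data through the recipe by evaluating upstream workers first."""
        inputs = [worker.plug() for worker in self.upstream]
        return self.func(*inputs)

# A tiny recipe (directed graph of workers): load -> smooth -> threshold.
load = Worker(lambda: [1.0, 5.0, 2.0, 8.0, 3.0])
smooth = Worker(lambda xs: [(u + v) / 2 for u, v in zip(xs, xs[1:])], load)
threshold = Worker(lambda xs: [x > 3.0 for x in xs], smooth)

print(threshold.plug())     # executes the whole recipe on demand
```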

  10. Natural Information Processing Systems

    OpenAIRE

    John Sweller; Susan Sweller

    2006-01-01

    Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...

  11. Quantum information processing

    National Research Council Canada - National Science Library

    Leuchs, Gerd; Beth, Thomas

    2003-01-01

    Contents (excerpt): 1.5 Simulation of Hamiltonians; References; 2. Quantum Information Processing and Error Correction with Jump Codes (G. Alber, M. Mussinger, …)

  12. Information management in process planning

    NARCIS (Netherlands)

    Lutters, Diederick; Wijnker, T.C.; Kals, H.J.J.

    1999-01-01

    A recently proposed reference model indicates the use of structured information as the basis for the control of design and manufacturing processes. The model is used as a basis to describe the integration of design and process planning. A differentiation is made between macro- and micro process planning.

  13. The Analysis of Electronic Journal Utilization In Learning Process: Technology Acceptance Model And Information System Success

    Directory of Open Access Journals (Sweden)

    Achmad Zaky

    2017-12-01

    Full Text Available This study aims to examine electronic journal (e-journal) usage behavior among bachelor students of the Universitas Brawijaya, using the Technology Acceptance Model (TAM) and Information System Success (ISS) as its theoretical framework. The research sample consists of all bachelor students who have used e-journals in their learning process; respondents were selected by convenience sampling. The data were collected through a survey and analyzed by Partial Least Squares (PLS) with SmartPLS 3. The results reveal that user satisfaction and intention to use have a significant effect on the actual use of e-journals among bachelor students at the Universitas Brawijaya. Those variables affect actual use because they are shaped by other variables such as information quality, perceived easiness, perceived usefulness, and attitude towards behavior. Furthermore, information quality has a significant influence on user satisfaction, while perceived easiness and perceived usefulness do not have a direct effect on the intention to use. The implications of this study are relevant for educators seeking to understand the factors that drive e-journal use in the learning process.

  14. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Full Text Available Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  15. Parametric Accuracy: Building Information Modeling Process Applied to the Cultural Heritage Preservation

    Science.gov (United States)

    Garagnani, S.; Manferdini, A. M.

    2013-02-01

    Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e. self-aware of what kind of element they are and with whom they can interact), in this way representing the basics of Building Information Modeling (BIM), a coordinated, consistent and always up-to-date workflow designed to reach higher quality, reliability and cost reductions all over the design process. Even if BIM was originally intended for new architectures, its aptitude for storing semantically interrelated information can be successfully applied to existing buildings as well, especially if they deserve particular care such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships: however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology intended to process point cloud data in a BIM environment with high accuracy, this paper describes some experiences in the documentation of monumental sites, generated through a plug-in written for Autodesk Revit and codenamed GreenSpider after its capability to lay out points in space as if they were nodes of an ideal cobweb.

  16. Construction Process Simulation and Safety Analysis Based on Building Information Model and 4D Technology

    Institute of Scientific and Technical Information of China (English)

    HU Zhenzhong; ZHANG Jianping; DENG Ziyin

    2008-01-01

    Time-dependent structural analysis theory has been proved to be more accurate and reliable compared to commonly used methods during construction. However, so far applications are limited to a partial period and part of the structure because of immeasurable artificial intervention. Based on the building information model (BIM) and four-dimensional (4D) technology, this paper proposes an improved structural analysis method, which can automatically generate the structural geometry, resistance model, and loading conditions through a close interlinking of the schedule information, architectural model, and material properties. The method was applied to a safety analysis during a continuous and dynamic simulation of the entire construction process. The results show that the organic combination of the BIM, 4D technology, construction simulation, and safety analysis of time-dependent structures is feasible and practical. This research also lays a foundation for further research on building lifecycle management by combining architectural design, structural analysis, and construction management.

  17. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  18. Information-processing genes

    International Nuclear Information System (INIS)

    Tahir Shah, K.

    1995-01-01

    There are an estimated 100,000 genes in the human genome, of which 97% is non-coding. On the other hand, bacteria have little or no non-coding DNA. The non-coding region includes introns, Alu sequences, satellite DNA, and other segments not expressed as proteins. Why does it exist? Why has nature kept non-coding DNA over the long evolutionary period if it has no role in the development of complex life forms? Is the complexity of a species somehow correlated with the existence of apparently useless sequences? What kind of capability is encoded within such nucleotide sequences that is a necessary, but not sufficient, condition for the evolution of complex life forms, keeping in mind the C-value paradox and the omnipresence of non-coding segments in higher eukaryotes and also in many archaea and prokaryotes? The physico-chemical description of biological processes is hardware oriented and does not highlight the algorithmic or information-processing aspect. However, an algorithm without a hardware implementation is as useless as hardware without the capability to run an algorithm. The nature and type of computation that information-processing hardware can perform depend only on its algorithm and the architecture that reflects the algorithm. Given that enormously difficult tasks such as high-fidelity replication, transcription, editing and regulation are all achieved within a long linear sequence, it is natural to think that some parts of a genome are involved in these tasks. If some complex algorithms are encoded within these parts, then it is natural to think that non-coding regions contain information-processing algorithms. A comparison between well-known automatic sequences and sequences constructed out of motifs found in all species proves the point: non-coding regions are a sort of ''hardwired'' program, i.e., they are linear representations of information-processing machines. Thus, in our model, a non-coding region, e.g., an intron, contains a program (or equivalently, it is

  19. Non-homogeneous Markov process models with informative observations with an application to Alzheimer's disease.

    Science.gov (United States)

    Chen, Baojiang; Zhou, Xiao-Hua

    2011-05-01

    Identifying risk factors for transition rates among normal cognition, mild cognitive impairment, dementia, and death in an Alzheimer's disease study is very important. It is known that transition rates among these states are strongly time dependent. While Markov process models are often used to describe these disease progressions, the literature mainly focuses on time-homogeneous processes, and limited tools are available for dealing with non-homogeneity. Further, patients may choose when they want to visit the clinics, which creates informative observations. In this paper, we develop methods to deal with non-homogeneous Markov processes through time scale transformation when observation times are pre-planned with some observations missing. Maximum likelihood estimation via the EM algorithm is derived for parameter estimation. Simulation studies demonstrate that the proposed method works well under a variety of situations. An application to the Alzheimer's disease study identifies a significant increase in transition rates as a function of time. Furthermore, our models reveal that a non-ignorable missing mechanism is plausible. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
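
    The abstract does not reproduce the estimation details. One common way to obtain a non-homogeneous process through a time-scale transformation is to write Q(t) = Q0*h(t) and evaluate transition probabilities from the integrated intensity; the sketch below illustrates only that idea, with an invented generator, rate function, and state layout rather than the study's fitted model.

import numpy as np
from scipy.linalg import expm

# Illustrative generator for three transient states (e.g., normal, MCI, dementia)
# plus an absorbing death state; each row sums to zero.
Q0 = np.array([
    [-0.30,  0.20,  0.05,  0.05],
    [ 0.10, -0.40,  0.20,  0.10],
    [ 0.00,  0.00, -0.25,  0.25],
    [ 0.00,  0.00,  0.00,  0.00],
])

a = 0.15  # assumed exponent of the time transformation h(t) = exp(a*t)

def integrated_rate(s, t, a=a):
    """Integral of h(u) = exp(a*u) from s to t (the transformed time scale)."""
    return (np.exp(a * t) - np.exp(a * s)) / a

def transition_matrix(s, t):
    """P(s, t) = expm(Q0 * (Lambda(t) - Lambda(s))) under Q(t) = Q0 * h(t)."""
    return expm(Q0 * integrated_rate(s, t))

# Transition probabilities over the same one-year window early vs. late in follow-up:
print(np.round(transition_matrix(0.0, 1.0), 3))
print(np.round(transition_matrix(5.0, 6.0), 3))  # rates increase with time when a > 0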

  20. A neural network model of normal and abnormal auditory information processing.

    Science.gov (United States)

    Du, X; Jansen, B H

    2011-08-01

    The ability of the brain to attenuate the response to irrelevant sensory stimulation is referred to as sensory gating. A gating deficiency has been reported in schizophrenia. To study the neural mechanisms underlying sensory gating, a neuroanatomically inspired model of auditory information processing has been developed. The mathematical model consists of lumped parameter modules representing the thalamus (TH), the thalamic reticular nucleus (TRN), auditory cortex (AC), and prefrontal cortex (PC). It was found that the membrane potential of the pyramidal cells in the PC module replicated auditory evoked potentials, recorded from the scalp of healthy individuals, in response to pure tones. Also, the model produced substantial attenuation of the response to the second of a pair of identical stimuli, just as seen in actual human experiments. We also tested the viewpoint that schizophrenia is associated with a deficit in prefrontal dopamine (DA) activity, which would lower the excitatory and inhibitory feedback gains in the AC and PC modules. Lowering these gains by less than 10% resulted in model behavior resembling the brain activity seen in schizophrenia patients, and replicated the reported gating deficits. The model suggests that the TRN plays a critical role in sensory gating, with the smaller response to a second tone arising from a reduction in inhibition of TH by the TRN. Copyright © 2011 Elsevier Ltd. All rights reserved.
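
    The full TH-TRN-AC-PC circuit is not reproduced in the abstract. As a flavour of the lumped-parameter ("neural mass") building block such models are typically assembled from, here is a single-module sketch combining a sigmoid pulse-to-wave conversion with second-order synaptic dynamics; all constants are illustrative and not taken from the paper.

import numpy as np

def sigmoid(v, e0=2.5, r=0.56, v0=6.0):
    """Pulse density (spikes/s) as a function of mean membrane potential (mV)."""
    return 2 * e0 / (1 + np.exp(r * (v0 - v)))

def simulate(T=0.5, dt=1e-4, A=3.25, a=100.0):
    """Second-order synaptic dynamics: y'' = A*a*p(t) - 2*a*y' - a^2*y,
    where p(t) is the module's own output fed back through the sigmoid
    plus an external input pulse (a brief "tone")."""
    n = int(T / dt)
    y = np.zeros(n); z = np.zeros(n)          # potential and its derivative
    for k in range(n - 1):
        t = k * dt
        pulse = 300.0 if 0.10 <= t < 0.11 else 10.0   # assumed input profile
        drive = sigmoid(y[k]) + pulse
        z[k + 1] = z[k] + dt * (A * a * drive - 2 * a * z[k] - a * a * y[k])
        y[k + 1] = y[k] + dt * z[k + 1]
    return y

evoked = simulate()
print("peak of the simulated evoked response (mV):", round(float(evoked.max()), 2))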

  1. Information flow security for business process models - just one click away

    NARCIS (Netherlands)

    Lehmann, A.; Fahland, D.; Lohmann, N.; Moser, S.

    2012-01-01

    When outsourcing tasks of a business process to a third party, information flow security becomes a critical issue. In particular, implicit information leaks are an intriguing problem. Given a business process, one could ask whether the execution of a confidential task is kept secret from a third party

  2. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    The paper systematizes several theoretical viewpoints on the scientific information processing skill and decomposes this skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction, and document analysis, were used to build up a theoretical framework. Interviews and surveys of professionals in training and a case study were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  3. Quantum Information Processing

    CERN Document Server

    Leuchs, Gerd

    2005-01-01

    Quantum information processing and communication are emerging as challenging techniques at the beginning of the new millennium. This is an up-to-date insight into the current research of quantum superposition, entanglement, and the quantum measurement process - the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions - written by leading experts - bring together the latest results in quantum information as well as addressing all the relevant questions.

  4. Models of neural networks temporal aspects of coding and information processing in biological systems

    CERN Document Server

    Hemmen, J; Schulten, Klaus

    1994-01-01

    Since the appearance of Vol. 1 of Models of Neural Networks in 1991, the theory of neural nets has focused on two paradigms: information coding through coherent firing of the neurons and functional feedback. Information coding through coherent neuronal firing exploits time as a cardinal degree of freedom. This capacity of a neural network rests on the fact that the neuronal action potential is a short, say 1 ms, spike, localized in space and time. Spatial as well as temporal correlations of activity may represent different states of a network. In particular, temporal correlations of activity may express that neurons process the same "object" of, for example, a visual scene by spiking at the very same time. The traditional description of a neural network through a firing rate, the famous S-shaped curve, presupposes a wide time window of, say, at least 100 ms. It thus fails to exploit the capacity to "bind" sets of coherently firing neurons for the purpose of both scene segmentation and figure-ground segregatio...
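
    As a toy illustration of the coding idea discussed above (coherent spiking versus a slow firing-rate description), the following sketch simulates a small population of leaky integrate-and-fire neurons that receive a brief shared input; all parameters are arbitrary.

import numpy as np

rng = np.random.default_rng(1)
n, dt, T = 50, 1e-4, 0.3                  # neurons, time step (s), duration (s)
tau, v_th, v_reset = 0.02, 1.0, 0.0       # membrane time constant, threshold, reset
steps = int(T / dt)
v = rng.uniform(0.0, 0.8, n)              # random initial membrane potentials
spikes = []

for k in range(steps):
    t = k * dt
    shared = 1.6 if 0.10 <= t < 0.12 else 0.9   # common "object" input vs. background
    noise = rng.normal(0.0, 0.3, n)
    v += dt / tau * (-v + shared + noise)
    fired = v >= v_th
    spikes.extend((t, i) for i in np.where(fired)[0])
    v[fired] = v_reset

# Spikes cluster tightly in time during the shared transient: "binding by synchrony"
in_window = sum(1 for t, _ in spikes if 0.10 <= t < 0.12)
print(f"{len(spikes)} spikes total, {in_window} within the 20 ms shared-input window")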

  5. Mass media in health promotion: an analysis using an extended information-processing model.

    Science.gov (United States)

    Flay, B R; DiTecco, D; Schlegel, R P

    1980-01-01

    The information-processing model of the attitude and behavior change process was critically examined and extended from six to 12 levels for a better analysis of change due to mass media campaigns. Findings from social psychology and communications research, and from evaluations of mass media health promotion programs, were reviewed to determine how source, message, channel, receiver, and destination variables affect each of the levels of change of major interest (knowledge, beliefs, attitudes, intentions, and behavior). Factors found most likely to induce permanent attitude and behavior change (most important in health promotion) were: presentation and repetition over long time periods, at different times (including "prime" or high-exposure times), by multiple sources, in novel and involving ways, with appeals to multiple motives, development of social support, and provision of appropriate behavioral skills, alternatives, and reinforcement (preferably in ways that get the active participation of the audience). Suggestions for evaluation of mass media programs that take account of this complexity were advanced.

  6. The Practice of Information Processing Model in the Teaching of Cognitive Strategies

    Science.gov (United States)

    Ozel, Ali

    2009-01-01

    This research seeks to determine how the teaching of learning strategies differs with the amount of time that first-grade primary school teachers spend helping students form an information-processing framework. The process, which draws on the efforts of 260 teachers in this direction, consists of whether the adequate…

  7. Aggression and Moral Development: Integrating Social Information Processing and Moral Domain Models

    Science.gov (United States)

    Arsenio, William F.; Lemerise, Elizabeth A.

    2004-01-01

    Social information processing and moral domain theories have developed in relative isolation from each other despite their common focus on intentional harm and victimization, and mutual emphasis on social cognitive processes in explaining aggressive, morally relevant behaviors. This article presents a selective summary of these literatures with…

  8. Using a logical information model-driven design process in healthcare.

    Science.gov (United States)

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  9. Development of Energy Models for Production Systems and Processes to Inform Environmentally Benign Decision-Making

    Science.gov (United States)

    Diaz-Elsayed, Nancy

    Between 2008 and 2035 global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy intensive" sectors still consumed 4×10⁹ GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but having a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the
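
    The dissertation's equipment-level model is not given in the abstract. A common simplified form for machine-tool energy is a constant (idle) power plus a term proportional to the material removal rate, integrated over the machining cycle; the sketch below assumes that form, with placeholder coefficients.

import numpy as np

def machining_energy(t, mrr, p_idle=1200.0, k_specific=2.5):
    """Energy (J) = integral of (P_idle + k * MRR(t)) dt.

    t          : time stamps in seconds
    mrr        : material removal rate profile in mm^3/s
    p_idle     : constant machine-tool power draw in W (assumed)
    k_specific : specific cutting energy in J/mm^3 (assumed)
    """
    power = p_idle + k_specific * np.asarray(mrr)
    return np.trapz(power, t)

# Variable-MRR profile: ramp up, steady cut, ramp down (illustrative)
t = np.linspace(0.0, 120.0, 601)
mrr = np.interp(t, [0, 20, 100, 120], [0, 400, 400, 0])
print(f"estimated cycle energy: {machining_energy(t, mrr) / 1000:.1f} kJ")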

  10. Comprehensive process model of clinical information interaction in primary care: results of a "best-fit" framework synthesis.

    Science.gov (United States)

    Veinot, Tiffany C; Senteio, Charles R; Hanauer, David; Lowery, Julie C

    2018-06-01

    To describe a new, comprehensive process model of clinical information interaction in primary care (Clinical Information Interaction Model, or CIIM) based on a systematic synthesis of published research. We used the "best fit" framework synthesis approach. Searches were performed in PubMed, Embase, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, Library and Information Science Abstracts, Library, Information Science and Technology Abstracts, and Engineering Village. Two authors reviewed articles according to inclusion and exclusion criteria. Data abstraction and content analysis of 443 published papers were used to create a model in which every element was supported by empirical research. The CIIM documents how primary care clinicians interact with information as they make point-of-care clinical decisions. The model highlights 3 major process components: (1) context, (2) activity (usual and contingent), and (3) influence. Usual activities include information processing, source-user interaction, information evaluation, selection of information, information use, clinical reasoning, and clinical decisions. Clinician characteristics, patient behaviors, and other professionals influence the process. The CIIM depicts the complete process of information interaction, enabling a grasp of relationships previously difficult to discern. The CIIM suggests potentially helpful functionality for clinical decision support systems (CDSSs) to support primary care, including a greater focus on information processing and use. The CIIM also documents the role of influence in clinical information interaction; influencers may affect the success of CDSS implementations. The CIIM offers a new framework for achieving CDSS workflow integration and new directions for CDSS design that can support the work of diverse primary care clinicians.

  11. Mathematical model of information process of protection of the social sector

    Science.gov (United States)

    Novikov, D. A.; Tsarkova, E. G.; Dubrovin, A. S.; Soloviev, A. S.

    2018-03-01

    This work investigates a mathematical model of protecting society against the spread of extremist moods through the influence on mass consciousness of information placed in the media. The internal and external channels through which information is disseminated are identified. An optimization problem is solved that seeks the optimal strategy for using the media most effectively to disseminate antiterrorist information with minimal financial expense. An algorithm for a numerical method of solving the optimization problem is constructed, and the results of a computational experiment are analyzed.
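
    The abstract does not state the exact objective function. A hedged sketch of one plausible formulation, minimizing spending subject to reaching required audience coverage on internal and external channels, can be written as a linear program; every coefficient below is invented for illustration.

import numpy as np
from scipy.optimize import linprog

# Decision variables: number of placements in each of four media channels
cost = np.array([120.0, 300.0, 80.0, 50.0])          # cost per placement (assumed)

# Audience reached per placement, split into "internal" and "external" channels
reach_internal = np.array([2000.0, 9000.0,  500.0,    0.0])
reach_external = np.array([ 500.0, 4000.0, 1500.0, 3000.0])

# Require at least these total exposures on each channel type
required = np.array([60000.0, 40000.0])

# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so flip the >= constraints
A_ub = -np.vstack([reach_internal, reach_external])
b_ub = -required

res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4, method="highs")
print("placements per channel:", np.round(res.x, 1))
print("minimum total cost:", round(res.fun, 1))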

  12. The role of information in a lifetime process - a model of weight maintenance by women over long time periods

    Directory of Open Access Journals (Sweden)

    Judit Bar-Ilan

    2006-01-01

    Introduction. This paper proposes a model of the information behaviour of women during their life-long struggle to maintain normal weight. Method. The model is integrative and contextual, built on existing models in information science and several other disciplines, the life stories of about fifty Israeli women aged 25-55, and interviews with professionals. Analysis. The life stories of the participating women were analyzed qualitatively, and major themes and phases were identified. Results. Weight loss and/or maintenance behaviour is a lifetime process in which distinctive stages were identified. In most cases the weight gain - weight loss - maintenance cycle is a recurring cycle. Information is a major resource during the process, and several roles of information were defined: enabling, motivating, reinforcing, providing background information related to weight problems, and creating the internal cognitive schema related to food and weight. Information behaviour and the roles of information vary with the different stages. Information needs are also influenced by the specific stage of the process. Information gathered in previous cycles is reused, and information gained through previous experience affects behaviour in the current cycle. Conclusion. The model has both theoretical and practical implications.

  13. PREFACE: Quantum information processing

    Science.gov (United States)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  14. Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE

    Science.gov (United States)

    Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.

    2016-08-01

    Introduction - Business processes are gradually becoming a tool that allows employees to be deployed at a new level and document management systems to be made more efficient. Most of the work, and the largest number of publications, lies in these directions. However, business processes are still poorly implemented in public institutions, where it is very difficult to formalize the main existing processes. We attempt to build a system of business processes for a state agency such as the Russian Social Insurance Fund (SIF), where virtually all of the processes, given different inputs, have the same output: a public service. The parameters of the state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the SIF, the formulation of requirements for business processes, the justification of the choice of software for modeling business processes, the creation of working models in the Runa WFE system, and the optimization of a model of one of the main business processes of the SIF. The result of the work in Runa WFE is an optimized model of an SIF business process.

  15. Physics as Information Processing

    International Nuclear Information System (INIS)

    D'Ariano, Giacomo Mauro

    2011-01-01

    I review some recent advances in foundational research at the Pavia QUIT group. The general idea is that there is only Quantum Theory without quantization rules, and the whole of Physics - including space-time and relativity - is emergent from quantum information processing. And since Quantum Theory itself is axiomatized solely on informational principles, the whole of Physics must be reformulated in information-theoretical terms: this is the "It from bit" of J. A. Wheeler. The review is divided into four parts: a) the informational axiomatization of Quantum Theory; b) how space-time and relativistic covariance emerge from quantum computation; c) the information-theoretical meaning of inertial mass and of ħ, and how the quantum field emerges; d) an observational consequence of the new quantum field theory: a mass-dependent refraction index of the vacuum. I will conclude with the research lines that will follow in the immediate future.

  16. Driver's various information process and multi-ruled decision-making mechanism: a fundamental of intelligent driving shaping model

    Directory of Open Access Journals (Sweden)

    Wuhong Wang

    2011-05-01

    The most difficult but important problem in advanced driver assistance system development is how to measure and model the behavioral response of drivers with a focus on the cognition process. This paper describes drivers' deceleration and acceleration behavior based on driving situation awareness in the car-following process, and then presents several driving models for the analysis of drivers' safe approaching behavior in traffic operation. The emphasis of our work is placed on research into drivers' various information processes and multi-ruled decision-making mechanisms, considering the complicated control process of driving; the results will provide a theoretical basis for an intelligent driving shaping model.
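
    The paper's own driving models are not reproduced in the abstract. For the deceleration/acceleration behaviour in car-following, a standard reference point is the Intelligent Driver Model, sketched below as a stand-in (it is not the authors' formulation, and the parameter values are typical textbook choices).

import numpy as np

def idm_acceleration(v, gap, dv, v0=30.0, T=1.5, a_max=1.0, b=2.0, s0=2.0):
    """Intelligent Driver Model: acceleration of the following vehicle.

    v   : follower speed (m/s), gap : bumper-to-bumper distance (m),
    dv  : approach rate v_follower - v_leader (m/s); remaining parameters
    (desired speed, headway, accelerations) are typical textbook values.
    """
    s_star = s0 + v * T + v * dv / (2.0 * np.sqrt(a_max * b))
    return a_max * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)

# Leader brakes from 25 m/s to 15 m/s; the follower reacts through the model
dt, steps = 0.1, 400
x_lead, v_lead = 50.0, 25.0
x, v = 0.0, 25.0
for k in range(steps):
    v_lead = max(15.0, v_lead - 1.5 * dt) if k > 50 else v_lead
    x_lead += v_lead * dt
    acc = idm_acceleration(v, gap=x_lead - x - 5.0, dv=v - v_lead)
    v = max(0.0, v + acc * dt)
    x += v * dt
print(f"final follower speed {v:.1f} m/s, final gap {x_lead - x - 5.0:.1f} m")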

  17. Risk Information Seeking among U.S. and Dutch Residents. An Application of the model of Risk Information Seeking and Processing

    NARCIS (Netherlands)

    ter Huurne, E.F.J.; Griffin, Robert J.; Gutteling, Jan M.

    2009-01-01

    The model of risk information seeking and processing (RISP) proposes characteristics of individuals that might predispose them to seek risk information. The intent of this study is to test the model’s robustness across two independent samples in different nations. Based on data from the United

  18. A business process modeling experience in a complex information system re-engineering.

    Science.gov (United States)

    Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis

    2013-01-01

    This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.

  19. Information services and information processing

    Science.gov (United States)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  20. Mindfulness Training Alters Emotional Memory Recall Compared to Active Controls: Support for an Emotional Information Processing Model of Mindfulness

    OpenAIRE

    Roberts-Wolfe, Douglas; Sacchet, Matthew D.; Hastings, Elizabeth; Roth, Harold; Britton, Willoughby

    2012-01-01

    Objectives: While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e., m...

  1. The Relevance of the Social Information Processing Model for Understanding Relational Aggression in Girls

    Science.gov (United States)

    Crain, Marcelle M.; Finch, Cambra L.; Foster, Sharon L.

    2005-01-01

    Two studies examined whether social information-processing variables predict relational aggression in girls. In Study 1, fourth- through sixth-grade girls reported their intent attributions, social goals, outcome expectancies for relational aggression, and the likelihood that they would choose a relationally aggressive response in response to…

  2. Bridging the Operational Divide: An Information-Processing Model of Internal Supply Chain Integration

    Science.gov (United States)

    Rosado Feger, Ana L.

    2009-01-01

    Supply Chain Management, the coordination of upstream and downstream flows of product, services, finances, and information from a source to a customer, has risen in prominence over the past fifteen years. The delivery of a product to the consumer is a complex process requiring action from several independent entities. An individual firm consists…

  3. Models of neural dynamics in brain information processing - the developments of 'the decade'

    Energy Technology Data Exchange (ETDEWEB)

    Borisyuk, G N; Borisyuk, R M; Kazanovich, Yakov B [Institute of Mathematical Problems of Biology, Russian Academy of Sciences, Pushchino, Moscow region (Russian Federation); Ivanitskii, Genrikh R [Institute for Theoretical and Experimental Biophysics, Russian Academy of Sciences, Pushchino, Moscow region (Russian Federation)

    2002-10-31

    Neural network models are discussed that have been developed during the last decade with the purpose of reproducing spatio-temporal patterns of neural activity in different brain structures. The main goal of the modeling was to test hypotheses of synchronization, temporal and phase relations in brain information processing. The models being considered are those of temporal structure of spike sequences, of neural activity dynamics, and oscillatory models of attention and feature integration. (reviews of topical problems)
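
    As a compact illustration of the synchronization hypotheses such models are used to test, here is a Kuramoto phase-oscillator sketch; it is not one of the reviewed models, and the frequencies and coupling strengths are arbitrary.

import numpy as np

rng = np.random.default_rng(2)
n, dt, steps = 100, 0.01, 2000
omega = rng.normal(10.0, 0.5, n)          # intrinsic frequencies (rad/s)
theta = rng.uniform(0, 2 * np.pi, n)      # initial phases

def order_parameter(theta):
    """|r| = 1 means perfect phase synchrony, values near 0 mean incoherence."""
    return abs(np.mean(np.exp(1j * theta)))

for coupling in (0.1, 2.0):               # weak vs. strong coupling
    th = theta.copy()
    for _ in range(steps):
        mean_field = np.mean(np.exp(1j * th))
        # Mean-field Kuramoto update: dtheta_i/dt = omega_i + K*r*sin(psi - theta_i)
        th += dt * (omega + coupling * np.abs(mean_field)
                    * np.sin(np.angle(mean_field) - th))
    print(f"K={coupling}: order parameter r = {order_parameter(th):.2f}")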

  4. Do Social Information-Processing Models Explain Aggressive Behaviour by Children with Mild Intellectual Disabilities in Residential Care?

    Science.gov (United States)

    van Nieuwenhuijzen, M.; de Castro, B. O.; van der Valk, I.; Wijnroks, L.; Vermeer, A.; Matthys, W.

    2006-01-01

    Background: This study aimed to examine whether the social information-processing model (SIP model) applies to aggressive behaviour by children with mild intellectual disabilities (MID). The response-decision element of SIP was expected to be unnecessary to explain aggressive behaviour in these children, and SIP was expected to mediate the…

  5. Toward a Model of Human Information Processing for Decision-Making and Skill Acquisition in Laparoscopic Colorectal Surgery.

    Science.gov (United States)

    White, Eoin J; McMahon, Muireann; Walsh, Michael T; Coffey, J Calvin; O Sullivan, Leonard

    To create a human information-processing model for laparoscopic surgery based on already established literature and primary research to enhance laparoscopic surgical education in this context. We reviewed the literature for information-processing models most relevant to laparoscopic surgery. Our review highlighted the necessity for a model that accounts for dynamic environments, perception, allocation of attention resources between the actions of both hands of an operator, and skill acquisition and retention. The results of the literature review were augmented through intraoperative observations of 7 colorectal surgical procedures, supported by laparoscopic video analysis of 12 colorectal procedures. The Wickens human information-processing model was selected as the most relevant theoretical model to which we make adaptions for this specific application. We expanded the perception subsystem of the model to involve all aspects of perception during laparoscopic surgery. We extended the decision-making system to include dynamic decision-making to account for case/patient-specific and surgeon-specific deviations. The response subsystem now includes dual-task performance and nontechnical skills, such as intraoperative communication. The memory subsystem is expanded to include skill acquisition and retention. Surgical decision-making during laparoscopic surgery is the result of a highly complex series of processes influenced not only by the operator's knowledge, but also patient anatomy and interaction with the surgical team. Newer developments in simulation-based education must focus on the theoretically supported elements and events that underpin skill acquisition and affect the cognitive abilities of novice surgeons. The proposed human information-processing model builds on established literature regarding information processing, accounting for a dynamic environment of laparoscopic surgery. This revised model may be used as a foundation for a model describing robotic

  6. Is there room for 'development' in developmental models of information processing biases to threat in children and adolescents?

    Science.gov (United States)

    Field, Andy P; Lester, Kathryn J

    2010-12-01

    Clinical and experimental theories assume that processing biases in attention and interpretation are a causal mechanism through which anxiety develops. Despite growing evidence that these processing biases are present in children and, therefore, develop long before adulthood, these theories ignore the potential role of child development. This review attempts to place information processing biases within a theoretical developmental framework. We consider whether child development has no impact on information processing biases to threat (integral bias model), or whether child development influences information processing biases and if so whether it does so by moderating the expression of an existing bias (moderation model) or by affecting the acquisition of a bias (acquisition model). We examine the extent to which these models fit with existing theory and research evidence and outline some methodological issues that need to be considered when drawing conclusions about the potential role of child development in the information processing of threat stimuli. Finally, we speculate about the developmental processes that might be important to consider in future research.

  7. Introduction to information processing

    CERN Document Server

    Dietel, Harvey M

    2014-01-01

    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers for the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte

  8. The dynamic information architecture system : a simulation framework to provide interoperability for process models

    International Nuclear Information System (INIS)

    Hummel, J. R.; Christiansen, J. H.

    2002-01-01

    As modeling and simulation becomes a more important part of the day-to-day activities in industry and government, organizations are being faced with the vexing problem of how to integrate a growing suite of heterogeneous models both within their own organizations and between organizations. The Argonne National Laboratory, which is operated by the University of Chicago for the United States Department of Energy, has developed the Dynamic Information Architecture System (DIAS) to address such problems. DIAS is an object-oriented, subject domain independent framework that is used to integrate legacy or custom-built models and applications. In this paper we will give an overview of the features of DIAS and give examples of how it has been used to integrate models in a number of applications. We shall also describe some of the key supporting DIAS tools that provide seamless interoperability between models and applications

  9. Cognitive Theory within the Framework of an Information Processing Model and Learning Hierarchy: Viable Alternative to the Bloom-Mager System.

    Science.gov (United States)

    Stahl, Robert J.

    This review of the current status of the human information processing model presents the Stahl Perceptual Information Processing and Operations Model (SPInPrOM) as a model of how thinking, memory, and the processing of information take place within the individual learner. A related system, the Domain of Cognition, is presented as an alternative to…

  10. Testing for a Common Volatility Process and Information Spillovers in Bivariate Financial Time Series Models

    NARCIS (Netherlands)

    J. Chen (Jinghui); M. Kobayashi (Masahito); M.J. McAleer (Michael)

    2016-01-01

    textabstractThe paper considers the problem as to whether financial returns have a common volatility process in the framework of stochastic volatility models that were suggested by Harvey et al. (1994). We propose a stochastic volatility version of the ARCH test proposed by Engle and Susmel (1993),

  11. The Elaboration Likelihood Model and Proxemic Violations as Peripheral Cues to Information Processing.

    Science.gov (United States)

    Eaves, Michael

    This paper provides a literature review of the elaboration likelihood model (ELM) as applied in persuasion. Specifically, the paper addresses distraction with regard to effects on persuasion. In addition, the application of proxemic violations as peripheral cues in message processing is discussed. Finally, the paper proposes to shed new light on…

  12. Display analysis with the optimal control model of the human operator. [pilot-vehicle display interface and information processing

    Science.gov (United States)

    Baron, S.; Levison, W. H.

    1977-01-01

    Application of the optimal control model of the human operator to problems in display analysis is discussed. Those aspects of the model pertaining to the operator-display interface and to operator information processing are reviewed and discussed. The techniques are then applied to the analysis of advanced display/control systems for a Terminal Configured Vehicle. Model results are compared with those obtained in a large, fixed-base simulation.
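
    The optimal control model combines state estimation with an optimal (LQ) control law plus operator delay and noise terms; the abstract gives no equations. The sketch below shows only the LQ core for an illustrative double-integrator plant, with assumed weights, and omits the estimator and delay stages.

import numpy as np
from scipy.linalg import solve_continuous_are

# Double-integrator plant (e.g., displayed error and error rate)
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])

Q = np.diag([10.0, 1.0])   # penalty on tracking error and error rate (assumed)
R = np.array([[0.1]])      # penalty on control effort (assumed)

# Solve the continuous-time algebraic Riccati equation and form the feedback gain
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print("optimal feedback gains:", np.round(K, 3))

# Closed-loop check: eigenvalues of A - B K should have negative real parts
print("closed-loop poles:", np.round(np.linalg.eigvals(A - B @ K), 3))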

  13. Applying an Information Processing Model to Measure the Effectiveness of a Mailed Circular Advertisement.

    Science.gov (United States)

    1982-01-01

    information applicable to marketing and advertising strategy which could be employed by the Navy Resale System. ...considered a good strategy if the consumer has passed through the lower stages in the model. Some advertising campaigns may be targeted primarily at...behavior. In the marketing sense, the final behavior would be the purchase of the advertised product. The behavior will materialize only if the chain

  14. Modeling the Process of Purchase Payment as a Constituent of Information Security in E-commerce

    Directory of Open Access Journals (Sweden)

    Volodymyr Skitsko

    2016-01-01

    A mathematical model of the process of payment for purchases in an online store is presented. The model belongs to the class of semi-open queueing networks with four phases of exponential servers and Poisson arrivals. The authors describe in detail the derivation of the equations describing the system. On the basis of the proposed model, analytic expressions are derived for the average number of online store customers who have already paid for goods. Practical implementation of the model allows us to determine the number of clients who have added goods to their cart but have not yet passed through the payment verification system, and thus to determine the stream of real customers of the online store. (original abstract)
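
    The article's semi-open network is not fully specified in the abstract. As a rough, back-of-the-envelope stand-in, an open tandem of four exponential servers with Poisson arrivals behaves, station by station, like independent M/M/1 queues, so the mean number of customers in the payment pipeline can be sketched as below; the arrival and service rates are placeholders.

# Mean number of customers in an open tandem of M/M/1 stations (Jackson/Burke result):
# each station i with utilisation rho_i = lam / mu_i contributes rho_i / (1 - rho_i).

lam = 2.0                          # customer arrivals per minute (assumed)
mu = [3.0, 4.0, 2.5, 5.0]          # service rates of the four phases (assumed)

def mean_customers(lam, mu):
    total = 0.0
    for m in mu:
        rho = lam / m
        assert rho < 1.0, "station would be unstable"
        total += rho / (1.0 - rho)
    return total

L = mean_customers(lam, mu)
W = L / lam                        # Little's law: mean time in the payment pipeline
print(f"mean customers in the payment process: {L:.2f}, mean sojourn: {W:.2f} min")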

  15. Photonic Quantum Information Processing

    International Nuclear Information System (INIS)

    Walther, P.

    2012-01-01

    The advantage of the photon's mobility makes optical quantum systems ideally suited for delegated quantum computation. I will present results on the realization of a measurement-based quantum network in a client-server environment, where quantum information is securely communicated and computed. Related to measurement-based quantum computing, I will discuss a recent experiment showing that quantum discord can be used as a resource for remote state preparation, which might shine new light on the requirements for quantum-enhanced information processing. Finally, I will briefly review recent photonic quantum simulation experiments of four frustrated Heisenberg-interacting spins and present an outlook of feasible simulation experiments with more complex interactions or random walk structures. As an outlook I will discuss the current status of new quantum technology for improving the scalability of photonic quantum systems by using superconducting single-photon detectors and tailored light-matter interactions. (author)

  16. Antecedent characteristics of online cancer information seeking among rural breast cancer patients: an application of the Cognitive-Social Health Information Processing (C-SHIP) model.

    Science.gov (United States)

    Shaw, Bret R; Dubenske, Lori L; Han, Jeong Yeob; Cofta-Woerpel, Ludmila; Bush, Nigel; Gustafson, David H; McTavish, Fiona

    2008-06-01

    Little research has examined the antecedent characteristics of patients most likely to seek online cancer information. This study employs the Cognitive-Social Health Information Processing (C-SHIP) model as a framework to understand what psychosocial characteristics precede online cancer-related information seeking among rural breast cancer patients who often have fewer health care providers and limited local support services. Examining 144 patients who were provided free computer hardware, Internet access, and training for how to use an interactive cancer communication system, pretest survey scores indicating patients' psychosocial status were correlated with specific online cancer information seeking behaviors. Each of the factors specified by the C-SHIP model had significant relationships with online cancer information seeking behaviors, with the strongest findings emerging for cancer-relevant encodings and self-construals, cancer-relevant beliefs and expectancies, and cancer-relevant self-regulatory competencies and skills. Specifically, patients with more negative appraisals in these domains were more likely to seek out online cancer information. Additionally, antecedent variables associated with the C-SHIP model had more frequent relationships with experiential information as compared with didactic information. This study supports the applicability of the model to discern why people afflicted with cancer may seek online information to cope with their disease.

  17. Antecedent Characteristics of Online Cancer Information Seeking Among Rural Breast Cancer Patients: An Application of the Cognitive-Social Health Information Processing (C-SHIP) Model

    Science.gov (United States)

    Shaw, Bret R.; DuBenske, Lori L.; Han, Jeong Yeob; Cofta-Woerpel, Ludmila; Bush, Nigel; Gustafson, David H.; McTavish, Fiona

    2013-01-01

    Little research has examined the antecedent characteristics of patients most likely to seek online cancer information. This study employs the Cognitive-Social Health Information Processing (C-SHIP) model as a framework to understand what psychosocial characteristics precede online cancer-related information seeking among rural breast cancer patients who often have fewer healthcare providers and limited local support services. Examining 144 patients who were provided free computer hardware, Internet access and training for how to use an Interactive Cancer Communication System, pre-test survey scores indicating patients’ psychosocial status were correlated with specific online cancer information seeking behaviors. Each of the factors specified by the C-SHIP model had significant relationships with online cancer information seeking behaviors with the strongest findings emerging for cancer-relevant encodings and self-construals, cancer-relevant beliefs and expectancies and cancer-relevant self-regulatory competencies and skills. Specifically, patients with more negative appraisals in these domains were more likely to seek out online cancer information. Additionally, antecedent variables associated with the C-SHIP model had more frequent relationships with experiential information as compared to didactic information. This study supports the applicability of the model to discern why people afflicted with cancer may seek online information to cope with their disease. PMID:18569368

  18. Advanced information processing system

    Science.gov (United States)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  19. The use of process models to inform and improve statistical models of nitrate occurrence, Great Miami River Basin, southwestern Ohio

    Science.gov (United States)

    Walter, Donald A.; Starn, J. Jeffrey

    2013-01-01

    Statistical models of nitrate occurrence in the glacial aquifer system of the northern United States, developed by the U.S. Geological Survey, use observed relations between nitrate concentrations and sets of explanatory variables—representing well-construction, environmental, and source characteristics— to predict the probability that nitrate, as nitrogen, will exceed a threshold concentration. However, the models do not explicitly account for the processes that control the transport of nitrogen from surface sources to a pumped well and use area-weighted mean spatial variables computed from within a circular buffer around the well as a simplified source-area conceptualization. The use of models that explicitly represent physical-transport processes can inform and, potentially, improve these statistical models. Specifically, groundwater-flow models simulate advective transport—predominant in many surficial aquifers— and can contribute to the refinement of the statistical models by (1) providing for improved, physically based representations of a source area to a well, and (2) allowing for more detailed estimates of environmental variables. A source area to a well, known as a contributing recharge area, represents the area at the water table that contributes recharge to a pumped well; a well pumped at a volumetric rate equal to the amount of recharge through a circular buffer will result in a contributing recharge area that is the same size as the buffer but has a shape that is a function of the hydrologic setting. These volume-equivalent contributing recharge areas will approximate circular buffers in areas of relatively flat hydraulic gradients, such as near groundwater divides, but in areas with steep hydraulic gradients will be elongated in the upgradient direction and agree less with the corresponding circular buffers. The degree to which process-model-estimated contributing recharge areas, which simulate advective transport and therefore account for

  20. Resolving Radiological Waste Classification and Release Issues Using Material Process Information and Simple Measurements and Models

    International Nuclear Information System (INIS)

    Hochel, R.C.

    1997-11-01

    This report was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government nor any agency thereof, nor any of their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by United States Government or any agency thereof. The views and opinions of the author expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof

  1. Common data model for natural language processing based on two existing standard information models: CDA+GrAF.

    Science.gov (United States)

    Meystre, Stéphane M; Lee, Sanghoon; Jung, Chai Young; Chevrier, Raphaël D

    2012-08-01

    An increasing need for collaboration and resources sharing in the Natural Language Processing (NLP) research and development community motivates efforts to create and share a common data model and a common terminology for all information annotated and extracted from clinical text. We have combined two existing standards: the HL7 Clinical Document Architecture (CDA), and the ISO Graph Annotation Format (GrAF; in development), to develop such a data model entitled "CDA+GrAF". We experimented with several methods to combine these existing standards, and eventually selected a method wrapping separate CDA and GrAF parts in a common standoff annotation (i.e., separate from the annotated text) XML document. Two use cases, clinical document sections, and the 2010 i2b2/VA NLP Challenge (i.e., problems, tests, and treatments, with their assertions and relations), were used to create examples of such standoff annotation documents, and were successfully validated with the XML schemata provided with both standards. We developed a tool to automatically translate annotation documents from the 2010 i2b2/VA NLP Challenge format to GrAF, and automatically generated 50 annotation documents using this tool, all successfully validated. Finally, we adapted the XSL stylesheet provided with HL7 CDA to allow viewing annotation XML documents in a web browser, and plan to adapt existing tools for translating annotation documents between CDA+GrAF and the UIMA and GATE frameworks. This common data model may ease directly comparing NLP tools and applications, combining their output, transforming and "translating" annotations between different NLP applications, and eventually "plug-and-play" of different modules in NLP applications. Copyright © 2011 Elsevier Inc. All rights reserved.
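
    The actual CDA+GrAF schema is not reproduced in the abstract. The sketch below only illustrates the general standoff idea, annotations stored separately from the clinical text and pointing into it by character offsets; the element and attribute names are invented and are not the CDA or GrAF vocabulary.

import xml.etree.ElementTree as ET

text = "Patient denies chest pain. Started metformin 500 mg daily."

# Standoff annotations: (start, end, type) character spans over the text above
annotations = [
    (15, 25, "problem"),     # "chest pain"
    (35, 44, "treatment"),   # "metformin"
]

root = ET.Element("annotationDocument")           # invented wrapper element
ET.SubElement(root, "sourceText").text = text
graph = ET.SubElement(root, "annotationGraph")    # GrAF-like standoff part
for i, (start, end, label) in enumerate(annotations):
    node = ET.SubElement(graph, "annotation", id=f"a{i}",
                         start=str(start), end=str(end), type=label)
    node.text = text[start:end]                   # redundant copy, for readability

print(ET.tostring(root, encoding="unicode"))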

  2. Efficiency of cellular information processing

    International Nuclear Information System (INIS)

    Barato, Andre C; Hartich, David; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial. (paper)
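
    In compact form, the bound described in the abstract can be written as follows, with l denoting the learning rate (the rate of conditional Shannon entropy reduction) and sigma the thermodynamic entropy production rate; the notation is generic rather than the paper's.

% Learning rate bounded by entropy production, and the resulting
% informational efficiency used to study cellular information processing:
\[
  l \le \sigma , \qquad
  \eta = \frac{l}{\sigma} \le 1 .
\]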

  3. Neuronal encoding of object and distance information: A model simulation study on naturalistic optic flow processing

    Directory of Open Access Journals (Sweden)

    Patrick eHennig

    2012-03-01

    We developed a model of the input circuitry of the FD1 cell, an identified motion-sensitive interneuron in the blowfly's visual system. The model circuit successfully reproduces the FD1 cell's most conspicuous property: its larger responses to objects than to spatially extended patterns. The model circuit also mimics the time-dependent responses of FD1 to dynamically complex naturalistic stimuli, shaped by the blowfly's saccadic flight and gaze strategy: the FD1 responses are enhanced when, as a consequence of self-motion, a nearby object crosses the receptive field during intersaccadic intervals. Moreover, the model predicts that these object-induced responses are superimposed by pronounced pattern-dependent fluctuations during movements on virtual test flights in a three-dimensional environment with systematic modifications of the environmental patterns. Hence, the FD1 cell is predicted not to detect unambiguously objects defined by the spatial layout of the environment, but to be sensitive also to objects distinguished by textural features. These ambiguous detection abilities suggest an encoding of information about objects - irrespective of the features by which the objects are defined - by a population of cells, with the FD1 cell presumably playing a prominent role in such an ensemble.

  4. Weather Information Processing

    Science.gov (United States)

    1991-01-01

    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  5. Single-process versus multiple-strategy models of decision making: evidence from an information intrusion paradigm.

    Science.gov (United States)

    Söllner, Anke; Bröder, Arndt; Glöckner, Andreas; Betsch, Tilmann

    2014-02-01

    When decision makers are confronted with different problems and situations, do they use a uniform mechanism as assumed by single-process models (SPMs) or do they choose adaptively from a set of available decision strategies as multiple-strategy models (MSMs) imply? Both frameworks of decision making have gathered a lot of support, but only rarely have they been contrasted with each other. Employing an information intrusion paradigm for multi-attribute decisions from givens, SPM and MSM predictions on information search, decision outcomes, attention, and confidence judgments were derived and tested against each other in two experiments. The results consistently support the SPM view: Participants seemingly using a "take-the-best" (TTB) strategy do not ignore TTB-irrelevant information as MSMs would predict, but adapt the amount of information searched, choose alternative choice options, and show varying confidence judgments contingent on the quality of the "irrelevant" information. The uniformity of these findings underlines the adequacy of the novel information intrusion paradigm and comprehensively promotes the notion of a uniform decision making mechanism as assumed by single-process models. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
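
    For reference, the "take-the-best" strategy mentioned above searches cues in descending order of validity and decides on the first cue that discriminates between the options. A minimal sketch, with invented cue values and validities:

def take_the_best(option_a, option_b, cue_validities):
    """Return 'A', 'B', or 'guess' by inspecting cues in descending validity.

    option_a / option_b : dicts mapping cue name -> 0/1 cue value
    cue_validities      : dict mapping cue name -> validity in (0.5, 1.0]
    """
    for cue in sorted(cue_validities, key=cue_validities.get, reverse=True):
        a, b = option_a[cue], option_b[cue]
        if a != b:                       # first discriminating cue decides
            return "A" if a > b else "B"
    return "guess"                       # no cue discriminates

# Hypothetical multi-attribute choice between two alternatives
validities = {"cue1": 0.90, "cue2": 0.75, "cue3": 0.60}
alt_a = {"cue1": 1, "cue2": 0, "cue3": 0}
alt_b = {"cue1": 1, "cue2": 1, "cue3": 0}
print(take_the_best(alt_a, alt_b, validities))   # cue1 ties, cue2 decides -> 'B'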

  6. Information processing. [in human performance

    Science.gov (United States)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).

  7. Risk and information processing

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The reasons for the current widespread arguments between designers of advanced technological systems, for instance nuclear power plants, and opponents from the general public concerning levels of acceptable risk may be found in incompatible definitions of risk, in differences in risk perception and criteria for acceptance, etc. Also of importance, however, may be the difficulties met in presenting the basis for risk analysis, such as the conceptual system models applied, in an explicit and credible form. The application of modern information technology for the design of control systems and human-machine interfaces, together with the trends towards large centralised industrial installations, has made it increasingly difficult to establish an acceptable model framework, in particular considering the role of human errors in major system failures and accidents. Different aspects of this problem are discussed in the paper, and areas are identified where research is needed in order to improve not only the safety of advanced systems, but also the basis for their acceptance by the general public. (author)

  8. Model for Electromagnetic Information Leakage

    OpenAIRE

    Mao Jian; Li Yongmei; Zhang Jiemin; Liu Jinming

    2013-01-01

    Electromagnetic leakage occurs in working information equipment and can lead to information leakage. In order to discover the nature of the information in electromagnetic leakage, this paper combines electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, interception, and reproduction based on electromagnetic radiation, and ana...

  9. Mathematical model as means of optimization of the automation system of the process of incidents of information security management

    Directory of Open Access Journals (Sweden)

    Yulia G. Krasnozhon

    2018-03-01

    Modern information technologies have an increasing importance for the development dynamics and management structure of an enterprise. The efficiency of implementing modern information technologies is directly related to the quality of information security incident management. However, the impact of information security incident management on the quality and efficiency of the enterprise management system is not sufficiently addressed in either Russian or foreign literature. The main approach to these problems is optimization of the automation system for the information security incident management process. Today special attention is paid to IT technologies for dealing with information security incidents at mission-critical facilities in the Russian Federation such as the Federal Tax Service of Russia (FTS). It is proposed to use the mathematical apparatus of queueing theory in order to build a mathematical model for optimizing the system. The developed model allows the quality of management to be estimated, taking into account the rules and restrictions imposed on the system by the effects of information security incidents. An example is given to demonstrate the system at work, and the resulting statistical data are shown. Implementation of the system discussed here will improve the quality of the Russian FTS services and make responses to information security incidents faster.

  10. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    Science.gov (United States)

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  11. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  12. How parents process child health and nutrition information: A grounded theory model.

    Science.gov (United States)

    Lovell, Jennifer L

    2016-02-01

    The aim of the present study was to investigate low-income parents' experiences receiving, making meaning of, and applying sociocultural messages about childhood health and nutrition. Semi-structured interviews were conducted with parents from 16 low-income Early Head Start families. Verbatim interview transcripts, observations, field notes, documentary evidence, and follow-up participant checks were used during grounded theory analysis of the data. Data yielded a potential theoretical model of parental movement toward action involving (a) the culture and context influencing parents, (b) parents' sources of social and cultural messages, (c) parental values and engagement, (d) parental motivation for action, (e) intervening conditions impacting motivation and application, and (f) parent action taken on the individual and social levels. Parent characteristics greatly impacted the ways in which parents understood and applied health and nutrition information. Among other implications, it is recommended that educators and providers focus on a parent's beliefs, values, and cultural preferences regarding food and health behaviors as well as his/her personal/family definition of "health" when framing recommendations and developing interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Towards the Significance of Decision Aid in Building Information Modeling (BIM) Software Selection Process

    Directory of Open Access Journals (Sweden)

    Omar Mohd Faizal

    2014-01-01

    Full Text Available Building Information Modeling (BIM) has been considered a solution to numerous problems in the construction industry, such as delays, increased lead times and increased costs. This is due to the concept and characteristics of BIM, which reshape the way construction project teams work together to increase productivity and improve final project outcomes (cost, time, quality, safety, functionality, maintainability, etc.). As a result, the construction industry has witnessed numerous BIM software packages becoming available on the market, each offering different functions and features. Furthermore, the adoption of BIM requires high investment in software, hardware and training. Thus, there is an identified need for a decision aid for selecting the BIM software that best fulfils project needs. However, research indicates that few studies attempt to guide decisions in the BIM software selection problem. This paper therefore highlights the importance of decision making and decision support for BIM software selection, as it is vital to increasing productivity throughout the building lifecycle.

  14. Lateral Information Processing by Spiking Neurons: A Theoretical Model of the Neural Correlate of Consciousness

    Directory of Open Access Journals (Sweden)

    Marc Ebner

    2011-01-01

    Full Text Available Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on “autopilot”). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the “conscious pilot”) suggests that as gap junctions open and close, a gamma-synchronized subnetwork, or zone, moves through the brain as an executive agent, converting nonconscious “auto-pilot” cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study we implemented sideways “gap junctions” in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions forms a reconfigurable resistive grid or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap-junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities. The system demonstrates a moving sub-network zone of
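    To make the mechanism described above concrete, here is a minimal sketch of the core update the abstract describes: each unit's spikes are temporally integrated, the resulting feedback signal is spatially averaged over gap-junction-connected neighbours, and junctions open or close as that local average crosses a threshold. Grid size, decay rate, and threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

def update_gap_junctions(spikes, open_mask, threshold=0.5, decay=0.8, trace=None):
    """One update step of a laterally coupled grid of spiking units.

    spikes    : 2D {0,1} array of output spikes at the current time step
    open_mask : 2D boolean array, True where a unit's lateral junctions are open
    trace     : temporally integrated spike output (the feedback signal)
    """
    if trace is None:
        trace = np.zeros_like(spikes, dtype=float)
    # temporal integration of each unit's own output
    trace = decay * trace + (1 - decay) * spikes

    # spatial average over the 4-neighbourhood, but only across open junctions
    padded = np.pad(np.where(open_mask, trace, 0.0), 1)
    counts = np.pad(open_mask.astype(float), 1)
    neigh_sum = padded[:-2, 1:-1] + padded[2:, 1:-1] + padded[1:-1, :-2] + padded[1:-1, 2:]
    neigh_cnt = counts[:-2, 1:-1] + counts[2:, 1:-1] + counts[1:-1, :-2] + counts[1:-1, 2:]
    local_avg = np.divide(neigh_sum, neigh_cnt, out=np.zeros_like(neigh_sum), where=neigh_cnt > 0)

    # junctions stay open where the feedback average is high (same "figure"), close otherwise
    open_mask = local_avg > threshold
    return open_mask, trace

# A few illustrative update steps on random spike patterns
rng = np.random.default_rng(0)
open_mask, trace = np.ones((32, 32), dtype=bool), None
for _ in range(5):
    spikes = (rng.random((32, 32)) < 0.2).astype(float)
    open_mask, trace = update_gap_junctions(spikes, open_mask, trace=trace)
```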

  15. An evaluation of the coping patterns of rape victims: integration with a schema-based information-processing model.

    Science.gov (United States)

    Littleton, Heather

    2007-08-01

    The current study sought to provide an expansion of Resick and Schnicke's information-processing model of interpersonal violence response. Their model posits that interpersonal violence threatens victims' schematic beliefs and that victims can resolve this threat through assimilation, accommodation, or overaccommodation. In addition, it is hypothesized that how victims resolve schematic threat affects their coping strategies. To test this hypothesis, a cluster analysis of rape victims' coping patterns was conducted. Victims' coping patterns were related to distress, self-worth, and rape label in ways consistent with predictions. Thus, future research should focus on the implications of how victims integrate trauma with schemas.

  16. A Model for Information

    Directory of Open Access Journals (Sweden)

    Paul Walton

    2014-09-01

    Full Text Available This paper uses an approach drawn from the ideas of computer systems modelling to produce a model for information itself. The model integrates evolutionary, static and dynamic views of information and highlights the relationship between symbolic content and the physical world. The model includes what information technology practitioners call “non-functional” attributes, which, for information, include information quality and information friction. The concepts developed in the model enable a richer understanding of Floridi’s questions “what is information?” and “the informational circle: how can information be assessed?” (which he numbers P1 and P12).

  17. Logical Reasoning versus Information Processing in the Dual-Strategy Model of Reasoning

    Science.gov (United States)

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2017-01-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies such as mental model theory and statistical strategies underlying probabilistic models. The dual-strategy model, proposed by Verschueren, Schaeken, & d'Ydewalle (2005a, 2005b), which suggests that people might have access to both…

  18. Balancing Information Analysis and Decision Value: A Model to Exploit the Decision Process

    Science.gov (United States)

    2011-12-01

    technical intelligence, e.g. signals and sensors (SIGINT and MASINT), imagery (IMINT), as well as human and open source intelligence (HUMINT and OSINT ...Clark 2006). The ability to capture large amounts of data and the plenitude of modern intelligence information sources provides a rich cache of...many techniques for managing information collected and derived from these sources, the exploitation of intelligence assets for decision-making

  19. [Effects of an implicit internal working model on attachment in information processing assessed using Go/No-Go Association Task].

    Science.gov (United States)

    Fujii, Tsutomu; Uebuchi, Hisashi; Yamada, Kotono; Saito, Masahiro; Ito, Eriko; Tonegawa, Akiko; Uebuchi, Marie

    2015-06-01

    The purposes of the present study were (a) to use both a relational-anxiety Go/No-Go Association Task (GNAT) and an avoidance-of-intimacy GNAT in order to assess an implicit Internal Working Model (IWM) of attachment; and (b) to verify the effects of the measured implicit relational anxiety and implicit avoidance of intimacy on information processing. The implicit IWM measured by the GNAT differed from the explicit IWM measured by questionnaires in terms of its effects on information processing. In particular, in subliminal priming tasks involving others, implicit avoidance of intimacy predicted accelerated response times to negative stimulus words about attachment. Moreover, after subliminal priming with stimulus words about the self, implicit relational anxiety predicted delayed response times to negative stimulus words about attachment.

  20. Investigation of signal models and methods for evaluating structures of processing telecommunication information exchange systems under acoustic noise conditions

    Science.gov (United States)

    Kropotov, Y. A.; Belov, A. A.; Proskuryakov, A. Y.; Kolpakov, A. A.

    2018-05-01

    The paper considers models and methods for estimating signals during the transmission of information messages in telecommunication systems for audio exchange. One-dimensional probability distribution functions that can be used to separate useful signals from acoustic noise interference are presented. An approach to estimating the correlation and spectral functions of the parameters of acoustic signals is proposed, based on a parametric representation of the acoustic signals and of the noise components. The paper suggests an approach to improving the efficiency of interference cancellation and to extracting the necessary information when processing signals in telecommunication systems; here, the suppression of acoustic noise is based on methods of adaptive filtering and adaptive compensation. The work also describes models of echo signals and the structure of subscriber devices in operational command telecommunication systems.
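    The abstract refers to adaptive filtering and adaptive compensation for suppressing acoustic noise without specifying the algorithm. A minimal sketch of one standard choice, an LMS noise canceller in the usual two-microphone setup (primary = speech plus noise, reference = correlated noise only), is given below; the filter length and step size are illustrative assumptions.

```python
import numpy as np

def lms_noise_canceller(primary, reference, n_taps=32, mu=0.01):
    """Classic LMS adaptive noise cancellation.

    primary   : speech + noise picked up by the main microphone
    reference : noise-only signal correlated with the noise in `primary`
    Returns the error signal, which approximates the cleaned speech.
    """
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]   # most recent reference samples
        noise_est = w @ x                   # filter's estimate of the noise
        e = primary[n] - noise_est          # residual = cleaned sample
        w += 2 * mu * e * x                 # LMS weight update
        out[n] = e
    return out
```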

  1. AIM for Allostery: Using the Ising Model to Understand Information Processing and Transmission in Allosteric Biomolecular Systems.

    Science.gov (United States)

    LeVine, Michael V; Weinstein, Harel

    2015-05-01

    In performing their biological functions, molecular machines must process and transmit information with high fidelity. Information transmission requires dynamic coupling between the conformations of discrete structural components within the protein positioned far from one another on the molecular scale. This type of biomolecular "action at a distance" is termed allostery. Although allostery is ubiquitous in biological regulation and signal transduction, its treatment in theoretical models has mostly eschewed quantitative descriptions involving the system's underlying structural components and their interactions. Here, we show how Ising models can be used to formulate an approach to allostery in a structural context of interactions between the constitutive components by building simple allosteric constructs we termed Allosteric Ising Models (AIMs). We introduce the use of AIMs in analytical and numerical calculations that relate thermodynamic descriptions of allostery to the structural context, and then show that many fundamental properties of allostery, such as the multiplicative property of parallel allosteric channels, are revealed from the analysis of such models. The power of exploring mechanistic structural models of allosteric function in more complex systems by using AIMs is demonstrated by building a model of allosteric signaling for an experimentally well-characterized asymmetric homodimer of the dopamine D2 receptor.
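    As a toy illustration of the Ising-model framing described above (our own minimal construction, not the authors' AIM code), the two-spin sketch below shows how a coupling J between two conformational "spins" lets a field applied at one site, standing in for ligand binding, bias the average state of the distant site:

```python
import numpy as np
from itertools import product

def site2_magnetization(h1, h2, J, beta=1.0):
    """Average state <s2> of the distal site in a two-spin Ising model
    with energy E(s1, s2) = -J*s1*s2 - h1*s1 - h2*s2 and s_i in {-1, +1}."""
    states = list(product([-1, 1], repeat=2))
    weights = np.array([np.exp(-beta * (-(J * s1 * s2) - h1 * s1 - h2 * s2))
                        for s1, s2 in states])
    probs = weights / weights.sum()
    return sum(p * s2 for p, (s1, s2) in zip(probs, states))

# Allosteric coupling: a "ligand" field on site 1 shifts the equilibrium of site 2
print(site2_magnetization(h1=0.0, h2=0.0, J=1.0))   # ~0: no bias without the ligand field
print(site2_magnetization(h1=2.0, h2=0.0, J=1.0))   # >0: binding at site 1 biases site 2
```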

  2. Deep Neural Networks: A New Framework for Modeling Biological Vision and Brain Information Processing.

    Science.gov (United States)

    Kriegeskorte, Nikolaus

    2015-11-24

    Recent advances in neural network modeling have enabled major strides in computer vision and other artificial intelligence applications. Human-level visual recognition abilities are coming within reach of artificial systems. Artificial neural networks are inspired by the brain, and their computations could be implemented in biological neurons. Convolutional feedforward networks, which now dominate computer vision, take further inspiration from the architecture of the primate visual hierarchy. However, the current models are designed with engineering goals, not to model brain computations. Nevertheless, initial studies comparing internal representations between these models and primate brains find surprisingly similar representational spaces. With human-level performance no longer out of reach, we are entering an exciting new era, in which we will be able to build biologically faithful feedforward and recurrent computational models of how biological brains perform high-level feats of intelligence, including vision.

  3. Mindfulness training alters emotional memory recall compared to active controls: support for an emotional information processing model of mindfulness

    Directory of Open Access Journals (Sweden)

    Doug eRoberts-Wolfe

    2012-02-01

    Full Text Available Objectives: While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e. memory) biases in relation to both clinical symptomatology and well-being in comparison to active control conditions. Methods: Fifty-eight university students (28 female, age = 20.1 ± 2.7 years) participated in either a 12-week course containing a "meditation laboratory" or an active control course with similar content or experiential practice laboratory format (music). Participants completed an emotional word recall task and self-report questionnaires of well-being and clinical symptoms before and after the 12-week course. Results: Meditators showed greater increases in positive word recall compared to controls [F(1, 56) = 6.6, p = .02]. The meditation group increased significantly more on measures of well-being [F(1, 56) = 6.6, p = .01], with a marginal decrease in depression and anxiety [F(1, 56) = 3.0, p = .09] compared to controls. Increased positive word recall was associated with increased psychological well-being [r = 0.31, p = .02] and decreased clinical symptoms [r = -0.29, p = .03]. Conclusion: Mindfulness training was associated with greater improvements in processing efficiency for positively valenced stimuli than active control conditions. This change in emotional information processing was associated with improvements in psychological well-being and less depression and anxiety. These data suggest that mindfulness training may improve well-being via changes in emotional information processing.

  4. Information Processing Research.

    Science.gov (United States)

    1986-09-01

    business form in which information is entered by filling in blanks, or circling alternatives. The fields of the form correspond to the various pieces...power. Parallelism, rather than raw speed of the computing elements, seems to be the way that the brain gets such jobs done...all intelligent systems. The purpose of this paper is to characterize the weak methods and to explain how and why they arise in

  5. INCORPORATION OF MECHANISTIC INFORMATION IN THE ARSENIC PBPK MODEL DEVELOPMENT PROCESS

    Science.gov (United States)

    INCORPORATING MECHANISTIC INSIGHTS IN A PBPK MODEL FOR ARSENIC. Elaina M. Kenyon, Michael F. Hughes, Marina V. Evans, David J. Thomas, U.S. EPA; Miroslav Styblo, University of North Carolina; Michael Easterling, Analytical Sciences, Inc. A physiologically based phar...

  6. Industrial Information Processing

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

    This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order...

  7. Theoretical aspects and modelling of cellular decision making, cell killing and information-processing in photodynamic therapy of cancer.

    Science.gov (United States)

    Gkigkitzis, Ioannis

    2013-01-01

    The aim of this report is to provide a mathematical model of the mechanism for making binary fate decisions about cell death or survival, during and after Photodynamic Therapy (PDT) treatment, and to supply the logical design for this decision mechanism as an application of rate distortion theory to the biochemical processing of information by the physical system of a cell. Based on systems biology models of the molecular interactions involved in the PDT processes previously established, and regarding a cellular decision-making system as a noisy communication channel, we use rate distortion theory to design a time-dependent Blahut-Arimoto algorithm where the input is a stimulus vector composed of the time-dependent concentrations of three PDT-related cell death signaling molecules and the output is a cell fate decision. The molecular concentrations are determined by a group of rate equations. The basic steps are: initialize the probability of the cell fate decision; compute the conditional probability distribution that minimizes the mutual information between input and output; compute the cell fate decision probability that minimizes the mutual information; and repeat the last two steps until the probabilities converge. Then advance to the next discrete time point and repeat the process. Based on the model from communication theory described in this work, and assuming that the activation of death signal processing occurs when any of the molecular stimulants rises above a predefined threshold (50% of the maximum concentration), for 1800 s of treatment the cell undergoes necrosis within the first 30 minutes with probability in the range 90.0%-99.99%, and in the case of repair/survival it goes through apoptosis within 3-4 hours with probability in the range 90.00%-99.00%. Although there is no experimental validation of the model at this moment, it reproduces some patterns of survival ratios of predicted experimental data. Analytical modeling based on cell death
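    The iteration sketched in the abstract is essentially the Blahut-Arimoto algorithm from rate-distortion theory. The following minimal Python sketch shows one such fixed-point loop for a single time step; the stimulus distribution, distortion matrix, and trade-off parameter are illustrative stand-ins, not the concentrations or penalties used in the study:

```python
import numpy as np

def blahut_arimoto(p_x, d, beta, n_iter=200, tol=1e-9):
    """Rate-distortion Blahut-Arimoto iteration.

    p_x  : probabilities of the input states (e.g. discretised stimulus vectors)
    d    : distortion matrix d[x, y] for mapping input x to decision y
    beta : trade-off parameter (larger beta -> lower distortion, higher rate)
    Returns q(y|x), the stochastic mapping from stimulus to decision.
    """
    n_x, n_y = d.shape
    q_y = np.full(n_y, 1.0 / n_y)              # initial decision (output) distribution
    for _ in range(n_iter):
        # conditional that minimises I(X;Y) + beta * <d> given the current q_y
        q_y_given_x = q_y[None, :] * np.exp(-beta * d)
        q_y_given_x /= q_y_given_x.sum(axis=1, keepdims=True)
        new_q_y = p_x @ q_y_given_x            # updated marginal over decisions
        if np.max(np.abs(new_q_y - q_y)) < tol:
            q_y = new_q_y
            break
        q_y = new_q_y
    return q_y_given_x

# Toy example: two stimulus levels, decisions {survive, die};
# the distortion penalises "survive" when the death signal is high, and vice versa.
p_x = np.array([0.7, 0.3])
d = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(blahut_arimoto(p_x, d, beta=3.0))
```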

  8. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project.

    Science.gov (United States)

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a 2012 Gartner report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered a part of business strategy. The associated project parameters should be proactively managed and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. Instead, the focus should be on proactively managing and shifting left in the software life cycle engineering model: identify the problem upfront in the project cycle rather than waiting for lessons to be learnt and taking reactive steps. This paper presents the practical applicability of predictive models and illustrates their use in a project to predict system testing defects, thus helping to reduce residual defects.

  9. Model of Cortical Organization Embodying a Basis for a Theory of Information Processing and Memory Recall

    Science.gov (United States)

    Shaw, Gordon L.; Silverman, Dennis J.; Pearson, John C.

    1985-04-01

    Motivated by V. B. Mountcastle's organizational principle for neocortical function, and by M. E. Fisher's model of physical spin systems, we introduce a cooperative model of the cortical column incorporating an idealized substructure, the trion, which represents a localized group of neurons. Computer studies reveal that typical networks composed of a small number of trions (with symmetric interactions) exhibit striking behavior--e.g., hundreds to thousands of quasi-stable, periodic firing patterns, any of which can be selected out and enhanced with only small changes in interaction strengths by using a Hebb-type algorithm.

  10. Information processing occurs via critical avalanches in a model of the primary visual cortex

    International Nuclear Information System (INIS)

    Bortolotto, G. S.; Girardi-Schappo, M.; Gonsalves, J. J.; Tragtenberg, M. H. R.; Pinto, L. T.

    2016-01-01

    We study a new biologically motivated model for the Macaque monkey primary visual cortex which presents power-law avalanches after a visual stimulus. The signal propagates through all the layers of the model via avalanches that depend on network structure and synaptic parameter. We identify four different avalanche profiles as a function of the excitatory postsynaptic potential. The avalanches follow a size-duration scaling relation and present critical exponents that match experiments. The structure of the network gives rise to a regime of two characteristic spatial scales, one of which vanishes in the thermodynamic limit. (paper)

  11. Scaling the Information Processing Demands of Occupations

    Science.gov (United States)

    Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin

    2011-01-01

    The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…

  12. Building Information Modeling (BIM) Primer. Report 1: Facility Life-Cycle Process and Technology Innovation

    Science.gov (United States)

    2012-08-01

    design. Negatives of the software, Melendez said, were that it was too time consuming and demanding at the initial design phase. Also, her learning...manage the building using BIM as part of the WO process. ii. Today it is preferred that installations prepare for delivery of BIM and have in place...offices, a theater, an exhibition hall, and a cafe. Project Team: Urban Design Group, Hathaway Dinwiddie, TBD Consultants, IBE Consulting Engineers, KPFF

  13. Integrated Optical Information Processing

    Science.gov (United States)

    1988-08-01

    applications in optical disk memory systems [9]. This device is constructed in a glass/SiO2/Si waveguide. The choice of a Si substrate allows for the...contact mask) were formed in the photoresist deposited on all of the samples, we covered the unwanted gratings on each sample with cover glass slides...processing, let us consider TeO2 (v_s = 620 m/s) as a potential substrate for applications requiring large time delays. This consideration is despite

  14. The evolution of concepts of vestibular peripheral information processing: toward the dynamic, adaptive, parallel processing macular model

    Science.gov (United States)

    Ross, Muriel D.

    2003-01-01

    In a letter to Robert Hooke, written on 5 February, 1675, Isaac Newton wrote "If I have seen further than certain other men it is by standing upon the shoulders of giants." In his context, Newton was referring to the work of Galileo and Kepler, who preceded him. However, every field has its own giants, those men and women who went before us and, often with few tools at their disposal, uncovered the facts that enabled later researchers to advance knowledge in a particular area. This review traces the history of the evolution of views from early giants in the field of vestibular research to modern concepts of vestibular organ organization and function. Emphasis will be placed on the mammalian maculae as peripheral processors of linear accelerations acting on the head. This review shows that early, correct findings were sometimes unfortunately disregarded, impeding later investigations into the structure and function of the vestibular organs. The central themes are that the macular organs are highly complex, dynamic, adaptive, distributed parallel processors of information, and that historical references can help us to understand our own place in advancing knowledge about their complicated structure and functions.

  15. A pseudo-equilibrium thermodynamic model of information processing in nonlinear brain dynamics.

    Science.gov (United States)

    Freeman, Walter J

    2008-01-01

    Computational models of brain dynamics fall short of performance in speed and robustness of pattern recognition in detecting minute but highly significant pattern fragments. A novel model employs the properties of thermodynamic systems operating far from equilibrium, which is analyzed by linearization near adaptive operating points using root locus techniques. Such systems construct order by dissipating energy. Reinforcement learning of conditioned stimuli creates a landscape of attractors and their basins in each sensory cortex by forming nerve cell assemblies in cortical connectivity. Retrieval of a selected category of stored knowledge is by a phase transition that is induced by a conditioned stimulus, and that leads to pattern self-organization. Near self-regulated criticality the cortical background activity displays aperiodic null spikes at which analytic amplitude nears zero, and which constitute a form of Rayleigh noise. Phase transitions in recognition and recall are initiated at null spikes in the presence of an input signal, owing to the high signal-to-noise ratio that facilitates capture of cortex by an attractor, even by very weak activity that is typically evoked by a conditioned stimulus.
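    The "null spikes" referred to above are episodes where the analytic amplitude of the band-passed background activity nears zero. A minimal sketch of how such episodes can be flagged with the Hilbert transform is given below; the frequency band and the 5% threshold are illustrative choices, not parameters from the paper:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def find_null_spikes(x, fs, band=(20, 80), rel_threshold=0.05):
    """Locate 'null spikes': samples where the analytic amplitude of the
    band-passed signal drops close to zero (here, below 5% of its median)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, x)
    amplitude = np.abs(hilbert(filtered))          # analytic amplitude (envelope)
    threshold = rel_threshold * np.median(amplitude)
    return np.where(amplitude < threshold)[0], amplitude

# Example with synthetic broadband noise standing in for background activity
fs = 1000
rng = np.random.default_rng(0)
signal = rng.standard_normal(10 * fs)
null_idx, amp = find_null_spikes(signal, fs)
print(f"{len(null_idx)} samples below threshold")
```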

  16. Processing of recognition information and additional cues: A model-based analysis of choice, confidence, and response time

    Directory of Open Access Journals (Sweden)

    Andreas Glockner

    2011-02-01

    Full Text Available Research on the processing of recognition information has focused on testing the recognition heuristic (RH). On the aggregate, the noncompensatory use of recognition information postulated by the RH was rejected in several studies, while the RH could still account for a considerable proportion of choices. These results can be explained if either (a) a part of the subjects used the RH or (b) nobody used it but its choice predictions were accidentally in line with predictions of the strategy used. In the current study, which exemplifies a new approach to model testing, we determined individuals' decision strategies based on a maximum-likelihood classification method, taking into account choices, response times and confidence ratings simultaneously. Unlike most previous studies of the RH, our study tested the RH under conditions in which we provided information about cue values of unrecognized objects (which we argue is fairly common and thus of some interest). For 77.5% of the subjects, overall behavior was best explained by a compensatory parallel constraint satisfaction (PCS) strategy. The proportion of subjects using an enhanced RH heuristic (RHe) was negligible (up to 7.5%); 15% of the subjects seemed to use a take-the-best strategy (TTB). A more fine-grained analysis of the supplemental behavioral parameters conditional on strategy use supports PCS but calls into question process assumptions for apparent users of RH, RHe, and TTB within our experimental context. Our results are consistent with previous literature highlighting the importance of individual strategy classification as compared to aggregated analyses.
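    The classification idea described above can be sketched in simplified form: each candidate strategy predicts a choice on every trial, and a participant is assigned the strategy whose predictions maximise the likelihood of the observed choices under a constant error rate. The sketch below uses choices only and a hypothetical data set; the published method additionally models response times and confidence ratings.

```python
import numpy as np

def classify_strategy(choices, strategy_predictions, error_rate=0.1):
    """Assign the strategy whose predicted choices maximise the likelihood
    of the observed choices (choices-only simplification).

    choices              : array of observed choices (0/1) per trial
    strategy_predictions : dict strategy_name -> array of predicted choices (0/1)
    """
    choices = np.asarray(choices)
    log_liks = {}
    for name, pred in strategy_predictions.items():
        agree = choices == np.asarray(pred)
        # each trial contributes 1-error_rate if the prediction matched, error_rate otherwise
        log_liks[name] = np.sum(np.where(agree, np.log(1 - error_rate), np.log(error_rate)))
    best = max(log_liks, key=log_liks.get)
    return best, log_liks

# Hypothetical 6-trial example: observed choices vs. predictions of RH, TTB and PCS
observed = [1, 1, 0, 1, 0, 1]
preds = {"RH": [1, 1, 1, 1, 1, 1], "TTB": [1, 0, 0, 1, 0, 1], "PCS": [1, 1, 0, 1, 0, 0]}
print(classify_strategy(observed, preds))
```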

  17. Enhancing the performance of model-based elastography by incorporating additional a priori information in the modulus image reconstruction process

    International Nuclear Information System (INIS)

    Doyley, Marvin M; Srinivasan, Seshadri; Dimidenko, Eugene; Soni, Nirmal; Ophir, Jonathan

    2006-01-01

    Model-based elastography is fraught with problems owing to the ill-posed nature of the inverse elasticity problem. To overcome this limitation, we have recently developed a novel inversion scheme that incorporates a priori information concerning the mechanical properties of the underlying tissue structures, and the variance incurred during displacement estimation in the modulus image reconstruction process. The information was procured by employing standard strain imaging methodology, and introduced in the reconstruction process through the generalized Tikhonov approach. In this paper, we report the results of experiments conducted on gelatin phantoms to evaluate the performance of modulus elastograms computed with the generalized Tikhonov (GTK) estimation criterion relative to those computed by employing the un-weighted least-squares estimation criterion, the weighted least-squares estimation criterion and the standard Tikhonov method (i.e., the generalized Tikhonov method with no modulus prior). The results indicate that modulus elastograms computed with the generalized Tikhonov approach had superior elastographic contrast discrimination and contrast recovery. In addition, image reconstruction was more resilient to structural decorrelation noise when additional constraints were imposed on the reconstruction process through the GTK method
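    The generalized Tikhonov estimator described above combines a variance-weighted data misfit with a penalty on deviations from a modulus prior. A linearised sketch of that estimator is shown below with synthetic data; the real elastographic forward operator is nonlinear and solved iteratively, so this is only a structural illustration with assumed names and sizes:

```python
import numpy as np

def generalized_tikhonov(A, b, W, L, x_prior, lam):
    """Solve  min_x (Ax - b)^T W (Ax - b) + lam * ||L (x - x_prior)||^2.

    A       : linearised forward operator (displacements from modulus)
    b       : measured displacements
    W       : data-weighting matrix (e.g. inverse displacement-noise covariance)
    L       : regularisation operator (identity or a smoothing/difference matrix)
    x_prior : prior modulus estimate (e.g. from strain imaging)
    lam     : regularisation weight
    """
    lhs = A.T @ W @ A + lam * (L.T @ L)
    rhs = A.T @ W @ b + lam * (L.T @ L) @ x_prior
    return np.linalg.solve(lhs, rhs)

# Tiny synthetic example
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
b = A @ x_true + 0.05 * rng.standard_normal(20)
x_hat = generalized_tikhonov(A, b, W=np.eye(20), L=np.eye(5),
                             x_prior=np.full(5, 2.0), lam=0.5)
print(x_hat)
```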

  18. BRICS and Quantum Information Processing

    DEFF Research Database (Denmark)

    Schmidt, Erik Meineche

    1998-01-01

    BRICS is a research centre and international PhD school in theoretical computer science, based at the University of Aarhus, Denmark. The centre has recently become engaged in quantum information processing in cooperation with the Department of Physics, also at the University of Aarhus. This extended abstract surveys activities at BRICS with special emphasis on the activities in quantum information processing.

  19. Information Processing and Limited Liability

    OpenAIRE

    Bartosz Mackowiak; Mirko Wiederholt

    2012-01-01

    Decision-makers often face limited liability and thus know that their loss will be bounded. We study how limited liability affects the behavior of an agent who chooses how much information to acquire and process in order to take a good decision. We find that an agent facing limited liability processes less information than an agent with unlimited liability. The informational gap between the two agents is larger in bad times than in good times and when information is more costly to process.

  20. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consists of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  1. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consists of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  2. Mindfulness training alters emotional memory recall compared to active controls: support for an emotional information processing model of mindfulness.

    Science.gov (United States)

    Roberts-Wolfe, Douglas; Sacchet, Matthew D; Hastings, Elizabeth; Roth, Harold; Britton, Willoughby

    2012-01-01

    While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e., memory) biases in relation to both clinical symptomatology and well-being in comparison to active control conditions. Fifty-eight university students (28 female, age = 20.1 ± 2.7 years) participated in either a 12-week course containing a "meditation laboratory" or an active control course with similar content or experiential practice laboratory format (music). Participants completed an emotional word recall task and self-report questionnaires of well-being and clinical symptoms before and after the 12-week course. Meditators showed greater increases in positive word recall compared to controls [F(1, 56) = 6.6, p = 0.02]. The meditation group increased significantly more on measures of well-being [F(1, 56) = 6.6, p = 0.01], with a marginal decrease in depression and anxiety [F(1, 56) = 3.0, p = 0.09] compared to controls. Increased positive word recall was associated with increased psychological well-being (r = 0.31, p = 0.02) and decreased clinical symptoms (r = -0.29, p = 0.03). Mindfulness training was associated with greater improvements in processing efficiency for positively valenced stimuli than active control conditions. This change in emotional information processing was associated with improvements in psychological well-being and less depression and anxiety. These data suggest that mindfulness training may improve well-being via changes in emotional information processing. Future research with a fully randomized design will be

  3. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    Science.gov (United States)

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

    The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary to tertiary health care center was conducted for model development and temporal validation. One-hundred sixty participants were prospectively included for the model development (n = 80) and time-split validation (n = 80) stages. The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM and using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and the vertex-marker distance (D_V-M) were used to predict the distance from the C7SP to the vertex (D_V-C7). Multivariate linear regression modeling, limits-of-agreement plots, histograms of residuals, receiver operating characteristic curves, and confusion tables were analyzed. The multivariate linear prediction model for D_V-C7 (in centimeters) was D_V-C7 = 0.986 D_V-M + 0.018(mass) + 0.014(age) - 1.008. Receiver operating characteristic curves had better discrimination of D_V-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than D_V-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted D_V-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P information. Copyright © 2016. Published by Elsevier Inc.
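    For illustration, the published prediction equation can be applied directly in code; body mass is assumed to be in kilograms and the example values are invented, not taken from the study:

```python
def predicted_dv_c7(dv_m_cm, mass_kg, age_years):
    """Published multivariate model for the vertex-to-C7 spinous process distance (cm):
    D_V-C7 = 0.986*D_V-M + 0.018*mass + 0.014*age - 1.008"""
    return 0.986 * dv_m_cm + 0.018 * mass_kg + 0.014 * age_years - 1.008

# Hypothetical subject
d = predicted_dv_c7(dv_m_cm=24.0, mass_kg=70.0, age_years=35)
print(f"predicted D_V-C7 = {d:.2f} cm, above the 23.40 cm cutoff: {d > 23.40}")
```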

  4. Information accessibility and cryptic processes

    Energy Technology Data Exchange (ETDEWEB)

    Mahoney, John R; Ellison, Christopher J; Crutchfield, James P [Complexity Sciences Center and Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616 (United States)], E-mail: jrmahoney@ucdavis.edu, E-mail: cellison@cse.ucdavis.edu, E-mail: chaos@cse.ucdavis.edu

    2009-09-11

    We give a systematic expansion of the crypticity, a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite cryptic order, in which the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy, the mutual information between a process's infinite past and infinite future, which is finite and exact for finite-order cryptic processes. (fast track communication)
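    The excess entropy mentioned above can be estimated, crudely, from block entropies of an observed sequence. The sketch below uses the standard finite-L estimate E(L) = H(L) - L*h(L) with h(L) = H(L) - H(L-1); this is a naive empirical estimator for illustration, not the exact expansion derived in the paper:

```python
import numpy as np
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of length-L blocks, estimated from empirical frequencies."""
    blocks = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    counts = np.array(list(blocks.values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def excess_entropy_estimate(seq, L):
    """Finite-L excess entropy estimate E(L) = H(L) - L * h(L), h(L) = H(L) - H(L-1)."""
    h_L = block_entropy(seq, L)
    h_rate = h_L - block_entropy(seq, L - 1)
    return h_L - L * h_rate

# Example: a period-2 sequence stores exactly 1 bit of past-future information
seq = [0, 1] * 500
print(excess_entropy_estimate(seq, L=4))   # ~1.0 bit
```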

  5. A language for information commerce processes

    NARCIS (Netherlands)

    Aberer, Karl; Wombacher, Andreas

    Automating information commerce requires languages to represent the typical information commerce processes. Existing languages and standards either cover only very specific types of business models or are too general to capture in a concise way the specific properties of information commerce

  6. Information processing, computation, and cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  7. Mechanisms of placebo analgesia: A dual-process model informed by insights from cross-species comparisons.

    Science.gov (United States)

    Schafer, Scott M; Geuter, Stephan; Wager, Tor D

    2018-01-01

    Placebo treatments are pharmacologically inert, but are known to alleviate symptoms across a variety of clinical conditions. Associative learning and cognitive expectations both play important roles in placebo responses, however we are just beginning to understand how interactions between these processes lead to powerful effects. Here, we review the psychological principles underlying placebo effects and our current understanding of their brain bases, focusing on studies demonstrating both the importance of cognitive expectations and those that demonstrate expectancy-independent associative learning. To account for both forms of placebo analgesia, we propose a dual-process model in which flexible, contextually driven cognitive schemas and attributions guide associative learning processes that produce stable, long-term placebo effects. According to this model, the placebo-induction paradigms with the most powerful effects are those that combine reinforcement (e.g., the experience of reduced pain after placebo treatment) with suggestions and context cues that disambiguate learning by attributing perceived benefit to the placebo. Using this model as a conceptual scaffold, we review and compare neurobiological systems identified in both human studies of placebo analgesia and behavioral pain modulation in rodents. We identify substantial overlap between the circuits involved in human placebo analgesia and those that mediate multiple forms of context-based modulation of pain behavior in rodents, including forebrain-brainstem pathways and opioid and cannabinoid systems in particular. This overlap suggests that placebo effects are part of a set of adaptive mechanisms for shaping nociceptive signaling based on its information value and anticipated optimal response in a given behavioral context. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Research and development of models and instruments to define, measure, and improve shared information processing with government oversight agencies. An analysis of the literature, August 1990--January 1992

    Energy Technology Data Exchange (ETDEWEB)

    1992-12-31

    This document identifies elements of sharing, plus key variables of each and their interrelationships. The document's model of sharing is intended to help management systems' users understand what sharing is and how to integrate it with information processing.

  9. Using life cycle information in process discovery

    NARCIS (Netherlands)

    Leemans, S.J.J.; Fahland, D.; Van Der Aalst, W.M.P.; Reichert, M.; Reijers, H.A.

    2016-01-01

    Understanding the performance of business processes is an important part of any business process intelligence project. From historical information recorded in event logs, performance can be measured and visualized on a discovered process model. Thereby the accuracy of the measured performance, e.g.,

  10. Modeling gross primary production of agro-forestry ecosystems by assimilation of satellite-derived information in a process-based model.

    Science.gov (United States)

    Migliavacca, Mirco; Meroni, Michele; Busetto, Lorenzo; Colombo, Roberto; Zenone, Terenzio; Matteucci, Giorgio; Manca, Giovanni; Seufert, Guenther

    2009-01-01

    In this paper we present results obtained in the framework of a regional-scale analysis of the carbon budget of poplar plantations in Northern Italy. We explored the ability of the process-based model BIOME-BGC to estimate the gross primary production (GPP) using an inverse modeling approach exploiting eddy covariance and satellite data. We firstly present a version of BIOME-BGC coupled with the radiative transfer models PROSPECT and SAILH (named PROSAILH-BGC) with the aims of i) improving the BIOME-BGC description of the radiative transfer regime within the canopy and ii) allowing the assimilation of remotely-sensed vegetation index time series, such as MODIS NDVI, into the model. Secondly, we present a two-step model inversion for optimization of model parameters. In the first step, some key ecophysiological parameters were optimized against data collected by an eddy covariance flux tower. In the second step, important information about phenological dates and about standing biomass was optimized against MODIS NDVI. Results obtained showed that the PROSAILH-BGC allowed simulation of MODIS NDVI with good accuracy and that we described the canopy radiation regime better. The inverse modeling approach was demonstrated to be useful for the optimization of ecophysiological model parameters, phenological dates and parameters related to the standing biomass, allowing good accuracy of daily and annual GPP predictions. In summary, this study showed that assimilation of eddy covariance and remote sensing data in a process model may provide important information for modeling gross primary production at regional scale.

  11. Modeling Gross Primary Production of Agro-Forestry Ecosystems by Assimilation of Satellite-Derived Information in a Process-Based Model

    Directory of Open Access Journals (Sweden)

    Guenther Seufert

    2009-02-01

    Full Text Available In this paper we present results obtained in the framework of a regional-scale analysis of the carbon budget of poplar plantations in Northern Italy. We explored the ability of the process-based model BIOME-BGC to estimate the gross primary production (GPP) using an inverse modeling approach exploiting eddy covariance and satellite data. We firstly present a version of BIOME-BGC coupled with the radiative transfer models PROSPECT and SAILH (named PROSAILH-BGC) with the aims of (i) improving the BIOME-BGC description of the radiative transfer regime within the canopy and (ii) allowing the assimilation of remotely-sensed vegetation index time series, such as MODIS NDVI, into the model. Secondly, we present a two-step model inversion for optimization of model parameters. In the first step, some key ecophysiological parameters were optimized against data collected by an eddy covariance flux tower. In the second step, important information about phenological dates and about standing biomass was optimized against MODIS NDVI. Results obtained showed that the PROSAILH-BGC allowed simulation of MODIS NDVI with good accuracy and that we described the canopy radiation regime better. The inverse modeling approach was demonstrated to be useful for the optimization of ecophysiological model parameters, phenological dates and parameters related to the standing biomass, allowing good accuracy of daily and annual GPP predictions. In summary, this study showed that assimilation of eddy covariance and remote sensing data in a process model may provide important information for modeling gross primary production at regional scale.
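    The two-step inversion described above can be sketched generically: step one fits ecophysiological parameters to tower GPP, step two fits phenology/biomass parameters to the satellite NDVI series. The sketch below uses placeholder model functions and a simple RMSE cost; it only illustrates the structure of the procedure, not the actual BIOME-BGC/PROSAILH-BGC coupling.

```python
import numpy as np
from scipy.optimize import minimize

def rmse(pred, obs):
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

def two_step_calibration(simulate_gpp, simulate_ndvi, gpp_obs, ndvi_obs,
                         ecophys_init, pheno_init):
    """Generic two-step inverse modelling.

    simulate_gpp(ecophys_params)                -> modelled GPP series
    simulate_ndvi(ecophys_params, pheno_params) -> modelled NDVI series
    Step 1 fits ecophysiological parameters to eddy-covariance GPP;
    step 2 fits phenology/biomass parameters to the satellite NDVI series.
    """
    step1 = minimize(lambda p: rmse(simulate_gpp(p), gpp_obs),
                     ecophys_init, method="Nelder-Mead")
    ecophys = step1.x
    step2 = minimize(lambda p: rmse(simulate_ndvi(ecophys, p), ndvi_obs),
                     pheno_init, method="Nelder-Mead")
    return ecophys, step2.x

# Toy usage with synthetic linear "models" standing in for the real simulators
gpp_obs = np.array([1.0, 2.0, 3.0])
ndvi_obs = np.array([0.2, 0.4, 0.6])
sim_gpp = lambda p: p[0] * np.array([1.0, 2.0, 3.0])
sim_ndvi = lambda e, p: p[0] * e[0] * np.array([0.2, 0.4, 0.6])
print(two_step_calibration(sim_gpp, sim_ndvi, gpp_obs, ndvi_obs, [0.5], [0.5]))
```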

  12. Information Processing and Human Abilities

    Science.gov (United States)

    Kirby, John R.; Das, J. P.

    1978-01-01

    The simultaneous and successive processing model of cognitive abilities was compared to a traditional primary mental abilities model. Simultaneous processing was found to be primarily related to spatial ability; and to a lesser extent, to memory and inductive reasoning. Subjects were 104 fourth-grade urban males. (Author/GD C)

  13. Capturing the role of awareness and information search processes on Choice Set Formation in Models of Activity-Travel Behavior

    NARCIS (Netherlands)

    Arentze, T.A.; Timmermans, H.J.P.

    2004-01-01

    Individuals choose locations for conducting specific activities generally under limited information conditions and update their mental map during traveling and implementing activities. In an earlier study, we proposed a Bayesian belief network to model an individual's mental map and spatial

  14. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Although formal and informal quality aspects are of significant importance to business process modeling, there is little empirical work reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for quality of process models and focus

  15. Causal Relationship Model of the Information and Communication Technology Skill Affect the Technology Acceptance Process in the 21ST Century for Undergraduate Students

    Directory of Open Access Journals (Sweden)

    Thanyatorn Amornkitpinyo

    2015-02-01

    Full Text Available The objective of this study is to design a framework for a causal relationship model of the Information and Communication Technology skills that affect the Technology Acceptance Process (TAP) for undergraduate students in the 21st century. This research uses correlational analysis. The research methodology is divided into two sections. The first section synthesizes a conceptual framework for the causal relationship model of the Information and Communication Technology skills that affect the Technology Acceptance Process for undergraduate students in the 21st century. The second section proposes the design framework of the model. The research findings are as follows: (1) The exogenous latent variables included in the model are basic ICT skills and self-efficacy. (2) The mediating latent variables are drawn from the TAM model and include three components: perceived usefulness, perceived ease of use, and attitudes. (3) The outcome latent variable of the model is behavioural intention.

  16. Modeling of the Operating Information for System of Logistical Support of the Hardware-software Means of Safety of the Distributed Systems for Data Processing

    Directory of Open Access Journals (Sweden)

    A. A. Durakovsky

    2010-03-01

    Full Text Available A technique is proposed for the information modeling of the processes and procedures involved in preparing operating information for the logistical support system of the technological processes of operating and servicing hardware-software security means of distributed data-processing systems. The procedures for preparing operating information for the logistical support system of APSOB РСОД include: development and formalization of the algorithm of functioning; construction of a model of functioning that allows the operational risk to be calculated; and decomposition of the model and classification of its objects so that all elements of the operating information are described unambiguously and the relations between information units are mutually coordinated.

  17. Proprioceptive information processing in schizophrenia

    DEFF Research Database (Denmark)

    Arnfred, Sidse M H

    of the left somatosensory cortex and it was suggested to be in accordance with two theories of schizophrenic information processing: the theory of deficiency of corollary discharge and the theory of weakening of the influence of past regularities. No gating deficiency was observed and the imprecision... Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event-related potentials. Event-related EEG can be analyzed as conventional time... and amplitude attenuation was not a general phenomenon across the entire brain response. Summing up, in support of Rado's hypothesis, schizophrenia spectrum patients demonstrated abnormalities in proprioceptive information processing. Future work needs to extend the findings in larger un-medicated, non

  18. Combining Livestock Production Information in a Process-Based Vegetation Model to Reconstruct the History of Grassland Management

    Science.gov (United States)

    Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui

    2016-01-01

    Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5° by 0.5°. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 x 10^6 km^2 in 1901 to 12.3 x 10^6 km^2 in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated augmentation in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at global scale but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus specific attention was given to the evaluation of modelled productivity against a series of observations, from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS-GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and inter-annual variability of grassland productivity at global

  19. Impacts of Irrigation and Climate Change on Water Security: Using Stakeholder Engagement to Inform a Process-based Crop Model

    Science.gov (United States)

    Leonard, A.; Flores, A. N.; Han, B.; Som Castellano, R.; Steimke, A.

    2016-12-01

    Irrigation is an essential component of agricultural production in arid and semi-arid regions, accounting for a majority of global freshwater withdrawals used for human consumption. Since climate change affects both the spatiotemporal demand and availability of water in irrigated areas, agricultural productivity and water efficiency depend critically on how producers adapt and respond to climate change. It is necessary, therefore, to understand the coevolution and feedbacks between humans and agricultural systems. Integration of social and hydrologic processes can be achieved by active engagement with local stakeholders and applying their expertise to models of coupled human-environment systems. Here, we use a process-based crop simulation model (EPIC) informed by stakeholder engagement to determine how both farm management and climate change influence regional agricultural water use and production in the Lower Boise River Basin (LBRB) of southwest Idaho. Specifically, we investigate how a shift from flood to sprinkler irrigation would impact the watershed's overall agricultural water use under RCP 4.5 and RCP 8.5 climate scenarios. The LBRB comprises about 3500 km2, of which 20% is dedicated to irrigated crops and another 40% to grass/pasture grazing land. Via interviews of stakeholders in the LBRB, we have determined that approximately 70% of irrigated lands in the region are flood irrigated. We model four common crops produced in the LBRB (alfalfa, corn, winter wheat, and sugarbeets) to investigate both hydrologic and agricultural impacts of irrigation and climatic drivers. Factors influencing farmers' decision to switch from flood to sprinkler irrigation include potential economic benefits, external financial incentives, and providing a buffer against future water shortages. These two irrigation practices are associated with significantly different surface water and energy budgets, and large-scale shifts in practice could substantially impact regional
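
    The scenario question posed here, how basin-scale withdrawals change if flood-irrigated area converts to sprinkler, can be framed with a back-of-the-envelope efficiency calculation such as the Python sketch below. The application efficiencies and the seasonal crop water demand are invented assumptions, not LBRB values or EPIC output; only the basin area, the 20% irrigated share and the 70% flood share are taken from the abstract.

        def withdrawal(area_km2, crop_demand_mm, efficiency):
            """Water withdrawn (million m^3) to meet crop demand at a given application efficiency."""
            return area_km2 * 1e6 * (crop_demand_mm / 1000.0) / efficiency / 1e6

        irrigated_km2 = 0.20 * 3500                 # 20% of the basin is irrigated cropland
        flood_share = 0.70                          # share of irrigated land under flood irrigation
        demand_mm = 600                             # seasonal crop water demand (assumed)
        eff = {"flood": 0.50, "sprinkler": 0.75}    # assumed application efficiencies

        baseline = (withdrawal(flood_share * irrigated_km2, demand_mm, eff["flood"])
                    + withdrawal((1 - flood_share) * irrigated_km2, demand_mm, eff["sprinkler"]))
        all_sprinkler = withdrawal(irrigated_km2, demand_mm, eff["sprinkler"])
        print(f"baseline ~ {baseline:.0f} Mm^3, all-sprinkler ~ {all_sprinkler:.0f} Mm^3")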

  20. INFORMATION MODEL OF SOCIAL TRANSFORMATIONS

    Directory of Open Access Journals (Sweden)

    Мария Васильевна Комова

    2013-09-01

    Full Text Available The social transformation is considered as a process of qualitative changes of the society, creating a new level of organization in all areas of life, in different social formations, and in societies of different types of development. The purpose of the study is to create a universal model for studying social transformations based on their understanding as the consequence of the information exchange processes in the society. After defining the conceptual model of the study, the author uses the following methods: the descriptive method, analysis, synthesis, and comparison. Information, objectively existing in all elements and systems of the material world, is an integral attribute of the society's transformation as well. The information model of social transformations is based on the definition of the society's transformation as the change in the information that functions in the society's information space. The study of social transformations is the study of information flows circulating in the society, characterized by different spatial, temporal, and structural states. Social transformations are a highly integrated system of social processes and phenomena, the nature, course and consequences of which are affected by factors representing the whole complex of material objects. The integrated information model of social transformations foresees the interaction of the following components: social memory, information space, and the social ideal. To determine the dynamics and intensity of social transformations the author uses the notions of "information threshold of social transformations" and "information pressure". Thus, the universal nature of information leads to considering social transformations as a system of information exchange processes. Social transformations can be extended to any episteme actualized by social needs. The establishment of an information threshold allows to simulate the course of social development, to predict the

  1. Relationships: empirical contribution. Understanding personality pathology in adolescents: the five factor model of personality and social information processing.

    Science.gov (United States)

    Hessels, Christel; van den Hanenberg, Danique; de Castro, Bram Orobio; van Aken, Marcel A G

    2014-02-01

    This study seeks to integrate two research traditions that lie at the base of the understanding of personality pathology in adolescents. The first research tradition refers to normal personality according to the Five Factor Model (FFM). The second tradition specifies the key feature of personality disorder as the capacity to mentalize, which can be reflected in Social Information Processing (SIP). In a clinical sample of 96 adolescents, the authors investigated response generation, coping strategy, and memories of past frustrating experiences (as part of SIP) as mediators of the relationship between personality and personality pathology, as well as a possible moderating role of personality in the relationship between SIP and personality pathology. The hypothesized mediation, by which the effects of personality dimensions on personality pathology were expected to be mediated by SIP variables, was found only for the effect of Neuroticism, most specifically on BPD, which appeared to be mediated by the memories the patients had about past frustrating conflict situations with peers. Some moderating effects of personality on the relationship between SIP variables and personality pathology were found, suggesting that high Agreeableness and sometimes low Neuroticism can buffer this relationship. These results suggest that personality dimensions and social cognitions both independently and together play a role in adolescents' personality pathology.

  2. Accuracy in Optical Information Processing

    Science.gov (United States)

    Timucin, Dogan Aslan

    Low computational accuracy is an important obstacle that prevents optical processors from becoming a practical reality and a serious challenger to classical computing paradigms. This research presents a comprehensive solution approach to the problem of accuracy enhancement in discrete analog optical information processing systems. Statistical analysis of a generic three-plane optical processor is carried out first, taking into account the effects of diffraction, interchannel crosstalk, and background radiation. Noise sources included in the analysis are photon, excitation, and emission fluctuations in the source array, transmission and polarization fluctuations in the modulator, and photoelectron, gain, dark, shot, and thermal noise in the detector array. Means, mutual coherence, and probability density functions are derived for both optical and electrical output signals. Next, statistical models for a number of popular optoelectronic devices are studied. Specific devices considered here are light-emitting and laser diode sources, an ideal noiseless modulator and a Gaussian random-amplitude-transmittance modulator, p-i-n and avalanche photodiode detectors followed by electronic postprocessing, and ideal free-space geometrical-optics propagation and single-lens imaging systems. Output signal statistics are determined for various interesting device combinations by inserting these models into the general formalism. Finally, based on these special-case output statistics, results on accuracy limitations and enhancement in optical processors are presented. Here, starting with the formulation of the accuracy enhancement problem as (1) an optimal detection problem and (2) a parameter estimation problem, the potential accuracy improvements achievable via the classical multiple-hypothesis-testing and maximum likelihood and Bayesian parameter estimation methods are demonstrated. Merits of using proper normalizing transforms which can potentially stabilize
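
    The parameter-estimation view of accuracy enhancement can be made concrete with a toy maximum-likelihood read-out of a discrete analog level from repeated noisy detector observations. The Gaussian noise model, the 16-level alphabet and the numbers below are assumptions for illustration, not the device models analysed in the thesis.

        import numpy as np

        rng = np.random.default_rng(2)
        levels = np.linspace(0.0, 1.0, 16)   # 4-bit discrete analog levels (assumed)
        sigma = 0.08                         # detector noise standard deviation (assumed)

        def ml_readout(observations):
            """Maximum-likelihood level under i.i.d. Gaussian noise: the level nearest the sample mean."""
            return levels[np.argmin((levels - observations.mean()) ** 2)]

        true = levels[9]
        for n_obs in (1, 4, 16):
            trials = rng.normal(true, sigma, size=(2000, n_obs))
            error_rate = np.mean([ml_readout(tr) != true for tr in trials])
            print(f"{n_obs:2d} observation(s): error rate ~ {error_rate:.3f}")

    Averaging more observations before deciding shrinks the effective noise and drives the read-out error down, which is the basic mechanism behind the estimation-based accuracy improvements discussed above.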

  3. The prototypical ranitidine analog JWS-USC-75-IX improves information processing and cognitive function in animal models.

    Science.gov (United States)

    Terry, Alvin V; Buccafusco, Jerry J; Herman, Elizabeth J; Callahan, Patrick M; Beck, Wayne D; Warner, Samantha; Vandenhuerk, Leah; Bouchard, Kristy; Schwarz, Gary M; Gao, Jie; Chapman, James M

    2011-03-01

    This study was designed to evaluate further a prototypical ranitidine analog, JWS-USC-75-IX, [(3-[[[2-[[(5-dimethylaminomethyl)-2-furanyl]methyl]thio]ethyl]amino]-4-nitropyridazine, JWS], for neuropharmacologic properties that would theoretically be useful for treating cognitive and noncognitive behavioral symptoms of neuropsychiatric disorders. JWS was previously found to inhibit acetylcholinesterase (AChE) activity, serve as a potent ligand at muscarinic M₂ acetylcholine receptors, and elicit positive effects on spatial learning, passive avoidance, and working memory in rodents. In the current study, JWS was evaluated for binding activity at more than 60 neurotransmitter receptors, transporters, and ion channels, as well as for inhibitory activity at AChE and butyrylcholinesterase (BChE). The results indicate that JWS inhibits AChE and BChE at low (micromolar) concentrations and that it is a functional antagonist at M₂ receptors (K(B) = 320 nM). JWS was subsequently evaluated orally across additional behavioral assays in rodents (dose range, 0.03-10.0 mg/kg) as well as nonhuman primates (dose range, 0.05-2.0 mg/kg). In rats, JWS improved prepulse inhibition (PPI) of the acoustic startle response in nonimpaired rats and attenuated PPI deficits in three pharmacologic impairment models. JWS also attenuated scopolamine and (-)-5-methyl-10,11-dihydro-5H-dibenzo[a,d]cyclohepten-5,10-imine maleate (MK-801)-related impairments in a spontaneous novel object recognition task and a five-choice serial reaction time task, respectively. In monkeys, JWS elicited dose-dependent improvements of a delayed match-to-sample task as well as an attention-related version of the task where randomly presented (task-relevant) distractors were presented. Thus, JWS (potentially via effects at several drug targets) improves information processing, attention, and memory in animal models and could potentially treat the cognitive and behavioral symptoms of some neuropsychiatric illnesses.

  4. Classicality of quantum information processing

    International Nuclear Information System (INIS)

    Poulin, David

    2002-01-01

    The ultimate goal of the classicality program is to quantify the amount of quantumness of certain processes. Here, classicality is studied for a restricted type of process: quantum information processing (QIP). Under special conditions, one can force some qubits of a quantum computer into a classical state without affecting the outcome of the computation. The minimal set of conditions is described and its structure is studied. Some implications of this formalism are the increase of noise robustness, a proof of the quantumness of mixed state quantum computing, and a step forward in understanding the very foundation of QIP

  5. Information processing among high-performance managers

    Directory of Open Access Journals (Sweden)

    S.C. Garcia-Santos

    2010-01-01

    Full Text Available The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework considers three models: the Theory of Managerial Roles of Henry Mintzberg, the Theory of Information Processing, and the Process Model of Response to the Rorschach by John Exner. The participants were evaluated using the Rorschach method. The results show that these managers are able to collect data, evaluate them, and establish rankings properly. At the same time, they are capable of being objective and accurate in assessing problems. This information processing style permits an interpretation of the surrounding world on the basis of a very personal and characteristic processing mode, or cognitive style.

  6. Building information modelling (BIM)

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2009-02-01

    Full Text Available The concept of a Building Information Model (BIM), also known as a Building Product Model (BPM), is nothing new. A short article on BIM can never cover the entire field, because it is a particularly complex field that has only recently begun to receive...

  7. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    OpenAIRE

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning...

  8. Sampling and assessment accuracy in mate choice: a random-walk model of information processing in mating decision.

    Science.gov (United States)

    Castellano, Sergio; Cermelli, Paolo

    2011-04-07

    Mate choice depends on mating preferences and on the manner in which mate-quality information is acquired and used to make decisions. We present a model that describes how these two components of mating decision interact with each other during a comparative evaluation of prospective mates. The model, with its well-explored precedents in psychology and neurophysiology, assumes that decisions are made by the integration over time of noisy information until a stopping-rule criterion is reached. Due to this informational approach, the model builds a coherent theoretical framework for developing an integrated view of functions and mechanisms of mating decisions. From a functional point of view, the model allows us to investigate speed-accuracy tradeoffs in mating decision at both population and individual levels. It shows that, under strong time constraints, decision makers are expected to make fast and frugal decisions and to optimally trade off population-sampling accuracy (i.e. the number of sampled males) against individual-assessment accuracy (i.e. the time spent for evaluating each mate). From the proximate-mechanism point of view, the model makes testable predictions on the interactions of mating preferences and choosiness in different contexts and it might be of compelling empirical utility for a context-independent description of mating preference strength. Copyright © 2011 Elsevier Ltd. All rights reserved.
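
    The sequential-sampling mechanism the model builds on, noisy evidence integrated over time until a stopping criterion is reached, can be sketched as a simple bounded random walk. The drift scale, noise level and threshold below are arbitrary assumptions, not parameters estimated in the paper.

        import random

        def assess_mate(quality, threshold=5.0, drift_scale=0.1, noise=1.0, max_steps=10_000):
            """Accumulate noisy evidence about one prospective mate until a bound is hit.

            Returns (decision, time): 'accept' at the upper bound, 'reject' at the lower bound.
            """
            evidence = 0.0
            for t in range(1, max_steps + 1):
                evidence += drift_scale * quality + random.gauss(0.0, noise)
                if evidence >= threshold:
                    return "accept", t
                if evidence <= -threshold:
                    return "reject", t
            return "undecided", max_steps

        random.seed(1)
        for q in (-1.0, 0.0, 2.0):   # low, ambiguous and high mate quality
            print(q, assess_mate(q))

    Raising the threshold buys individual-assessment accuracy at the cost of longer evaluation times, which is exactly the currency of the speed-accuracy trade-off the model explores.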

  9. Principles of neural information processing

    CERN Document Server

    Seelen, Werner v

    2016-01-01

    In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of Neural Information Processing as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely to organize behavior so as to ensure survival, together with an understanding of the evolutionary genesis of the brain. The principles and strategies developed include the self-organization of neural systems, flexibility, the active interpretation of the world by means of construction and prediction, and the embedding of neural systems in the world, all of which form the framework of the presented description. Since, in brains, partial self-organization, lifelong adaptation and the use of various methods of processing incoming information are all interconnected, the authors have chosen not only neurobiology and evolution theory as a basis for the elaboration of such a framework, but also syst...

  10. On constitutive modelling and information for phenomenal distributed parameter control of multicomponent chemical processes in fluid- and solidphase

    International Nuclear Information System (INIS)

    Niemiec, W.

    1985-01-01

    The problem under consideration is to find the common physicochemical conditions governing the kinetics and phenomena of multicomponent chemical processes in the fluid and solid phase, which determine the yield and quality of the final products of these processes. The paper is devoted to the construction of a fundamental distributed-parameter constitutive theory for the physicochemical modelling of these chemical processes, treated as occurring in isotropic and anisotropic nonhomogeneous media with space and time memories. On the basis of the definition of the derivative and the constitutive equations of continuity, an original system of partial differential constitutive state equations is deduced.

  11. Markovian Processes for Quantitative Information Leakage

    DEFF Research Database (Denmark)

    Biondi, Fabrizio

    Quantification of information leakage is a successful approach for evaluating the security of a system. It models the system to be analyzed as a channel with the secret as the input and an output observable by the attacker as the output, and applies information theory to quantify the amount of information transmitted through such a channel, thus effectively quantifying how many bits of the secret can be inferred by the attacker by analyzing the system's output. Channels are usually encoded as matrices of conditional probabilities, known as channel matrices. Such matrices grow exponentially … and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some …
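
    For intuition, the leakage of a channel described by a matrix of conditional probabilities is the mutual information between the secret and the observable output; the small Python sketch below computes it for a made-up 2-secret, 3-output channel under a uniform prior. The example matrix is illustrative only and is not QUAIL output.

        import math

        def leakage(prior, channel):
            """Shannon leakage I(secret; output) in bits for a discrete channel.

            prior   -- list of P(secret = s)
            channel -- channel[s][o] = P(output = o | secret = s), the channel matrix
            """
            n_out = len(channel[0])
            p_out = [sum(prior[s] * channel[s][o] for s in range(len(prior)))
                     for o in range(n_out)]
            mi = 0.0
            for s, ps in enumerate(prior):
                for o in range(n_out):
                    joint = ps * channel[s][o]
                    if joint > 0:
                        mi += joint * math.log2(joint / (ps * p_out[o]))
            return mi

        channel = [[0.7, 0.2, 0.1],    # illustrative channel matrix
                   [0.1, 0.3, 0.6]]
        print(round(leakage([0.5, 0.5], channel), 4), "bits leaked")

    The exponential growth of such matrices with the number of secret and output values is precisely what motivates the symbolic Markovian representation used in the work above.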

  12. Modelling Choice of Information Sources

    Directory of Open Access Journals (Sweden)

    Agha Faisal Habib Pathan

    2013-04-01

    Full Text Available This paper addresses the significance of traveller information sources, including mono-modal and multimodal websites, for travel decisions. The research follows a decision paradigm developed earlier, involving an information acquisition process for travel choices, and identifies the abstract characteristics of new information sources that deserve further investigation (e.g. by incorporating these in models and studying their significance in model estimation). A Stated Preference experiment is developed and the utility functions are formulated by expanding the travellers' choice set to include different combinations of sources of information. In order to study the underlying choice mechanisms, the resulting variables are examined in models based on different behavioural strategies, including utility maximisation and minimisation of the regret associated with the foregone alternatives. This research confirmed that RRM (Random Regret Minimisation) theory can fruitfully be used and can provide important insights for behavioural studies. The study also analyses the properties of travel planning websites and establishes a link between travel choices and the content, provenance, design, presence of advertisements, and presentation of information. The results indicate that travellers give particular credence to government-owned sources and put more importance on their own previous experiences than on any other single source of information. Information from multimodal websites is more influential than that on train-only websites. This in turn is more influential than information from friends, while information from coach-only websites is the least influential. A website with less search time, specific information on users' own criteria, and real-time information is regarded as most attractive.
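
    To make the contrast between utility maximisation and regret minimisation concrete, the sketch below evaluates a Chorus-style RRM regret function over a few information sources. The attribute scores and taste coefficients are invented for illustration and do not come from the Stated Preference data described above.

        import math

        def regret(i, alternatives, betas):
            """Random-regret value of alternative i (lower regret is better)."""
            r = 0.0
            for j, alt in enumerate(alternatives):
                if j == i:
                    continue
                for m, beta in enumerate(betas):
                    # regret accumulates whenever competitor j beats i on attribute m
                    r += math.log(1.0 + math.exp(beta * (alt[m] - alternatives[i][m])))
            return r

        # Illustrative attribute scores (higher is better): [credibility, real-time content]
        sources = {"government site": [0.9, 0.6],
                   "multimodal site": [0.7, 0.9],
                   "friends": [0.5, 0.2]}
        names = list(sources)
        alts = [sources[n] for n in names]
        betas = [1.0, 0.8]   # assumed taste coefficients
        scores = {n: round(regret(i, alts, betas), 3) for i, n in enumerate(names)}
        print(min(scores, key=scores.get), scores)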

  13. Abnormal presynaptic short-term plasticity and information processing in a mouse model of fragile X syndrome.

    Science.gov (United States)

    Deng, Pan-Yue; Sojka, David; Klyachko, Vitaly A

    2011-07-27

    Fragile X syndrome (FXS) is the most common inherited form of intellectual disability and the leading genetic cause of autism. It is associated with the lack of fragile X mental retardation protein (FMRP), a regulator of protein synthesis in axons and dendrites. Studies on FXS have extensively focused on the postsynaptic changes underlying dysfunctions in long-term plasticity. In contrast, the presynaptic mechanisms of FXS have garnered relatively little attention and are poorly understood. Activity-dependent presynaptic processes give rise to several forms of short-term plasticity (STP), which is believed to control some essential neural functions, including information processing, working memory, and decision making. The extent of STP defects and their contributions to the pathophysiology of FXS remain essentially unknown, however. Here we report marked presynaptic abnormalities at excitatory hippocampal synapses in Fmr1 knock-out (KO) mice leading to defects in STP and information processing. Loss of FMRP led to enhanced responses to high-frequency stimulation. Fmr1 KO mice also exhibited abnormal synaptic processing of natural stimulus trains, specifically excessive enhancement during the high-frequency spike discharges associated with hippocampal place fields. Analysis of individual STP components revealed strongly increased augmentation and reduced short-term depression attributable to loss of FMRP. These changes were associated with exaggerated calcium influx in presynaptic neurons during high-frequency stimulation, enhanced synaptic vesicle recycling, and enlarged readily releasable and reserve vesicle pools. These data suggest that loss of FMRP causes abnormal STP and information processing, which may represent a novel mechanism contributing to cognitive impairments in FXS.

  14. Information Processing Models of Skilled Reading: The Relationship Between Chinese Orthography and Coding Tactics in Primary Memory.

    Science.gov (United States)

    Shwedel, Allan M.

    A probe recall short-term retention task was used to test the applicability of the "phonological recoding" (Conrad, 1972) and "flexible decoding" (Smith, 1972) models to processing tactics used by readers of Chinese. Subjects were 45 adult speakers of Cantonese. Stimuli were lists of Chinese characters which varied in terms of phonological and…

  15. The Mediation of Mothers’ Self-Fulfilling Effects on Their Children’s Alcohol Use: Self-Verification, Informational Conformity and Modeling Processes

    Science.gov (United States)

    Madon, Stephanie; Guyll, Max; Buller, Ashley A.; Scherr, Kyle C.; Willard, Jennifer; Spoth, Richard

    2010-01-01

    This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N1 = 487; N2 = 287). Children's alcohol use was the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. PMID:18665708
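
    The indirect-effect logic reported here (mothers' beliefs shaping children's self-assessed likelihood of drinking, which in turn predicts later use) corresponds to a product-of-coefficients mediation estimate. The sketch below runs that calculation on synthetic data only; the coefficients, noise and sample size are fabricated for illustration and say nothing about the study's actual estimates.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic data standing in for the longitudinal measures.
        n = 500
        belief = rng.normal(size=n)                                   # mother's belief
        mediator = 0.5 * belief + rng.normal(size=n)                  # child's self-assessed likelihood
        outcome = 0.4 * mediator + 0.1 * belief + rng.normal(size=n)  # later alcohol use

        # Path a: mediator regressed on belief.
        a = np.linalg.lstsq(np.column_stack([np.ones(n), belief]), mediator, rcond=None)[0][1]
        # Path b: outcome regressed on mediator while controlling for belief.
        b = np.linalg.lstsq(np.column_stack([np.ones(n), belief, mediator]), outcome, rcond=None)[0][2]

        print("indirect effect a*b ~", round(a * b, 3))   # here roughly 0.5 * 0.4 = 0.2 by construction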

  16. The mediation of mothers' self-fulfilling effects on their children's alcohol use: self-verification, informational conformity, and modeling processes.

    Science.gov (United States)

    Madon, Stephanie; Guyll, Max; Buller, Ashley A; Scherr, Kyle C; Willard, Jennifer; Spoth, Richard

    2008-08-01

    This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N-sub-1 = 486; N-sub-2 = 287), with children's alcohol use as the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. (c) 2008 APA, all rights reserved

  17. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  18. Advanced Information Processing System (AIPS)

    Science.gov (United States)

    Pitts, Felix L.

    1993-01-01

    Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base that will allow the achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, with failure probability requirements of 10E-9 at 10 hours. A background and description are given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.

  19. Moral Judgment as Information Processing: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Steve eGuglielmo

    2015-10-01

    Full Text Available This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.

  20. Motivated information processing and group decision refusal

    NARCIS (Netherlands)

    Nijstad, Bernard A.; Oltmanns, Jan

    Group decision making has attracted much scientific interest, but few studies have investigated group decisions that do not get made. Based on the Motivated Information Processing in Groups model, this study analysed the effect of epistemic motivation (low vs. high) and social motivation (proself

  1. The Effects of Cassady and Justin's Functional Model for Emotional Information Processing on Improving Social Competence of First Grade Children with ADHD

    Science.gov (United States)

    Eissa, Mourad Ali

    2017-01-01

    This study explores whether or not an Emotional Information Processing (EIP) model intervention has positive effects on social competency in first-grade children with ADHD. Ten first-grade primary school children who had been identified as having ADHD using the Attention-Deficit Hyperactivity Disorder Test (ADHDT) (Jeong, 2005) and who were experiencing social problems…

  2. Quantum information processing in nanostructures

    International Nuclear Information System (INIS)

    Reina Estupinan, John-Henry

    2002-01-01

    Since information has been regarded as a physical entity, the field of quantum information theory has blossomed. This brings novel applications, such as quantum computation. This field has attracted the attention of numerous researchers with backgrounds ranging from computer science, mathematics and engineering, to the physical sciences. Thus, we now have an interdisciplinary field where great efforts are being made in order to build devices that should allow for the processing of information at a quantum level, and also in the understanding of the complex structure of some physical processes at a more basic level. This thesis is devoted to the theoretical study of structures at the nanometer scale, 'nanostructures', through physical processes that mainly involve the solid state and quantum optics, in order to propose reliable schemes for the processing of quantum information. Initially, the main results of quantum information theory and quantum computation are briefly reviewed. Next, the state of the art of quantum dot technology is described. In so doing, the theoretical background and the practicalities required for this thesis are introduced. A discussion of the current quantum hardware used for quantum information processing is given. In particular, the solid-state proposals to date are emphasised. A detailed prescription is given, using an optically-driven coupled quantum dot system, to reliably prepare and manipulate exciton maximally entangled Bell and Greenberger-Horne-Zeilinger (GHZ) states. Manipulation of the strength and duration of the selective light-pulses needed for producing these highly entangled states provides us with crucial elements for the processing of solid-state based quantum information. The all-optical generation of states of the so-called Bell basis for a system of two quantum dots (QDs) is exploited for performing the quantum teleportation of the excitonic state of a dot in an array of three coupled QDs. Theoretical predictions suggest

  3. Executive Information Systems' Multidimensional Models

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Executive Information Systems are designed to improve the quality of strategic-level management in an organization through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that the organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence Tools. But in order to build analytic reports for Executive Information Systems (EIS) in an organization, we need to design a multidimensional model based on the organization's business model. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.

  4. THE MODEL OF DISTINCTION OF ACCESS RIGHTS TO INFORMATION OBJECTS OF THE SYSTEM OF CONTROLLING OF BUSINESS PROCESSES OF AN AVIATION ENTERPRISE

    Directory of Open Access Journals (Sweden)

    Andrey V. Degtyarev

    2014-01-01

    Full Text Available Based on an analysis of the system for controlling the business processes of an aviation enterprise, an approach is formulated for setting up a hierarchical model of personal access permissions to the information resources of an automated system for controlling projects and contracts (ASCPC) at the instrumental and procedural levels. On the basis of this model, the structure of a personalized key was developed. The model reflects the capabilities of every category of user when working with the ASCPC.

  5. Information model of economy

    Directory of Open Access Journals (Sweden)

    N.S.Gonchar

    2006-01-01

    Full Text Available A new stochastic model of the economy is developed that takes into account that the choices of consumers are dependent random fields. Axioms of such a model are formulated. The existence of the random fields of consumer choice and of decision making by firms is proved. New notions of conditionally independent random fields and of random fields of evaluation of information by consumers are introduced. Using the above-mentioned random fields, the random fields of consumer choice and of decision making by firms are constructed. The theory of economic equilibrium is developed.

  6. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  7. Motivated information processing and group decision-making : Effects of process accountability on information processing and decision quality

    NARCIS (Netherlands)

    Scholten, Lotte; van Knippenberg, Daan; Nijstad, Bernard A.; De Dreu, Carsten K. W.

    Integrating dual-process models [Chaiken, S., & Trope, Y. (Eds.). (1999). Dual-process theories in social psychology. NewYork: Guilford Press] with work on information sharing and group decision-making [Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: biased

  8. Information systems process and practice

    CERN Document Server

    Urquhart, Christine; Tbaishat, Dina; Yeoman, Alison

    2017-01-01

    This book adopts a holistic interpretation of information architecture, to offer a variety of methods, tools, and techniques that may be used when designing websites and information systems that support workflows and what people require when 'managing information'.

  9. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  10. Optimal Information Processing in Biochemical Networks

    Science.gov (United States)

    Wiggins, Chris

    2012-02-01

    A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the 'intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrating cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
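
    The basic calculation referred to above, obtaining an information-theoretic quantity from a joint copy-number distribution, can be illustrated for a toy two-species system. The Poisson-style joint distribution below is invented for the example and is not taken from the talk.

        import math
        import numpy as np

        def mutual_information(p_xy):
            """I(X;Y) in bits for a joint probability matrix p_xy[i, j] = P(X=i, Y=j)."""
            p_xy = p_xy / p_xy.sum()
            px = p_xy.sum(axis=1, keepdims=True)
            py = p_xy.sum(axis=0, keepdims=True)
            nz = p_xy > 0
            return float((p_xy[nz] * np.log2(p_xy[nz] / (px @ py)[nz])).sum())

        def poisson_pmf(lam, kmax):
            return np.array([math.exp(-lam) * lam ** k / math.factorial(k) for k in range(kmax + 1)])

        # Toy joint distribution: input copy number X ~ Poisson(2),
        # output copy number Y | X = k ~ Poisson(k + 0.5), i.e. small-copy-number noise.
        kmax = 15
        p_x = poisson_pmf(2.0, kmax)
        joint = np.vstack([p_x[k] * poisson_pmf(k + 0.5, kmax) for k in range(kmax + 1)])
        print(round(mutual_information(joint), 3), "bits transmitted")

    Optimizing information flow then amounts to tuning the parameters that shape this joint distribution, subject to the intrinsic noise constraints mentioned above.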

  11. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety … to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  12. Quantum information processing : science & technology.

    Energy Technology Data Exchange (ETDEWEB)

    Horton, Rebecca; Carroll, Malcolm S.; Tarman, Thomas David

    2010-09-01

    Qubits have been demonstrated using GaAs double quantum dots (DQDs). The qubit basis states are the (1) singlet and (2) triplet stationary states. Long spin decoherence times in silicon spur translation of the GaAs qubit into silicon. In the near term the goals are: (1) Develop surface-gate enhancement-mode double quantum dots (MOS & strained-Si/SiGe) to demonstrate few electrons and spin read-out, and to examine impurity-doped quantum dots as an alternative architecture; (2) Use mobility, C-V, ESR, quantum dot performance & modeling to feed back into and improve processing; this includes development of atomic precision fabrication at SNL; (3) Examine integrated electronics approaches to the RF-SET; (4) Use combinations of numerical packages for multi-scale simulation of quantum dot systems (NEMO3D, EMT, TCAD, SPICE); and (5) Continue micro-architecture evaluation for different device and transport architectures.

  13. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  14. Information processing for aerospace structural health monitoring

    Science.gov (United States)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle cost of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information which indicates both the diagnostics of the current structural integrity as well as the prognostics necessary for planning and managing the future health of the structure in a cost-effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
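
    A minimal example of the signal-processing stage described here, extracting band-limited spectral features from a vibration record before any diagnostic classifier sees them, might look like the sketch below. The synthetic accelerometer trace and the 120 Hz 'damage' component are purely illustrative and are not Boeing data.

        import numpy as np

        fs = 1000.0                          # sampling rate in Hz (assumed)
        t = np.arange(0, 1.0, 1.0 / fs)

        def sensor_record(damaged):
            """Synthetic accelerometer trace: a 50 Hz structural mode plus noise,
            with an extra 120 Hz component standing in for a damage signature."""
            x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)
            if damaged:
                x += 0.5 * np.sin(2 * np.pi * 120 * t)
            return x

        def spectral_features(x, bands=((40, 60), (110, 130))):
            """Band-limited spectral energies used as features for later diagnostics."""
            spec = np.abs(np.fft.rfft(x)) ** 2
            freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
            return [spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

        np.random.seed(0)
        print("healthy:", np.round(spectral_features(sensor_record(False)), 1))
        print("damaged:", np.round(spectral_features(sensor_record(True)), 1))

    In a full SHM system such features would feed the neural-network or statistical decision stages mentioned above, rather than being read off directly.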

  15. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, and from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined-process perspectives. Second, the paper translates this information into a business process model and a mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  16. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on a proper understanding of the functionality of the information systems that are to support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  17. Human Information Processing and Supervisory Control.

    Science.gov (United States)

    1980-05-01

    errors (that is, of the output of the human operator). There is growing evidence (Senders, personal communication; Norman, personal communication) … relates to the relative tendency to depend on sensory information or to be more analytic and independent. Norman (personal communication) has referred … decision process model. Ergonomics, 12, 543-557. Senders, J., Elkid, J., Grignetti, M., & Smallwood, R. 1966. An investigation of the visual sampling…

  18. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-07-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
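
    The capacity measure described here, how well linear readouts of the system's state reproduce independent functions of the input, can be approximated numerically. The sketch below estimates the linear short-term memory of a small random echo-state-style reservoir; the reservoir size, spectral radius and input scaling are arbitrary assumptions, and the in-sample estimate is only a rough stand-in for the paper's procedure.

        import numpy as np

        rng = np.random.default_rng(0)

        # Small random reservoir driven by an i.i.d. input signal.
        N, T, washout = 20, 5000, 200
        W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9 (fading memory)
        w_in = rng.uniform(-0.5, 0.5, size=N)
        u = rng.uniform(-1, 1, size=T)

        x = np.zeros(N)
        states = np.empty((T, N))
        for t in range(T):
            x = np.tanh(W @ x + w_in * u[t])
            states[t] = x

        def capacity_at_delay(k):
            """Squared correlation between u(t-k) and its best linear reconstruction from the state."""
            X = np.column_stack([np.ones(T - washout), states[washout:]])
            y = np.roll(u, k)[washout:]
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return np.corrcoef(y, X @ coef)[0, 1] ** 2

        mc = sum(capacity_at_delay(k) for k in range(1, 2 * N))
        print("estimated linear memory capacity ~", round(mc, 2), "out of at most", N)

    The total stays below the number of state variables, which is the bound the theory above establishes for the full (nonlinear) computational capacity as well.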

  19. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory. PMID:22816038

  20. Practicality of quantum information processing

    Science.gov (United States)

    Lau, Hoi-Kwan

    Quantum Information Processing (QIP) is expected to bring revolutionary enhancement to various technological areas. However, today's QIP applications are far from being practical. The problem involves both hardware issues, i.e., quantum devices are imperfect, and software issues, i.e., the functionality of some QIP applications is not fully understood. Aiming to improve the practicality of QIP, in my PhD research I have studied various topics in quantum cryptography and ion trap quantum computation. In quantum cryptography, I first studied the security of position-based quantum cryptography (PBQC). I discovered a wrong assumption in the previous literature, namely that the cheaters are not allowed to share entangled resources. I proposed entanglement attacks that could cheat all known PBQC protocols. I also studied the practicality of continuous-variable (CV) quantum secret sharing (QSS). While the security of CV QSS was considered in the literature only in the limit of infinite squeezing, I found that finitely squeezed CV resources could also provide a finite secret sharing rate. Our work relaxes the stringent resource requirements of implementing QSS. In ion trap quantum computation, I studied the phase error of quantum information induced by the dc Stark effect during ion transportation. I found an optimized ion trajectory for which the phase error is the minimum. I also defined a threshold speed, above which ion transportation would induce significant error. In addition, I proposed a new application for ion trap systems as universal bosonic simulators (UBS). I introduced two architectures, and discussed their respective strengths and weaknesses. I illustrated the implementations of bosonic state initialization, transformation, and measurement by applying radiation fields or by varying the trap potential. Compared with optical experiments, the ion trap UBS is advantageous in higher state initialization efficiency and higher measurement accuracy. Finally, I

  1. Information processing of earth resources data

    Science.gov (United States)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  2. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    Science.gov (United States)

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed notably regarding to its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation.

  3. Function Model for Community Health Service Information

    Science.gov (United States)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong

    In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard extended from the Structured Analysis and Design Technique (SADT) and now a widely used function modeling method, was used to classify the information from top to bottom. The contents of every level of the model were described and coded. A function model for CHS information was then established, which includes 4 super-classes, 15 classes and 28 sub-classes of business functions, 43 business processes and 168 business activities. This model can facilitate information management system development and workflow refinement.
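
    The top-down decomposition and coding that IDEF0 imposes can be represented by a small tree of coded function nodes, as in the Python sketch below. The class labels and the dotted codes are placeholders invented for illustration, not the paper's actual CHS classification.

        from dataclasses import dataclass, field

        @dataclass
        class FunctionNode:
            """One node of an IDEF0-style function decomposition with a hierarchical code."""
            name: str
            code: str = "A0"
            children: list = field(default_factory=list)

            def add(self, name):
                child = FunctionNode(name, f"{self.code}.{len(self.children) + 1}")
                self.children.append(child)
                return child

            def walk(self, indent=0):
                print(" " * indent + f"{self.code}  {self.name}")
                for c in self.children:
                    c.walk(indent + 2)

        # Placeholder decomposition (illustrative labels only).
        root = FunctionNode("Community health service information")
        records = root.add("Health records management")
        records.add("Register resident health file")
        records.add("Update chronic-disease follow-up")
        root.add("Preventive care services").add("Plan immunisation session")
        root.walk()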

  4. Handbook on neural information processing

    CERN Document Server

    Maggini, Marco; Jain, Lakhmi

    2013-01-01

    This handbook presents some of the most recent topics in neural information processing, covering both theoretical concepts and practical applications. The contributions include: deep architectures; recurrent, recursive, and graph neural networks; cellular neural networks; Bayesian networks; approximation capabilities of neural networks; semi-supervised learning; statistical relational learning; kernel methods for structured data; multiple classifier systems; self-organisation and modal learning; and applications to...

  5. Information Processing in Auto-regulated Systems

    Directory of Open Access Journals (Sweden)

    Karl Javorszky

    2003-06-01

    Full Text Available Abstract: We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection and once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out what is the case from among the alternatives that could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. Following this new approach, a mathematical treatment of some poignant biomathematical problems becomes possible. Also, the concepts presented in this treatise may well have relevance and applications within the information processing and theory of language fields.

  6. Multidimensional biochemical information processing of dynamical patterns.

    Science.gov (United States)

    Hasegawa, Yoshihiko

    2018-02-01

    Cells receive signaling molecules by receptors and relay information via sensory networks so that they can respond properly depending on the type of signal. Recent studies have shown that cells can extract multidimensional information from dynamical concentration patterns of signaling molecules. We herein study how biochemical systems can process multidimensional information embedded in dynamical patterns. We model the decoding networks by linear response functions, and optimize the functions with the calculus of variations to maximize the mutual information between patterns and output. We find that, when the noise intensity is lower, decoders with different linear response functions, i.e., distinct decoders, can extract much information. However, when the noise intensity is higher, distinct decoders do not provide the maximum amount of information. This indicates that, when transmitting information by dynamical patterns, embedding information in multiple patterns is not optimal when the noise intensity is very large. Furthermore, we explore the biochemical implementations of these decoders using control theory and demonstrate that these decoders can be implemented biochemically through the modification of cascade-type networks, which are prevalent in actual signaling pathways.
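
    The decoding step discussed above, applying linear response functions to a dynamical concentration pattern to recover the embedded message, can be caricatured with matched filters. The two input patterns, the kernels and the noise level in the sketch are invented for illustration and are not the optimized responses derived in the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0, 10, 200)

        # Two dynamical input patterns encoding different "messages".
        patterns = {"pulse": np.exp(-((t - 2.0) ** 2)),   # fast transient
                    "ramp": t / t.max()}                   # slow sustained rise

        # Two linear response functions (decoders); here simply normalized matched filters.
        kernels = {name: p / np.linalg.norm(p) for name, p in patterns.items()}

        def decode(signal):
            """Return the name of the kernel whose linear response to the signal is largest."""
            responses = {name: float(k @ signal) for name, k in kernels.items()}
            return max(responses, key=responses.get)

        noise = 0.3   # assumed noise intensity
        for true_name, p in patterns.items():
            observed = p + noise * rng.normal(size=t.size)
            print(f"sent {true_name!r:8} -> decoded {decode(observed)!r}")

    At this noise level the two distinct decoders separate the patterns reliably; as the noise grows, the advantage of spreading the message over multiple patterns shrinks, in line with the result summarized above.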

  7. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  8. Expectation, information processing, and subjective duration.

    Science.gov (United States)

    Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth

    2018-01-01

    In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing-whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.

  9. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    The present thesis considers numerical modeling of activated sludge tanks on municipal wastewater treatment plants. Focus is aimed at integrated modeling where the detailed microbiological model the Activated Sludge Model 3 (ASM3) is combined with a detailed hydrodynamic model based on a numerical...... solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually setup in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially...... hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  10. Continuous-variable quantum information processing

    DEFF Research Database (Denmark)

    Andersen, Ulrik Lund; Leuchs, G.; Silberhorn, C.

    2010-01-01

    the continuous degree of freedom of a quantum system for encoding, processing or detecting information, one enters the field of continuous-variable (CV) quantum information processing. In this paper we review the basic principles of CV quantum information processing with main focus on recent developments...... in the field. We will be addressing the three main stages of a quantum information system; the preparation stage where quantum information is encoded into CVs of coherent states and single-photon states, the processing stage where CV information is manipulated to carry out a specified protocol and a detection...... stage where CV information is measured using homodyne detection or photon counting....

  11. Infochemistry Information Processing at the Nanoscale

    CERN Document Server

    Szacilowski, Konrad

    2012-01-01

    Infochemistry: Information Processing at the Nanoscale, defines a new field of science, and describes the processes, systems and devices at the interface between chemistry and information sciences. The book is devoted to the application of molecular species and nanostructures to advanced information processing. It includes the design and synthesis of suitable materials and nanostructures, their characterization, and finally applications of molecular species and nanostructures for information storage and processing purposes. Divided into twelve chapters; the first three chapters serve as an int

  12. Influence Processes for Information Technology Acceptance

    DEFF Research Database (Denmark)

    Bhattacherjee, Anol; Sanford, Clive Carlton

    2006-01-01

    This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects are persistent over time. Drawing on the elaboration-likelihood model (ELM), we compared two...... alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were respectively operationalized using the argument quality and source credibility constructs, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We...... further examined how these influence processes were moderated by users' IT expertise and perceived job relevance and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern...

  13. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
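
    As a concrete illustration of the decision model at the core of this approach, the sketch below runs value iteration for a toy Markov decision process. The states, actions, transition probabilities and rewards are invented for illustration; the measurement model in the paper additionally links observed action choices to a latent competence parameter (for example via a softmax over action values), which is not reproduced here.

```python
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.9

# P[a, s, t] = probability of moving from state s to state t under action a.
P = np.array([
    [[0.8, 0.2, 0.0], [0.1, 0.8, 0.1], [0.0, 0.2, 0.8]],   # action 0 (cautious)
    [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5], [0.0, 0.0, 1.0]],   # action 1 (risky)
])
# R[s, a] = expected immediate reward for taking action a in state s.
R = np.array([[0.0, 1.0], [0.0, 1.0], [0.0, 5.0]])

# Value iteration: repeatedly back up action values until convergence.
V = np.zeros(n_states)
for _ in range(200):
    Q = R + gamma * np.einsum("ast,t->sa", P, V)
    V = Q.max(axis=1)

policy = Q.argmax(axis=1)
print("optimal state values:", np.round(V, 2))
print("optimal policy (action per state):", policy)
```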

  14. Social Information Processing in Deaf Adolescents

    Science.gov (United States)

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2016-01-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…

  15. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  16. Information processing by neuronal populations

    National Research Council Canada - National Science Library

    Hölscher, Christian; Munk, Matthias

    2009-01-01

    Excerpt from the table of contents: "... simultaneously recorded spike trains" (Mark Laubach, Nandakumar S. Narayanan, and Eyal Y. Kimchi); Part III, "Neuronal population information coding and plasticity in specific brain areas" ...

  17. How the dual process model of human cognition can inform efforts to de-implement ineffective and harmful clinical practices: A preliminary model of unlearning and substitution.

    Science.gov (United States)

    Helfrich, Christian D; Rose, Adam J; Hartmann, Christine W; van Bodegom-Vos, Leti; Graham, Ian D; Wood, Suzanne J; Majerczyk, Barbara R; Good, Chester B; Pogach, Leonard M; Ball, Sherry L; Au, David H; Aron, David C

    2018-02-01

    One way to understand medical overuse at the clinician level is in terms of clinical decision-making processes that are normally adaptive but become maladaptive. In psychology, dual process models of cognition propose 2 decision-making processes. Reflective cognition is a conscious process of evaluating options based on some combination of utility, risk, capabilities, and/or social influences. Automatic cognition is a largely unconscious process occurring in response to environmental or emotive cues based on previously learned, ingrained heuristics. De-implementation strategies directed at clinicians may be conceptualized as corresponding to cognition: (1) a process of unlearning based on reflective cognition and (2) a process of substitution based on automatic cognition. We define unlearning as a process in which clinicians consciously change their knowledge, beliefs, and intentions about an ineffective practice and alter their behaviour accordingly. Unlearning has been described as "the questioning of established knowledge, habits, beliefs and assumptions as a prerequisite to identifying inappropriate or obsolete knowledge underpinning and/or embedded in existing practices and routines." We hypothesize that as an unintended consequence of unlearning strategies clinicians may experience "reactance," ie, feel their professional prerogative is being violated and, consequently, increase their commitment to the ineffective practice. We define substitution as replacing the ineffective practice with one or more alternatives. A substitute is a specific alternative action or decision that either precludes the ineffective practice or makes it less likely to occur. Both approaches may work independently, eg, a substitute could displace an ineffective practice without changing clinicians' knowledge, and unlearning could occur even if no alternative exists. For some clinical practice, unlearning and substitution strategies may be most effectively used together. By taking into

  18. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective....... In this way the model parameters that drives the main dynamic behavior can be identified and thus a better understanding of this type of processes. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid...

  19. Is There Room for "Development" in Developmental Models of Information Processing Biases to Threat in Children and Adolescents?

    Science.gov (United States)

    Field, Andy P.; Lester, Kathryn J.

    2010-01-01

    Clinical and experimental theories assume that processing biases in attention and interpretation are a causal mechanism through which anxiety develops. Despite growing evidence that these processing biases are present in children and, therefore, develop long before adulthood, these theories ignore the potential role of child development. This…

  20. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  1. Information Retrieval Models

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Göker, Ayse; Davies, John

    2009-01-01

    Many applications that handle information on the internet would be completely inadequate without the support of information retrieval technology. How would we find information on the world wide web if there were no web search engines? How would we manage our email without spam filtering? Much of the

  2. A proposed general model of information behaviour.

    Directory of Open Access Journals (Sweden)

    2003-01-01

    Full Text Available Presents a critical description of Wilson's (1996 global model of information behaviour and proposes major modification on the basis of research into information behaviour of managers, conducted in Poland. The theoretical analysis and research results suggest that Wilson's model has certain imperfections, both in its conceptual content, and in graphical presentation. The model, for example, cannot be used to describe managers' information behaviour, since managers basically are not the end users of external from organization or computerized information services, and they acquire information mainly through various intermediaries. Therefore, the model cannot be considered as a general model, applicable to every category of information users. The proposed new model encompasses the main concepts of Wilson's model, such as: person-in-context, three categories of intervening variables (individual, social and environmental, activating mechanisms, cyclic character of information behaviours, and the adoption of a multidisciplinary approach to explain them. However, the new model introduces several changes. They include: 1. identification of 'context' with the intervening variables; 2. immersion of the chain of information behaviour in the 'context', to indicate that the context variables influence behaviour at all stages of the process (identification of needs, looking for information, processing and using it; 3. stress is put on the fact that the activating mechanisms also can occur at all stages of the information acquisition process; 4. introduction of two basic strategies of looking for information: personally and/or using various intermediaries.

  3. Effects of foveal information processing

    Science.gov (United States)

    Harris, R. L., Sr.

    1984-01-01

    The scanning behavior of pilots must be understood so that cockpit displays can be assembled which will provide the most information accurately and quickly to the pilot. The results of seven years of collecting and analyzing pilot scanning data are summarized. The data indicate that pilot scanning behavior is: (1) subsconscious; (2) situation dependent; and (3) can be disrupted if pilots are forced to make conscious decisions. Testing techniques and scanning analysis techniques have been developed that are sensitive to pilot workload.

  4. Erythropoietin modulates neural and cognitive processing of emotional information in biomarker models of antidepressant drug action in depressed patients

    DEFF Research Database (Denmark)

    Miskowiak, Kamilla W; Favaron, Elisa; Hafizi, Sepehr

    2010-01-01

    Erythropoietin (Epo) has neuroprotective and neurotrophic effects, and may be a novel therapeutic agent in the treatment of psychiatric disorders. We have demonstrated antidepressant-like effects of Epo on the neural and cognitive processing of facial expressions in healthy volunteers. The current study investigates the effects of Epo on the neural and cognitive response to emotional facial expressions in depressed patients.

  5. Physiological arousal in processing recognition information

    Directory of Open Access Journals (Sweden)

    Guy Hochman

    2010-07-01

    Full Text Available The recognition heuristic (RH; Goldstein and Gigerenzer, 2002) suggests that, when applicable, probabilistic inferences are based on a noncompensatory examination of whether an object is recognized or not. The overall findings on the processes that underlie this fast and frugal heuristic are somewhat mixed, and many studies have expressed the need for considering a more compensatory integration of recognition information. Regardless of the mechanism involved, it is clear that recognition has a strong influence on choices, and this finding might be explained by the fact that recognition cues arouse affect and thus receive more attention than cognitive cues. To test this assumption, we investigated whether recognition results in a direct affective signal by measuring physiological arousal (i.e., peripheral arterial tone) in the established city-size task. We found that recognition of cities does not directly result in increased physiological arousal. Moreover, the results show that physiological arousal increased with increasing inconsistency between recognition information and additional cue information. These findings support predictions derived from a compensatory Parallel Constraint Satisfaction model rather than predictions of noncompensatory models. Additional results concerning confidence ratings, response times, and choice proportions further demonstrated that recognition information and other cognitive cues are integrated in a compensatory manner.
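
    To make the contrast between the two accounts concrete, the toy sketch below (with made-up cue values, not the study's materials) shows how a noncompensatory recognition heuristic and a compensatory weighted-additive rule can disagree when the additional cues contradict recognition.

```python
# Schematic contrast between a noncompensatory recognition-heuristic choice and a
# compensatory weighted integration of recognition plus additional cues, in the
# spirit of the city-size task. All cue values and weights are invented.

def recognition_heuristic(rec_a, rec_b):
    """If exactly one city is recognised, choose it; otherwise remain undecided."""
    if rec_a != rec_b:
        return "A" if rec_a else "B"
    return None

def compensatory(rec_a, rec_b, cues_a, cues_b, w_rec=2.0):
    """Weighted-additive integration: recognition is just one (strong) cue among others."""
    score_a = w_rec * rec_a + sum(cues_a)
    score_b = w_rec * rec_b + sum(cues_b)
    return "A" if score_a >= score_b else "B"

# City A is recognised, but its remaining cues all speak against it.
print(recognition_heuristic(True, False))                   # -> A
print(compensatory(True, False, [-1, -1, -1], [1, 1, 1]))   # -> B
```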

  6. Shaping internal working models : parental love withdrawal, oxytocin, and asymmetric frontal brain activity affect socio-emotional information processing

    NARCIS (Netherlands)

    Huffmeijer, Renske

    2011-01-01

    The aim of this thesis is to gain insight into the associations between experiences of parental love withdrawal, oxytocin, and asymmetric frontal brain activity (reflecting basic motivational tendencies) on the one hand, and (neural) processing of and responses to socio-emotional stimuli on the

  7. Influence of information on behavioral effects in decision processes

    OpenAIRE

    Angelarosa Longo; Viviana Ventre

    2015-01-01

    Rational models in decision processes are marked out by many anomalies, caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single or multi agent decision process each mental model is influenced by the presence, the absence or false information about the problem or about other members of the decision making group. The difficulty in modeling these effects increases because behavioral biases influence also the m...

  8. Information processing of sexual abuse in elders.

    Science.gov (United States)

    Burgess, Ann W; Clements, Paul T

    2006-01-01

    Sexual abuse is considered to be a pandemic contemporary public health issue, with significant physical and psychosocial consequences for its victims. However, the incidence of elder sexual assault is difficult to estimate with any degree of confidence. A convenience sample of 284 case records were reviewed for Post-Traumatic Stress Disorder (PTSD) symptoms. The purpose of this paper is to present the limited data noted on record review on four PTSD symptoms of startle, physiological upset, anger, and numbness. A treatment model for information processing of intrapsychic trauma is presented to describe domain disruption within a nursing diagnosis of rape trauma syndrome and provide guidance for sensitive assessment and intervention.

  9. INFORMATION MODELLING OF PROCESS OF ADOPTION OF ADMINISTRATIVE DECISIONS AT THE ORGANIZATION OF PROFESSIONAL DEVELOPMENT OF THE PERSONNEL

    Directory of Open Access Journals (Sweden)

    Yaroslav E. Prokushev

    2015-01-01

    Full Text Available The article is devoted to the problem of organizing professional development of personnel. The article considers two interconnected tasks. The first task is estimating the degree of need for professional development of a specific worker. The second task is choosing the programme of professional development. Functional information models of the procedures for adopting administrative decisions within these tasks are developed.

  10. Modelling and Analysis of Daylight, Solar Heat Gains and Thermal Losses to Inform the Early Stage of the Architectural Process

    OpenAIRE

    Baker, Nicholas

    2017-01-01

    The EU building sector is a main contributor to greenhouse gas emissions, which need to be cut as part of the global response to anthropogenic climate change. This cut can be realised through improvements in building energy performance, such as optimisation of facade design. The early stage of the architectural process has been identified as the ideal time to implement such sustainable design choices. There is need for simple guidelines and tools to provide quantitative data to support these ...

  11. Mathematics of Information Processing and the Internet

    Science.gov (United States)

    Hart, Eric W.

    2010-01-01

    The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…

  12. Conjunction of wavelet transform and SOM-mutual information data pre-processing approach for AI-based Multi-Station nitrate modeling of watersheds

    Science.gov (United States)

    Nourani, Vahid; Andalib, Gholamreza; Dąbrowska, Dominika

    2017-05-01

    Accurate nitrate load predictions can improve water quality management decisions for watersheds, which affect the environment and drinking water. In this paper, two scenarios were considered for Multi-Station (MS) nitrate load modeling of the Little River watershed. In the first scenario, Markovian characteristics of the streamflow-nitrate time series were proposed for the MS modeling. For this purpose, the feature extraction criterion of Mutual Information (MI) was employed for input selection of artificial intelligence models (a Feed Forward Neural Network, FFNN, and a least squares support vector machine). In the second scenario, to account for seasonality-based characteristics of the time series, the wavelet transform was used to extract multi-scale features of the streamflow-nitrate time series of the watershed's sub-basins to model MS nitrate loads. The Self-Organizing Map (SOM) clustering technique, which finds homogeneous sub-series clusters, was also linked to MI to choose appropriate cluster agents to feed into the models for predicting the nitrate loads of the watershed's sub-basins. The proposed MS method not only considers the prediction of the outlet nitrate but also covers predictions of interior sub-basin nitrate load values. The results indicated that the proposed FFNN model coupled with SOM-MI improved the performance of MS nitrate predictions compared to the Markovian-based models by up to 39%. Overall, accurate selection of dominant inputs that reflect the seasonality-based characteristics of the streamflow-nitrate process could enhance the efficiency of nitrate load predictions.
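
    The mutual-information input-selection step can be sketched as follows. The code below uses a synthetic streamflow/nitrate series and a simple binned MI estimator (both assumptions, not the paper's data or estimator) to rank candidate lags and keep the strongest ones as inputs for a downstream model such as an FFNN.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1500
flow = np.cumsum(rng.normal(size=n))                  # synthetic streamflow (random walk)
# Synthetic nitrate load driven mainly by flow at lags 2 and 7, plus noise.
nitrate = 0.6 * np.roll(flow, 2) + 0.3 * np.roll(flow, 7) + rng.normal(size=n)

def binned_mi(x, y, bins=16):
    """Plug-in mutual information estimate (bits) from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px * py)[nz])))

# Rank lagged streamflow candidates by their MI with the nitrate target.
candidate_lags = range(1, 11)
scores = {lag: binned_mi(np.roll(flow, lag)[20:], nitrate[20:]) for lag in candidate_lags}
selected = sorted(scores, key=scores.get, reverse=True)[:3]
print("selected input lags:", selected)
```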

  13. Multidimensional Models of Information Need

    OpenAIRE

    Yun-jie (Calvin) Xu; Kai Huang (Joseph) Tan

    2009-01-01

    User studies in information science have recognised relevance as a multidimensional construct. An implication of multidimensional relevance is that a user's information need should be modeled by multiple data structures to represent different relevance dimensions. While the extant literature has attempted to model multiple dimensions of a user's information need, the fundamental assumption that a multidimensional model is better than a uni-dimensional model has not been addressed. This study ...

  14. Process-aware information systems : lessons to be learned from process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Jensen, K.; Aalst, van der W.M.P.

    2009-01-01

    A Process-Aware Information System (PAIS) is a software system that manages and executes operational processes involving people, applications, and/or information sources on the basis of process models. Example PAISs are workflow management systems, case-handling systems, enterprise information

  15. Influence of information on behavioral effects in decision processes

    Directory of Open Access Journals (Sweden)

    Angelarosa Longo

    2015-07-01

    Full Text Available Rational models in decision processes are marked out by many anomalies, caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single or multi agent decision process each mental model is influenced by the presence, the absence or false information about the problem or about other members of the decision making group. The difficulty in modeling these effects increases because behavioral biases influence also the modeler. Behavioral Operational Research (BOR studies these influences to create efficient models to define choices in similar decision processes.

  16. Aerospace Materials Process Modelling

    Science.gov (United States)

    1988-08-01

    Continuous Cooling Transformation diagram (CCT diagram). When an IT diagram is used in heat process modelling, we suppose that a sudden cooling (instantaneous...processes. CE chooses instead to study thermo-mechanical properties referring to a CCT diagram. This is thought to be more reliable to give a true... This determination is however based on the following approximations: i) A CCT diagram is valid only for the

  17. Erythropoietin modulates neural and cognitive processing of emotional information in biomarker models of antidepressant drug action in depressed patients

    DEFF Research Database (Denmark)

    Miskowiak, Kamilla W; Favaron, Elisa; Hafizi, Sepehr

    2010-01-01

    OBJECTIVE: Erythropoietin (Epo) has neuroprotective and neurotrophic effects, and may be a novel therapeutic agent in the treatment of psychiatric disorders. We have demonstrated antidepressant-like effects of Epo on the neural and cognitive processing of facial expressions in healthy volunteers. The current study investigates the effects of Epo on the neural and cognitive response to emotional facial expressions in depressed patients. METHOD: Nineteen acutely depressed patients were randomized to receive Epo (40,000 IU) or saline intravenously in a double-blind, parallel-group design. On day 3, we assessed neuronal responses to fearful and happy faces using functional magnetic resonance imaging and measured facial expression recognition after the scan. RESULTS: Epo reduced neural response to fearful vs. happy faces in the amygdala and hippocampus, and to fearful faces vs. baseline in superior

  18. Comparison on information-seeking behavior of postgraduated students in Isfahan University of Medical Sciences and University of Isfahan in writing dissertation based on Kuhlthau model of information search process.

    Science.gov (United States)

    Abedi, Mahnaz; Ashrafi-Rizi, Hasan; Zare-Farashbandi, Firoozeh; Nouri, Rasoul; Hassanzadeh, Akbar

    2014-01-01

    Information-seeking behaviors have been one of the main focuses of researchers aiming to identify and solve the problems users face in information retrieval. The aim of this research was to compare the information-seeking behavior of postgraduate students at Isfahan University of Medical Sciences and the University of Isfahan when writing their dissertations, based on the Kuhlthau model of the information search process, in 2012. The research method was a survey and the data collection tool was the Narmenji questionnaire. The statistical population was all postgraduate students at Isfahan University of Medical Sciences and the University of Isfahan. The sample size was 196 people and the sampling was stratified random. The statistical analyses were descriptive (mean and frequency) and inferential (independent t test and Pearson's correlation), and the software used was SPSS20. The findings showed that students at Isfahan University of Medical Sciences followed 20% of the ordered steps of this model, whereas students at the University of Isfahan did not follow the model. In the first (Initiation) and sixth (Presentation) stages of the feelings aspect, and in actions (across all stages), significant differences were found between students from the two universities. There was a significant relationship between gender and both the fourth stage (Formulation) and the total feelings score of the Kuhlthau model. There was also a significant inverse relationship between the third stage of feelings (Exploration) and the age of the students. The results showed that, in writing their dissertations, there were some major differences between students of the two universities in following the Kuhlthau model. There are significant differences between the two universities in some of the stages of feelings and actions of students' information-seeking behavior. There is a significant relationship between the fourth stage of feelings (Formulation) in the Kuhlthau model and gender, and between the third stage of feelings (Exploration) and age.

  19. Payoff Information Biases a Fast Guess Process in Perceptual Decision Making under Deadline Pressure: Evidence from Behavior, Evoked Potentials, and Quantitative Model Comparison.

    Science.gov (United States)

    Noorbaloochi, Sharareh; Sharon, Dahlia; McClelland, James L

    2015-08-05

    We used electroencephalography (EEG) and behavior to examine the role of payoff bias in a difficult two-alternative perceptual decision under deadline pressure in humans. The findings suggest that a fast guess process, biased by payoff and triggered by stimulus onset, occurred on a subset of trials and raced with an evidence accumulation process informed by stimulus information. On each trial, the participant judged whether a rectangle was shifted to the right or left and responded by squeezing a right- or left-hand dynamometer. The payoff for each alternative (which could be biased or unbiased) was signaled 1.5 s before stimulus onset. The choice response was assigned to the first hand reaching a squeeze force criterion and reaction time was defined as time to criterion. Consistent with a fast guess account, fast responses were strongly biased toward the higher-paying alternative and the EEG exhibited an abrupt rise in the lateralized readiness potential (LRP) on a subset of biased payoff trials contralateral to the higher-paying alternative ∼ 150 ms after stimulus onset and 50 ms before stimulus information influenced the LRP. This rise was associated with poststimulus dynamometer activity favoring the higher-paying alternative and predicted choice and response time. Quantitative modeling supported the fast guess account over accounts of payoff effects supported in other studies. Our findings, taken with previous studies, support the idea that payoff and prior probability manipulations produce flexible adaptations to task structure and do not reflect a fixed policy for the integration of payoff and stimulus information. Humans and other animals often face situations in which they must make choices based on uncertain sensory information together with information about expected outcomes (gains or losses) about each choice. We investigated how differences in payoffs between available alternatives affect neural activity, overt choice, and the timing of choice
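
    The race described here can be illustrated with a simple simulation. The sketch below is not the authors' fitted model; the guess probability, drift rate, bound and latencies are invented to show how a payoff-driven fast guess racing a diffusion accumulator produces fast, payoff-biased responses alongside slower, stimulus-driven ones.

```python
import numpy as np

rng = np.random.default_rng(2)

def trial(drift=0.15, bound=1.0, dt=0.001, noise=1.0,
          p_guess=0.3, guess_rt=0.18, nondecision=0.15):
    """One biased-payoff trial: a fast guess may pre-empt the evidence accumulator."""
    if rng.random() < p_guess:                 # fast guess toward the high-payoff side
        return "high-payoff side", guess_rt
    x, t = 0.0, 0.0
    while abs(x) < bound:                      # diffusion toward one of two bounds
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    choice = "correct side" if x > 0 else "error side"
    return choice, t + nondecision

results = [trial() for _ in range(500)]
rts = np.array([rt for _, rt in results])
guesses = np.array([c == "high-payoff side" for c, _ in results])
print(f"mean RT of fast guesses: {rts[guesses].mean():.3f} s")
print(f"mean RT of accumulator responses: {rts[~guesses].mean():.3f} s")
```

    The simulated guess responses cluster at short latencies and always favour the high-payoff alternative, while accumulator responses are slower and mostly stimulus-driven, which is the qualitative signature the abstract describes.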

  20. Process information systems in nuclear reprocessing

    International Nuclear Information System (INIS)

    Jaeschke, A.; Keller, H.; Orth, H.

    1987-01-01

    On a production management level, a process information system in a nuclear reprocessing plant (NRP) has to fulfill conventional operating functions and functions for nuclear material surveillance (safeguards). Based on today's state of the art of on-line process control technology, the progress in hardware and software technology allows to introduce more process-specific intelligence into process information systems. Exemplified by an expert-system-aided laboratory management system as component of a NRP process information system, the paper demonstrates that these technologies can be applied already. (DG) [de

  1. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    , by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...

  2. Holledge gauge failure testing using concurrent information processing algorithm

    International Nuclear Information System (INIS)

    Weeks, G.E.; Daniel, W.E.; Edwards, R.E.; Jannarone, R.J.; Joshi, S.N.; Palakodety, S.S.; Qian, D.

    1996-01-01

    For several decades, computerized information processing systems and human information processing models have developed with a good deal of mutual influence. Any comprehensive psychology text in this decade uses terms that originated in the computer industry, such as "cache" and "memory", to describe human information processing. Likewise, many engineers today are using "artificial intelligence" and "artificial neural network" computing tools that originated as models of human thought to solve industrial problems. This paper concerns a recently developed human information processing model, called "concurrent information processing" (CIP), and a related set of computing tools for solving industrial problems. The problem of focus is adaptive gauge monitoring; the application is pneumatic pressure repeaters (Holledge gauges) used to measure liquid level and density in the Defense Waste Processing Facility and the Integrated DWPF Melter System

  3. Information risk and security modeling

    Science.gov (United States)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, NSA configuration guidelines and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the research presentation will explore the appropriate usage of these standards. The paper will discuss standards-based approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The 3D visual presentation of the proposed approach and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and the defined risk and security space for modeling and measuring.

  4. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling....... the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension...

  5. Spiral model pilot project information model

    Science.gov (United States)

    1991-01-01

    The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base of that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but to also obtain a measure of customer satisfaction.

  6. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  7. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  8. Moral judgment as information processing: an integrative review.

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  9. Moral judgment as information processing: an integrative review

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  10. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed by basic models of geographic physical processes, which is shown by means of an example.
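
    As a minimal illustration of the partial-difference-equation view, the sketch below discretises one prototypical physical process, one-dimensional diffusion, with an explicit finite-difference update. The grid size, diffusion coefficient and initial condition are arbitrary illustration values rather than anything from the paper.

```python
import numpy as np

# One-dimensional diffusion u_t = D * u_xx, written as a partial difference equation
# and stepped explicitly (D*dt/dx^2 = 0.1 satisfies the usual stability bound of 0.5).
D, dx, dt, steps = 0.1, 1.0, 1.0, 200
u = np.zeros(50)
u[25] = 100.0                          # initial concentration spike in the middle cell

for _ in range(steps):
    # u[i] += D*dt/dx^2 * (u[i+1] - 2*u[i] + u[i-1]) on interior cells;
    # the two boundary cells stay at zero, approximating an open domain for this short run.
    u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

print("total mass after diffusion:", round(float(u.sum()), 2))
print("peak concentration after diffusion:", round(float(u.max()), 2))
```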

  11. A two-layered diffusion model traces the dynamics of information processing in the valuation-and-choice circuit of decision making.

    Science.gov (United States)

    Piu, Pietro; Fargnoli, Francesco; Innocenti, Alessandro; Rufa, Alessandra

    2014-01-01

    A circuit of evaluation and selection of the alternatives is considered a reliable model in neurobiology. The prominent contributions of the literature to this topic are reported. In this study, valuation and choice of a decisional process during Two-Alternative Forced-Choice (TAFC) task are represented as a two-layered network of computational cells, where information accrual and processing progress in nonlinear diffusion dynamics. The evolution of the response-to-stimulus map is thus modeled by two linked diffusive modules (2LDM) representing the neuronal populations involved in the valuation-and-decision circuit of decision making. Diffusion models are naturally appropriate for describing accumulation of evidence over the time. This allows the computation of the response times (RTs) in valuation and choice, under the hypothesis of ex-Wald distribution. A nonlinear transfer function integrates the activities of the two layers. The input-output map based on the infomax principle makes the 2LDM consistent with the reinforcement learning approach. Results from simulated likelihood time series indicate that 2LDM may account for the activity-dependent modulatory component of effective connectivity between the neuronal populations. Rhythmic fluctuations of the estimate gain functions in the delta-beta bands also support the compatibility of 2LDM with the neurobiology of DM.
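
    A stripped-down sketch in the spirit of the 2LDM is given below: a valuation layer accumulates noisy evidence for two alternatives, a sigmoidal transfer function links it to a second, choice layer, and the first choice accumulator to reach a bound determines the response and its time. All parameter values are illustrative assumptions, and the ex-Wald RT analysis and infomax learning rule of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def two_layer_trial(v=(0.9, 0.6), dt=0.005, noise=0.4, bound=1.0, gain=3.0):
    """One TAFC trial with a valuation layer feeding a decision layer via a sigmoid."""
    val = np.zeros(2)      # layer 1: valuation accumulators (one per alternative)
    dec = np.zeros(2)      # layer 2: decision accumulators
    t = 0.0
    while dec.max() < bound:
        # Layer 1: noisy accumulation of each alternative's value evidence.
        val += np.array(v) * dt + noise * np.sqrt(dt) * rng.normal(size=2)
        # Nonlinear transfer of the valuation contrast into the decision layer.
        drive = 1.0 / (1.0 + np.exp(-gain * (val - val[::-1])))
        dec += drive * dt
        t += dt
    return int(dec.argmax()), t

choices, rts = zip(*(two_layer_trial() for _ in range(500)))
print(f"P(choose alternative 0) = {np.mean(np.array(choices) == 0):.2f}")
print(f"mean response time = {np.mean(rts):.2f} s")
```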

  12. Career information processing strategies of secondary school ...

    African Journals Online (AJOL)

    This study examined the strategies commonly adopted by Osun state secondary school students in processing career information. It specifically examined the sources of career information available to the students, the uses to which the students put the information collected and how their career decision making skills can be ...

  13. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  14. BIM. Building Information Model. Special issue; BIM. Building Information Model. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Van Gelder, A.L.A. [Arta and Consultancy, Lage Zwaluwe (Netherlands); Van den Eijnden, P.A.A. [Stichting Marktwerking Installatietechniek, Zoetermeer (Netherlands); Veerman, J.; Mackaij, J.; Borst, E. [Royal Haskoning DHV, Nijmegen (Netherlands); Kruijsse, P.M.D. [Wolter en Dros, Amersfoort (Netherlands); Buma, W. [Merlijn Media, Waddinxveen (Netherlands); Bomhof, F.; Willems, P.H.; Boehms, M. [TNO, Delft (Netherlands); Hofman, M.; Verkerk, M. [ISSO, Rotterdam (Netherlands); Bodeving, M. [VIAC Installatie Adviseurs, Houten (Netherlands); Van Ravenswaaij, J.; Van Hoven, H. [BAM Techniek, Bunnik (Netherlands); Boeije, I.; Schalk, E. [Stabiplan, Bodegraven (Netherlands)

    2012-11-15

    A series of 14 articles illustrates the various aspects of the Building Information Model (BIM). The essence of BIM is to capture information about the building process and the building product.

  15. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  16. Modelling, Information, Processing, and Control

    Science.gov (United States)

    1989-01-15

    have studied the use of transfer function methods to analyze closed-loop systems arising out of certain linear feedback laws, use of transfer...classical Muntz-Szasz theory of real exponentials. As a result it is seen that D has domain including that of A^{1/2} in all cases. Further work is...approximation questions, we will discuss the use of transfer function methods to analyze closed-loop systems arising out of certain linear feedback laws

  17. Motivated information processing in organizational teams: Progress, puzzles, and prospects

    NARCIS (Netherlands)

    Nijstad, B.A.; de Dreu, C.K.W.

    2012-01-01

    Much of the research into group and team functioning looks at groups that perform cognitive tasks, such as decision making, problem solving, and innovation. The Motivated Information Processing in Groups Model (MIP-G; De Dreu, Nijstad, & Van Knippenberg, 2008) conjectures that information processing

  18. Affect and Persuasion: Effects on Motivation for Information Processing.

    Science.gov (United States)

    Leach, Mark M; Stoltenberg, Cal D.

    The relationship between mood and information processing, particularly when reviewing the Elaboration Likelihood Model of persuasion, lacks conclusive evidence. This study was designed to investigate the hypothesis that information processing would be greater for mood-topic congruence than non mood-topic congruence. Undergraduate students (N=216)…

  19. Attachment in Middle Childhood: Associations with Information Processing

    Science.gov (United States)

    Zimmermann, Peter; Iwanski, Alexandra

    2015-01-01

    Attachment theory suggests that internal working models of self and significant others influence adjustment during development by controlling information processing and self-regulation. We provide a conceptual overview on possible mechanisms linking attachment and information processing and review the current literature in middle childhood.…

  20. Team confidence, motivated information processing, and dynamic group decision making

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Beersma, B.

    2010-01-01

    According to the Motivated Information Processing in Groups (MIP-G) model, groups should perform ambiguous (non-ambiguous) tasks better when they have high (low) epistemic motivation and concomitant tendencies to engage in systematic (heuristic) information processing and exchange. The authors

  1. Motivated information processing in group judgement and decision making

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Nijstad, B.A.; van Knippenberg, D.

    2008-01-01

    This article expands the view of groups as information processors into a motivated information processing in groups (MIP-G) model by emphasizing, first, the mixedmotive structure of many group tasks and, second, the idea that individuals engage in more or less deliberate information search and

  2. Motivated information processing in group judgment and decision making

    NARCIS (Netherlands)

    De Dreu, Carsten K. W.; Nijstad, Bernard A.; van Knippenberg, Daan

    This article expands the view of groups as information processors into a motivated information processing in groups (MIP-G) model by emphasizing, first, the mixed-motive structure of many group tasks and, second, the idea that individuals engage in more or less deliberate information search and

  3. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications

  4. Theory of Neural Information Processing Systems

    International Nuclear Information System (INIS)

    Galla, Tobias

    2006-01-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the
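
    As a small taste of the model class treated in the book, the sketch below implements a textbook Hopfield network with Hebbian couplings and shows recall of a stored pattern from a corrupted cue; the network size, number of patterns and flip rate are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))    # P random binary patterns to store

W = (patterns.T @ patterns) / N                # Hebbian coupling matrix
np.fill_diagonal(W, 0.0)                       # no self-coupling

cue = patterns[0].copy()
flip = rng.random(N) < 0.15                    # corrupt 15% of the bits
cue[flip] *= -1

state = cue.astype(float)
for _ in range(10):                            # synchronous updates until (near) convergence
    state = np.sign(W @ state)
    state[state == 0] = 1

print("overlap with the stored pattern:", float(state @ patterns[0]) / N)
```

    Below the storage capacity of roughly 0.14 N patterns, the corrupted cue is pulled back to the stored memory and the printed overlap approaches 1.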

  5. Modeling Business Processes in Public Administration

    Science.gov (United States)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is the methodology, which postulates that information systems development provides business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  6. Integrating textual and model-based process descriptions for comprehensive process search

    NARCIS (Netherlands)

    Leopold, Henrik; van der Aa, Han; Pittke, Fabian; Raffel, Manuel; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Documenting business processes using process models is common practice in many organizations. However, not all process information is best captured in process models. Hence, many organizations complement these models with textual descriptions that specify additional details. The problem with this

  7. Occurrence reporting and processing of operations information

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-21

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  8. Occurrence reporting and processing of operations information

    International Nuclear Information System (INIS)

    1997-01-01

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  9. Certainty and Uncertainty in Quantum Information Processing

    OpenAIRE

    Rieffel, Eleanor G.

    2007-01-01

    This survey, aimed at information processing researchers, highlights intriguing but lesser known results, corrects misconceptions, and suggests research areas. Themes include: certainty in quantum algorithms; the "fewer worlds" theory of quantum mechanics; quantum learning; probability theory versus quantum mechanics.

  10. Developments in quantum information processing by nuclear ...

    Indian Academy of Sciences (India)

    qubits, the 2^n energy levels of the spin-system can be treated as an n-qubit system. ... Quantum information processing; qubit; nuclear magnetic resonance quantum computing. ..... The equilibrium spectrum has theoretical intensities in the ra-.

  11. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  12. A process framework for information security management

    Directory of Open Access Journals (Sweden)

    Knut Haufe

    2016-01-01

    Full Text Available Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach for establishing, implementing, operating, monitoring, reviewing, maintaining and improving an organization's information security. Key elements of the operation of an ISMS are ISMS processes. However, and in spite of its importance, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analyses of information security investments, whether regarding single measures protecting information or ISMS processes, are not the focus of current research, which concentrates mostly on economics. This article aims to fill this research gap by proposing such an ISMS process framework as the main contribution, based on a set of ISMS processes agreed upon in existing standards such as the ISO 27000 series, COBIT and ITIL. Within the framework, the identified processes are described and their interaction and interfaces are specified. This framework helps to focus on the operation of the ISMS, instead of focusing on measures and controls. By this, as a main finding, the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS are strengthened.

  13. Teaching Information Systems Development via Process Variants

    Science.gov (United States)

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  14. Information technology, knowledge processes, and innovation success

    NARCIS (Netherlands)

    Song, X.M.; Zang, F.; Bij, van der J.D.; Weggeman, M.C.D.P.

    2001-01-01

    Despite the obvious linkage between information technologies (IT) and knowledge processes and the apparent strategic importance of both, little research has been done to explicitly examine how, if at all, IT and knowledge processes affect firm outcomes. The purpose of this study is to bridge this

  15. ENERGETIC CHARGE OF AN INFORMATION PROCESS

    Directory of Open Access Journals (Sweden)

    Popova T.M.

    2009-12-01

    Full Text Available The main laws of technical thermodynamics are universal and can be applied to processes other than thermodynamic ones. The article presents the results of a comparison of the peculiarities of irreversible informational and thermodynamic processes and introduces the new term "infopy". A more precise definition of infopy as an energetic charge is also given.

  16. Process informed accurate compact modelling of 14-nm FinFET variability and application to statistical 6T-SRAM simulations

    OpenAIRE

    Wang, Xingsheng; Reid, Dave; Wang, Liping; Millar, Campbell; Burenkov, Alex; Evanschitzky, Peter; Baer, Eberhard; Lorenz, Juergen; Asenov, Asen

    2016-01-01

    This paper presents a TCAD based design technology co-optimization (DTCO) process for 14nm SOI FinFET based SRAM, which employs an enhanced variability aware compact modeling approach that fully takes process and lithography simulations and their impact on 6T-SRAM layout into account. Realistic double patterned gates and fins and their impacts are taken into account in the development of the variability-aware compact model. Finally, global process induced variability and local statistical var...

  17. a N-D Virtual Notebook about the Basilica of S. Ambrogio in Milan: Information Modeling for the Communication of Historical Phases Subtraction Process

    Science.gov (United States)

    Stanga, C.; Spinelli, C.; Brumana, R.; Oreni, D.; Valente, R.; Banfi, F.

    2017-08-01

    This essay describes the combination of 3D solutions and software techniques with traditional studies and research in order to achieve integrated digital documentation linking performed surveys, collected data, and historical research. The approach of this study is based on the comparison of survey data with historical research, and on interpretations deduced from a data cross-check between the two mentioned sources. The case study is the Basilica of S. Ambrogio in Milan, one of the greatest monuments in the city, a pillar of Christianity and of the history of architecture. It is characterized by a complex stratification of phases of restoration and transformation. Rediscovering the great richness of the traditional architectural notebook, which collected surveys and data, this research aims to realize a virtual notebook, based on a 3D model that supports the dissemination of the collected information. It can potentially be understood and accessed by anyone through the development of a mobile app. The 3D model was used to explore the different historical phases, starting from the recent layers down to the oldest ones, through a virtual subtraction process, following the methods of the Archaeology of Architecture. Its components can be imported into parametric software and recognized both in their morphological and typological aspects. It is based on the concept of LoD and ReverseLoD in order to fit the accuracy required by each step of the research.

  18. A N-D VIRTUAL NOTEBOOK ABOUT THE BASILICA OF S. AMBROGIO IN MILAN: INFORMATION MODELING FOR THE COMMUNICATION OF HISTORICAL PHASES SUBTRACTION PROCESS

    Directory of Open Access Journals (Sweden)

    C. Stanga

    2017-08-01

    Full Text Available This essay describes the combination of 3D solutions and software techniques with traditional studies and research in order to achieve integrated digital documentation linking performed surveys, collected data, and historical research. The approach of this study is based on the comparison of survey data with historical research, and on interpretations deduced from a data cross-check between the two mentioned sources. The case study is the Basilica of S. Ambrogio in Milan, one of the greatest monuments in the city, a pillar of Christianity and of the history of architecture. It is characterized by a complex stratification of phases of restoration and transformation. Rediscovering the great richness of the traditional architectural notebook, which collected surveys and data, this research aims to realize a virtual notebook, based on a 3D model that supports the dissemination of the collected information. It can potentially be understood and accessed by anyone through the development of a mobile app. The 3D model was used to explore the different historical phases, starting from the recent layers down to the oldest ones, through a virtual subtraction process, following the methods of the Archaeology of Architecture. Its components can be imported into parametric software and recognized both in their morphological and typological aspects. It is based on the concept of LoD and ReverseLoD in order to fit the accuracy required by each step of the research.

  19. Process control using modern systems of information processing

    International Nuclear Information System (INIS)

    Baldeweg, F.

    1984-01-01

    Modern digital automation techniques allow the application of demanding types of process control. These types of process control are characterized by belonging to higher levels in a multilevel model. Functional and technical aspects of the performance of digital automation plants are presented and explained. A modern automation system is described, considering special procedures of process control (e.g. real-time diagnosis).

  20. Quantum information processing with atoms and photons

    International Nuclear Information System (INIS)

    Monroe, C.

    2003-01-01

    Quantum information processors exploit the quantum features of superposition and entanglement for applications not possible in classical devices, offering the potential for significant improvements in the communication and processing of information. Experimental realization of large-scale quantum information processors remains a long-term vision, as the required nearly pure quantum behaviour is observed only in exotic hardware such as individual laser-cooled atoms and isolated photons. But recent theoretical and experimental advances suggest that cold atoms and individual photons may lead the way towards bigger and better quantum information processors, effectively building mesoscopic versions of 'Schroedinger's cat' from the bottom up. (author)

  1. Application of the Life Cycle Analysis and the Building Information Modelling Software in the Architectural Climate Change-Oriented Design Process

    Science.gov (United States)

    Gradziński, Piotr

    2017-10-01

    Whereas the world's climate is changing (inter alia, under the influence of architectural activity), the author attempts to reorient design practice primarily towards using and adapting to climatic conditions. Applying, among others, Life Cycle Analysis (LCA) and digital analytical BIM (Building Information Modelling) tools in the early stages of the architectural design process defines the overriding requirements which the designer/architect should meet. The first part of the text characterizes the influences of architectural activity (consumption, pollution, waste, etc.) and the use of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.) in the sense of their direct negative environmental impact. The second part of the paper reviews the methods and analytical techniques that prevent these negative influences: firstly, studying the building by means of a Life Cycle Analysis of the structure (e.g. materials) and the functioning (e.g. energy consumption) of the architectural object (stages: before use, use, after use); secondly, using digital analytical tools to determine the benefits of running multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the form of the building. In conclusion, the author's research results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early design decisions in the process of shaping the architectural form to be corrected, minimizing the impact on nature and the environment. The work refers directly to the architectural-environmental dimensions, orienting the design process of buildings with respect to widely comprehended climatic changes.

  2. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  3. PHYSICAL RESOURCES OF INFORMATION PROCESSES AND TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Mikhail O. Kolbanev

    2014-11-01

    Full Text Available Subject of study. The paper describes basic information technologies for automating the information processes of data storage, distribution and processing in terms of required physical resources. It is shown that studying these processes with such traditional objectives of modern computer science as the ability to transfer knowledge, degree of automation, information security, coding, reliability, and others is not enough. The reasons are, on the one hand, the increase in the volume and intensity of information exchange in the subject areas of human activity and, on the other hand, the approach to the efficiency limit of information systems based on semiconductor technologies. Creation of technologies which not only provide support for information interaction but also consume a rational amount of physical resources has become a pressing problem of modern engineering development. Thus, basic information technologies for storage, distribution and processing of information to support the interaction between people are the object of study, and the physical temporal, spatial and energy resources required for implementation of these technologies are the subject of study. Approaches. An attempt is made to enlarge the possibilities of the traditional cybernetics methodology, which replaces the consideration of the material information component by a search over the states of information objects. This is done by explicitly taking into account the amount of physical resources required for changes in the states of information media. Purpose of study. The paper works out a common approach to the comparison and subsequent selection of basic information technologies for storage, distribution and processing of data, taking into account not only the requirements for the quality of information exchange in a particular subject area and the degree of technology application, but also the amounts of consumed physical resources. Main findings. Classification of resources

  4. A Model to Identify the Most Effective Business Rule in Information Systems using Rough Set Theory: Study on Loan Business Process

    Directory of Open Access Journals (Sweden)

    Mohammad Aghdasi

    2011-09-01

    In this paper, a practical model is used to identify the most effective rules in information systems. In this model, critical business attributes which fit strategic expectations are first taken into account. These are the attributes whose changes are more important than others in achieving the strategic expectations. To identify these attributes we utilize rough set theory. Those business rules which use critical information attributes in their structures are identified as the most effective business rules. The proposed model helps information system developers to identify the scope of effective business rules. It reduces the time and cost of information system maintenance. It also helps business analysts to focus on managing critical business attributes in order to achieve a specific goal.
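
    As an illustration of the rough-set step described above, the following sketch ranks condition attributes by how much removing each one lowers the dependency (the classical gamma measure) of the decision attribute on the remaining attributes; the attribute with the largest drop is the most critical. The toy loan-decision table is hypothetical and only meant to show the computation, not data from the study.

      # Rough-set attribute significance: gamma(C) - gamma(C \ {a}) for each attribute a.
      from itertools import groupby

      def partition(rows, attrs):
          """Equivalence classes of row indices under the indiscernibility relation on attrs."""
          key = lambda i: tuple(rows[i][a] for a in attrs)
          idx = sorted(range(len(rows)), key=key)
          return [set(group) for _, group in groupby(idx, key=key)]

      def gamma(rows, cond_attrs, dec_attr):
          """Dependency degree: |positive region| / |universe|."""
          pos = 0
          for cls in partition(rows, cond_attrs):
              decisions = {rows[i][dec_attr] for i in cls}
              if len(decisions) == 1:      # class is consistent -> inside positive region
                  pos += len(cls)
          return pos / len(rows)

      def significance(rows, cond_attrs, dec_attr):
          base = gamma(rows, cond_attrs, dec_attr)
          return {a: base - gamma(rows, [b for b in cond_attrs if b != a], dec_attr)
                  for a in cond_attrs}

      if __name__ == "__main__":
          table = [  # hypothetical loan records
              {"income": "high", "history": "good", "collateral": "yes", "approve": "yes"},
              {"income": "high", "history": "bad",  "collateral": "yes", "approve": "no"},
              {"income": "low",  "history": "good", "collateral": "no",  "approve": "no"},
              {"income": "low",  "history": "good", "collateral": "yes", "approve": "yes"},
          ]
          print(significance(table, ["income", "history", "collateral"], "approve"))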

  5. Process Information System - Nuclear Power Plant Krsko

    International Nuclear Information System (INIS)

    Mandic, D.; Barbic, B.; Linke, B.; Colak, I.

    1998-01-01

    The original NEK design used several Process Computer Systems (PCS) for both process control and process supervision. The PCS were built by different manufacturers around different hardware and software platforms. Operational experience and new regulatory requirements imposed new technical and functional requirements on the PCS. Requirements such as acquisition of new signals from the technological processes and environment, implementation of new application programs, significant improvement of the MMI (Man Machine Interface), transfer of process data to locations other than the Main Control Room (MCR), and process data archiving with the capability to retrieve the same data for future analysis were impossible to implement within the old systems. In order to satisfy the new requirements, NEK decided to build a new Process Information System (PIS). During the design and construction of the PIS Project Phase I, in addition to the main foreign contractor, there was significant participation of local architect-engineering and construction companies. This paper presents the experience of NEK and its local partners. (author)

  6. The minimal work cost of information processing

    Science.gov (United States)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the improvement of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
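
    As a schematic rendering of the kind of bound described (the notation below is an assumption for illustration, not a quotation of the paper's formula): writing D for the discarded information and O for the output of the logical process, a Landauer-type statement of the minimal work reads

        W \;\ge\; k_B T \ln 2 \cdot H(D \mid O),

    so discarding information that is fully determined by the output costs no work, while discarding information completely uncorrelated with it costs k_B T ln 2 per bit.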

  7. Consciousness: a unique way of processing information.

    Science.gov (United States)

    Marchetti, Giorgio

    2018-02-08

    In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three main basic cognitive processes contribute to realize this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the "single voice" of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever is focused by attention appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experiences, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention.

  8. A framework for information warehouse development processes

    OpenAIRE

    Holten, Roland

    1999-01-01

    Since the terms Data Warehouse and On-Line Analytical Processing were proposed by Inmon and by Codd, Codd and Salley respectively, the traditional ideas of creating information systems in support of management's decisions have become interesting again in theory and practice. Today information warehousing is a strategic market for any database systems vendor. Nevertheless, the theoretical discussions of this topic go back to the early years of the 20th century as far as management science and accounting the...

  9. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    Science.gov (United States)

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  10. Unveiling the mystery of visual information processing in human brain.

    Science.gov (United States)

    Diamant, Emanuel

    2008-08-15

    It is generally accepted that human vision is an extremely powerful information processing system that facilitates our interaction with the surrounding world. However, despite extended and extensive research efforts, which encompass many exploration fields, the underlying fundamentals and operational principles of visual information processing in the human brain remain unknown. We still are unable to figure out where and how along the path from eyes to the cortex the sensory input perceived by the retina is converted into a meaningful object representation, which can be consciously manipulated by the brain. Studying the vast literature considering the various aspects of brain information processing, I was surprised to learn that the respected scholarly discussion is totally indifferent to the basic keynote question: "What is information?" in general or "What is visual information?" in particular. In the old days, it was assumed that any scientific research approach had first to define its basic departure points. Why this was overlooked in brain information processing research remains a conundrum. In this paper, I am trying to find a remedy for this bizarre situation. I propose an uncommon definition of "information", which can be derived from Kolmogorov's Complexity Theory and Chaitin's notion of Algorithmic Information. Embracing this new definition leads to an inevitable revision of traditional dogmas that shape the state of the art of brain information processing research. I hope this revision would better serve the challenging goal of human visual information processing modeling.
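
    For reference, the Kolmogorov/Chaitin notion the author appeals to can be stated compactly (a standard textbook definition; the notation is not taken from the paper): the algorithmic information content of a string x with respect to a universal machine U is the length of its shortest program,

        K_U(x) \;=\; \min \{\, |p| \;:\; U(p) = x \,\},

    so that information is measured by compressibility rather than by the statistical surprise of Shannon's framework.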

  11. Process-aware information systems : bridging people and software through process technology

    NARCIS (Netherlands)

    Dumas, M.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.

    2005-01-01

    A unifying foundation to design and implement process-aware information systems This publication takes on the formidable task of establishing a unifying foundation and set of common underlying principles to effectively model, design, and implement process-aware information systems. Authored by

  12. Scalable Networked Information Processing Environment (SNIPE)

    Energy Technology Data Exchange (ETDEWEB)

    Fagg, G.E.; Moore, K. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]|[Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Geist, A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  13. Integrated model of assisted parking system and performance evaluation with entropy weight extended analytic hierarchy process and two-tuple linguistic information

    Directory of Open Access Journals (Sweden)

    Yiding Hua

    2016-06-01

    Full Text Available Evaluating the comprehensive performance of assisted parking systems has been a very important issue for car companies for years, because the overall performance of an assisted parking system directly influences vehicle intelligence and customer satisfaction. Therefore, this article proposes a two-tuple linguistic analytic hierarchy process to evaluate assisted parking systems so as to avoid information loss during the aggregation of evaluations. The performance evaluation attributes for assisted parking systems are established first. Subsequently, information entropy theory is used to improve the evaluation attribute weights determined by the analytic hierarchy process, accounting for the randomness in the parking test process. Furthermore, the evaluation attribute measure values of comprehensive performance are calculated and the assisted parking system evaluation results are obtained with an ordered weighted averaging operator. Finally, numerical examples of vehicle types equipped with eight different assisted parking systems and the computational results are presented.
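
    A minimal sketch of the entropy-weighting step described above: attribute weights are derived from how much the normalized scores vary across alternatives, so attributes that discriminate more between systems receive more weight. The score matrix below is hypothetical and merely stands in for parking-test data.

      # Entropy weights from a (alternatives x attributes) score matrix.
      import numpy as np

      def entropy_weights(scores):
          """scores: non-negative benefit scores, shape (alternatives, attributes)."""
          m, _ = scores.shape
          p = scores / scores.sum(axis=0, keepdims=True)        # column-wise proportions
          with np.errstate(divide="ignore", invalid="ignore"):
              plogp = np.where(p > 0, p * np.log(p), 0.0)
          e = -plogp.sum(axis=0) / np.log(m)                    # entropy of each attribute
          d = 1.0 - e                                           # degree of diversification
          return d / d.sum()                                    # normalized weights

      if __name__ == "__main__":
          # 8 hypothetical assisted parking systems scored on 4 attributes
          rng = np.random.default_rng(42)
          scores = rng.uniform(1, 10, size=(8, 4))
          print("entropy weights:", np.round(entropy_weights(scores), 3))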

  14. A Model for an Electronic Information Marketplace

    Directory of Open Access Journals (Sweden)

    Wei Ge

    2005-11-01

    Full Text Available As the information content on the Internet increases, the task of locating desired information and assessing its quality becomes increasingly difficult. This development causes users to be more willing to pay for information that is focused on specific issues, verifiable, and available upon request. Thus, the nature of the Internet opens up the opportunity for information trading. In this context, the Internet cannot only be used to close the transaction, but also to deliver the product - desired information - to the user. Early attempts to implement such business models have fallen short of expectations. In this paper, we discuss the limitations of such practices and present a modified business model for information trading, which uses a reverse auction approach together with a multiple-buyer price discovery process

  15. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  16. Contexts for concepts: Information modeling for semantic interoperability

    NARCIS (Netherlands)

    Oude Luttighuis, P.H.W.M.; Stap, R.E.; Quartel, D.

    2011-01-01

    Conceptual information modeling is a well-established practice, aimed at preparing the implementation of information systems, the specification of electronic message formats, and the design of information processes. Today's ever more connected world however poses new challenges for conceptual

  17. The Evolution Process on Information Technology Outsourcing Relationship

    Directory of Open Access Journals (Sweden)

    Duan Weihua

    2017-01-01

    Full Text Available The information technology outsourcing relationship is one of the key issues for IT outsourcing success. To explore how to manage and promote the IT outsourcing relationship, it is necessary to understand its evolution process. Firstly, the types of IT outsourcing based on relationship quality and IT outsourcing project level are analyzed; secondly, two evolution process models of the IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT outsourcing relationship evolution process is described; finally, an IT outsourcing relationship evolution process model is developed, and the development of the IT outsourcing relationship from low to high under internal and external forces is explained.

  18. Information Systems Outsourcing Relationship Model

    Directory of Open Access Journals (Sweden)

    Richard Flemming

    2007-09-01

    Full Text Available Increasing attention is being paid to what determines the success of an information systems outsourcing arrangement. The current research aims to provide an improved understanding of the factors influencing the outcome of an information systems outsourcing relationship and to provide a preliminary validation of an extended outsourcing relationship model by interviews with information systems outsourcing professionals in both the client and vendor of a major Australian outsourcing relationship. It also investigates whether the client and the vendor perceive the relationship differently and if so, how they perceive it differently and whether the two perspectives are interrelated.

  19. Information Processing and Dynamics in Minimally Cognitive Agents

    Science.gov (United States)

    Beer, Randall D.; Williams, Paul L.

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we…

  20. The combining of multiple hemispheric resources in learning-disabled and skilled readers' recall of words: a test of three information-processing models.

    Science.gov (United States)

    Swanson, H L

    1987-01-01

    Three theoretical models (additive, independence, maximum rule) that characterize and predict the influence of independent hemispheric resources on learning-disabled and skilled readers' simultaneous processing were tested. Predictions related to word recall performance during simultaneous encoding conditions (dichotic listening task) were made from unilateral (dichotic listening task) presentations. The maximum rule model best characterized both ability groups in that simultaneous encoding produced no better recall than unilateral presentations. While the results support the hypothesis that both ability groups use similar processes in the combining of hemispheric resources (i.e., weak/dominant processing), ability group differences do occur in the coordination of such resources.
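
    One common way such combination rules are formalized is shown below purely as an illustration; the exact formulations used in the article may differ. With p_L and p_R the recall probabilities under unilateral left-ear and right-ear presentations, the predicted recall p_B under simultaneous (bilateral) encoding is

        \text{additive:}\quad p_B = \min(p_L + p_R,\ 1), \qquad
        \text{independence:}\quad p_B = 1 - (1 - p_L)(1 - p_R), \qquad
        \text{maximum rule:}\quad p_B = \max(p_L,\ p_R),

    so on the maximum-rule account, which the study reports fits both reader groups, simultaneous presentation yields no advantage over the better unilateral channel.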

  1. Generating process model collections

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2017-01-01

    Business process management plays an important role in the management of organizations. More and more organizations describe their operations as business processes. It is common for organizations to have collections of thousands of business processes, but for reasons of confidentiality these

  2. Springfield Processing Plant (SPP) Facility Information

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice; Torres, Teresa M.

    2012-10-01

    The Springfield Processing Plant is a hypothetical facility. It has been constructed for use in training workshops. Information is provided about the facility and its surroundings, particularly security-related aspects such as target identification, threat data, entry control, and response force data.

  3. Information-Theoretic Perspectives on Geophysical Models

    Science.gov (United States)

    Nearing, Grey

    2016-04-01

    To test any hypothesis about any dynamic system, it is necessary to build a model that places that hypothesis into the context of everything else that we know about the system: initial and boundary conditions and interactions between various governing processes (Hempel and Oppenheim, 1948, Cartwright, 1983). No hypothesis can be tested in isolation, and no hypothesis can be tested without a model (for a geoscience-related discussion see Clark et al., 2011). Science is (currently) fundamentally reductionist in the sense that we seek some small set of governing principles that can explain all phenomena in the universe, and such laws are ontological in the sense that they describe the object under investigation (Davies, 1990 gives several competing perspectives on this claim). However, since we cannot build perfect models of complex systems, any model that does not also contain an epistemological component (i.e., a statement, like a probability distribution, that refers directly to the quality of of the information from the model) is falsified immediately (in the sense of Popper, 2002) given only a small number of observations. Models necessarily contain both ontological and epistemological components, and what this means is that the purpose of any robust scientific method is to measure the amount and quality of information provided by models. I believe that any viable philosophy of science must be reducible to this statement. The first step toward a unified theory of scientific models (and therefore a complete philosophy of science) is a quantitative language that applies to both ontological and epistemological questions. Information theory is one such language: Cox' (1946) theorem (see Van Horn, 2003) tells us that probability theory is the (only) calculus that is consistent with Classical Logic (Jaynes, 2003; chapter 1), and information theory is simply the integration of convex transforms of probability ratios (integration reduces density functions to scalar
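
    As one concrete instance of "integrating a convex transform of a probability ratio", the information lost when a model distribution q is used in place of the distribution p of the observations can be written as the Kullback-Leibler divergence (a standard definition, added here only for illustration):

        D_{\mathrm{KL}}(p \,\|\, q) \;=\; \int p(x)\,\log \frac{p(x)}{q(x)}\; dx .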

  4. Enduring sensorimotor gating abnormalities following predator exposure or corticotropin-releasing factor in rats: a model for PTSD-like information-processing deficits?

    Science.gov (United States)

    Bakshi, Vaishali P; Alsene, Karen M; Roseboom, Patrick H; Connors, Elenora E

    2012-02-01

    emerged) by both stressors and CRF, but returned to normal control levels 24 h later, when PPI deficits were present. Thus, predator exposure produces a delayed disruption of PPI, and stimulation of CRF receptors recapitulates these effects. Contemporaneous HPA axis activation is neither necessary nor sufficient for these PPI deficits. These results indicate that predator exposure, perhaps acting through CRF, may model the delayed-onset and persistent sensorimotor gating abnormalities that have been observed clinically in PTSD, and that further studies using this model may shed insight on the mechanisms of information-processing deficits in this disorder. This article is part of a Special Issue entitled 'Post-Traumatic Stress Disorder'. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Information processing in the vertebrate habenula.

    Science.gov (United States)

    Fore, Stephanie; Palumbo, Fabrizio; Pelgrims, Robbrecht; Yaksi, Emre

    2018-06-01

    The habenula is a brain region that has gained increasing popularity over the recent years due to its role in processing value-related and experience-dependent information with a strong link to depression, addiction, sleep and social interactions. This small diencephalic nucleus is proposed to act as a multimodal hub or a switchboard, where inputs from different brain regions converge. These diverse inputs to the habenula carry information about the sensory world and the animal's internal state, such as reward expectation or mood. However, it is not clear how these diverse habenular inputs interact with each other and how such interactions contribute to the function of habenular circuits in regulating behavioral responses in various tasks and contexts. In this review, we aim to discuss how information processing in habenular circuits, can contribute to specific behavioral programs that are attributed to the habenula. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Information interfaces for process plant diagnosis

    International Nuclear Information System (INIS)

    Lind, M.

    1984-02-01

    The paper describes a systematic approach to the design of information interfaces for operator support in diagnosing complex system faults. The need to interpret primary measured plant variables within the framework of different system representations, organized into an abstraction hierarchy, is identified from an analysis of the problem of diagnosing complex systems. A formalized approach to the modelling of production systems, called Multilevel Flow Modelling (MFM), is described. An MFM model specifies plant control requirements and the associated need for plant information, and provides a consistent context for the interpretation of real-time plant signals in the diagnosis of malfunctions. The use of MFM models as a basis for functional design of the plant instrumentation system is outlined, and the use of Knowledge-Based (Expert) Systems for the design of man-machine interfaces is mentioned. Such systems would allow active user participation in diagnosis and thus provide the basis for cooperative problem solving. 14 refs. (author)

  7. Testing an alternate informed consent process.

    Science.gov (United States)

    Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra

    2009-01-01

    One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.

  8. Modeling the reemergence of information diffusion in social network

    Science.gov (United States)

    Yang, Dingda; Liao, Xiangwen; Shen, Huawei; Cheng, Xueqi; Chen, Guolong

    2018-01-01

    Information diffusion in networks is an important research topic in various fields. Existing studies either focus on modeling the process of information diffusion, e.g., the independent cascade model and the linear threshold model, or investigate information diffusion in networks with certain structural characteristics such as scale-free networks and small-world networks. However, there are still several phenomena that have not been captured by existing information diffusion models. One of the prominent phenomena is the reemergence of information diffusion, i.e., a piece of information reemerges after the completion of its initial diffusion process. In this paper, we propose an optimized information diffusion model by introducing a new informed state into the traditional susceptible-infected-removed model. We verify the proposed model via simulations in real-world social networks, and the results indicate that the model can reproduce the reemergence of information during the diffusion process.
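
    The sketch below is a toy discrete-time simulation of the general idea: an SIR-style spreading process with an extra informed-but-silent state from which nodes may later re-enter the spreading state, which can make diffusion flare up again after the first wave dies out. The states, transition probabilities and random network used here are illustrative assumptions, not the model actually proposed in the paper.

      # Toy SIR-with-informed-state simulation on a random graph (illustrative only).
      import random

      def simulate(n=2000, p_edge=0.005, beta=0.3, gamma=0.2, mu=0.1, rho=0.02,
                   steps=120, seed=7):
          """beta: infection prob per spreader-neighbour contact; gamma: spreader -> informed;
          mu: informed -> removed; rho: informed -> spreader (allows reemergence)."""
          rng = random.Random(seed)
          adj = [[] for _ in range(n)]            # simple Erdos-Renyi graph
          for i in range(n):
              for j in range(i + 1, n):
                  if rng.random() < p_edge:
                      adj[i].append(j)
                      adj[j].append(i)
          state = ["S"] * n
          state[rng.randrange(n)] = "I"
          history = []
          for _ in range(steps):
              new = list(state)
              for i in range(n):
                  if state[i] == "I":
                      for j in adj[i]:
                          if state[j] == "S" and rng.random() < beta:
                              new[j] = "I"
                      if rng.random() < gamma:
                          new[i] = "A"            # A = informed but currently silent
                  elif state[i] == "A":
                      if rng.random() < rho:
                          new[i] = "I"            # reactivation -> possible reemergence
                      elif rng.random() < mu:
                          new[i] = "R"
              state = new
              history.append(state.count("I"))
          return history

      if __name__ == "__main__":
          counts = simulate()
          print("spreaders over time:", counts[::10])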

  9. How Students Learn: Information Processing, Intellectual Development and Confrontation

    Science.gov (United States)

    Entwistle, Noel

    1975-01-01

    A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning, which are not covered by the model, are discussed in relationship to it: student's intellectual development and effects of individual…

  10. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

    Full Text Available The article examines the essence of process-oriented enterprise management and analyzes the content and types of information technology, given the complexity and differentiation of existing methods as well as the specificity of the language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and the modern modeling techniques that have found practical application in visualizing retailers' activity are studied. The theoretical analysis of the modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes due to its integrated systemological capabilities. A visualized simulation model of the retailers' "sales" business process "as is" was designed using a combination of UFO elements, with the aim of further practical formalization and optimization of the given business process.

  11. Event-related potentials and information processing

    NARCIS (Netherlands)

    Brookhuis, Karel Anton

    1989-01-01

    We set out to test the hypotheses generated by Shiffrin & Schneider's model of information processing with our new tool, the ERP. The experiments were devised to test hypotheses that were originally based on performance data alone, i.e. reaction time and errors. Although the overt behaviour was

  12. Modeling Human Information Acquisition Strategies

    NARCIS (Netherlands)

    Heuvelink, Annerieke; Klein, Michel C. A.; van Lambalgen, Rianne; Taatgen, Niels A.; Rijn, Hedderik van

    2009-01-01

    The focus of this paper is the development of a computational model for intelligent agents that decides on whether to acquire required information by retrieving it from memory or by interacting with the world. First, we present a task for which such decisions have to be made. Next, we discuss an

  13. Disjunctive Information Flow for Communicating Processes

    DEFF Research Database (Denmark)

    Li, Ximeng; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    The security validation of practical computer systems calls for the ability to specify and verify information flow policies that are dependent on data content. Such policies play an important role in concurrent, communicating systems: consider a scenario where messages are sent to different processes according to their tagging. We devise a security type system that enforces content-dependent information flow policies in the presence of communication and concurrency. The type system soundly guarantees a compositional noninterference property. All theoretical results have been formally proved...

  14. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

    Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing the informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus just on process description, providing different BP models. It is not possible to compare them and identify related daily practices in order to improve BP models. This lack of information implies that further research in BP meta-models is needed to reflect the evolution/change in BP. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAMeta-model). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper intends to present a meta-model which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  15. An analytical approach to customer requirement information processing

    Science.gov (United States)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  16. Visual Information Processing for Television and Telerobotics

    Science.gov (United States)

    Huck, Friedrich O. (Editor); Park, Stephen K. (Editor)

    1989-01-01

    This publication is a compilation of the papers presented at the NASA conference on Visual Information Processing for Television and Telerobotics. The conference was held at the Williamsburg Hilton, Williamsburg, Virginia on May 10 to 12, 1989. The conference was sponsored jointly by NASA Offices of Aeronautics and Space Technology (OAST) and Space Science and Applications (OSSA) and the NASA Langley Research Center. The presentations were grouped into three sessions: Image Gathering, Coding, and Advanced Concepts; Systems; and Technologies. The program was organized to provide a forum in which researchers from industry, universities, and government could be brought together to discuss the state of knowledge in image gathering, coding, and processing methods.

  17. Quantum information processing with trapped ions

    International Nuclear Information System (INIS)

    Haeffner, H.; Haensel, W.; Rapol, U.; Koerber, T.; Benhelm, J.; Riebe, M.; Chek-al-Kar, D.; Schmidt-Kaler, F.; Becher, C.; Roos, C.; Blatt, R.

    2005-01-01

    Single Ca+ ions and crystals of Ca+ ions are confined in a linear Paul trap and are investigated for quantum information processing. Here we report on recent experimental advancements towards a quantum computer with such a system. Laser-cooled trapped ions are ideally suited systems for the investigation and implementation of quantum information processing as one can gain almost complete control over their internal and external degrees of freedom. The combination of a Paul-type ion trap with laser cooling leads to unique properties of trapped cold ions, such as control of the motional state down to the zero-point of the trapping potential, a high degree of isolation from the environment and thus a very long time available for manipulations and interactions at the quantum level. The very same properties make single trapped atoms and ions well suited for storing quantum information in long-lived internal states, e.g. by encoding a quantum bit (qubit) of information within the coherent superposition of the S1/2 ground state and the metastable D5/2 excited state of Ca+. Recently we have achieved the implementation of simple algorithms with up to 3 qubits on an ion-trap quantum computer. We will report on methods to implement single-qubit rotations, the realization of a two-qubit universal quantum gate (the Cirac-Zoller CNOT gate), the deterministic generation of multi-particle entangled states (GHZ and W states), their full tomographic reconstruction, the realization of deterministic quantum teleportation, its quantum process tomography and the encoding of quantum information in decoherence-free subspaces with coherence times exceeding 20 seconds. (author)

  18. Quantum Information Processing with Trapped Ions

    International Nuclear Information System (INIS)

    Barrett, M.D.; Schaetz, T.; Chiaverini, J.; Leibfried, D.; Britton, J.; Itano, W.M.; Jost, J.D.; Langer, C.; Ozeri, R.; Wineland, D.J.; Knill, E.

    2005-01-01

    We summarize two experiments on the creation and manipulation of multi-particle entangled states of trapped atomic ions - quantum dense coding and quantum teleportation. The techniques used in these experiments constitute an important step toward performing large-scale quantum information processing. The techniques also have application in other areas of physics, providing improvement in quantum-limited measurement and fundamental tests of quantum mechanical principles, for example

  19. Aiming for knowledge information processing systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuchi, K

    1982-01-01

    The Fifth Generation Computer Project in Japan intends to develop a new generation of computers by extensive research in many areas. This paper discusses many research topics which the Japanese hope will lead to a radically new knowledge information processing system. Topics discussed include new computer architectures, programming styles, semantics of programming languages, relational databases, linguistics theory, artificial intelligence, functional images and inference systems.

  20. Processing Information in Quantum Decision Theory

    OpenAIRE

    Yukalov, V. I.; Sornette, D.

    2008-01-01

    A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention int...

  1. Manipulating cold atoms for quantum information processing

    International Nuclear Information System (INIS)

    Knight, P.

    2005-01-01

    Full text: I will describe how cold atoms can be manipulated to realize arrays of addressable qubits as prototype quantum registers, focussing on how atom chips can be used in combination with cavity QED techniques to form such an array. I will discuss how the array can be generated and steered using optical lattices and the Mott transition, and describe the sources of noise and how these place limits on the use of such chips in quantum information processing. (author)

  2. Topic Models in Information Retrieval

    Science.gov (United States)

    2007-08-01


  3. Quantum information processing and nuclear magnetic resonance

    International Nuclear Information System (INIS)

    Cummins, H.K.

    2001-01-01

    Quantum computers are information processing devices which operate by and exploit the laws of quantum mechanics, potentially allowing them to solve problems which are intractable using classical computers. This dissertation considers the practical issues involved in one of the more successful implementations to date, nuclear magnetic resonance (NMR). Techniques for dealing with systematic errors are presented, and a quantum protocol is implemented. Chapter 1 is a brief introduction to quantum computation. The physical basis of its efficiency and issues involved in its implementation are discussed. NMR quantum information processing is reviewed in more detail in Chapter 2. Chapter 3 considers some of the errors that may be introduced in the process of implementing an algorithm, and high-level ways of reducing the impact of these errors by using composite rotations. Novel general expressions for stabilising composite rotations are presented in Chapter 4 and a new class of composite rotations, tailored composite rotations, presented in Chapter 5. Chapter 6 describes some of the advantages and pitfalls of combining composite rotations. Experimental evaluations of the composite rotations are given in each case. An actual implementation of a quantum information protocol, approximate quantum cloning, is presented in Chapter 7. The dissertation ends with appendices which contain expansions of some equations and detailed calculations of certain composite rotation results, as well as spectrometer pulse sequence programs. (author)
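
    To give a flavour of the composite-rotation idea referred to above, the display below shows the well-known BB1 family of composite pulses; it is quoted as a generic illustration of how systematic pulse-length errors are suppressed, not as the particular sequences developed in this dissertation. Writing a resonant pulse of nominal angle theta about an axis at phase phi as

        R_\phi(\theta) \;=\; \exp\!\left[-\,\tfrac{i\theta}{2}\left(\cos\phi\,\sigma_x + \sin\phi\,\sigma_y\right)\right],

    a pulse-length error \theta \to \theta(1+\epsilon) in the simple rotation R_0(\theta) is suppressed to higher order in \epsilon by replacing it with the composite sequence

        R_{\phi_1}(\pi)\, R_{3\phi_1}(2\pi)\, R_{\phi_1}(\pi)\, R_0(\theta),
        \qquad \phi_1 = \arccos\!\left(-\tfrac{\theta}{4\pi}\right).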

  4. Building Information Modeling Comprehensive Overview

    Directory of Open Access Journals (Sweden)

    Sergey Kalinichuk

    2015-07-01

    Full Text Available The article provides a comprehensive review of the recently accelerated development of information technology in the project market, covering industrial, engineering, procurement and construction work. The author's aim is to cover the last decades of growth of information and communication technology in the construction industry, in particular Building Information Modeling, and to show that the problem of choosing an effective project realization method has not only retained its urgency but has become one of the major conditions for intensive technology development. All of this has created a strong impulse towards shortening project duration and has led to the development of various schedule compression techniques, which have become a focus of modern construction.

  5. Processing multilevel secure test and evaluation information

    Science.gov (United States)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  6. Application of information and communication technology in process reengineering

    Directory of Open Access Journals (Sweden)

    Đurović Aleksandar M.

    2014-01-01

    Full Text Available This paper examines the role of information and communication technologies in process reengineering. A general analysis of a process shows that information and communication technologies improve its efficiency. A reengineering model based on the BPMN 2.0 standard is applied to the process by which students of the Faculty of Transport and Traffic Engineering seek internships or jobs. After defining the technical characteristics and required functionalities, a web/mobile application is proposed that gives traffic engineers better visibility to companies seeking that education profile.

  7. Neuroscientific Model of Motivational Process

    OpenAIRE

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Rewa...

  8. Hierarchical process memory: memory as an integral component of information processing

    Science.gov (United States)

    Hasson, Uri; Chen, Janice; Honey, Christopher J.

    2015-01-01

    Models of working memory commonly focus on how information is encoded into and retrieved from storage at specific moments. However, in the majority of real-life processes, past information is used continuously to process incoming information across multiple timescales. Considering single unit, electrocorticography, and functional imaging data, we argue that (i) virtually all cortical circuits can accumulate information over time, and (ii) the timescales of accumulation vary hierarchically, from early sensory areas with short processing timescales (tens to hundreds of milliseconds) to higher-order areas with long processing timescales (many seconds to minutes). In this hierarchical systems perspective, memory is not restricted to a few localized stores, but is intrinsic to information processing that unfolds throughout the brain on multiple timescales. “The present contains nothing more than the past, and what is found in the effect was already in the cause.”Henri L Bergson PMID:25980649

  9. Process-aware information systems : design, enactment and analysis

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Wah, B.W.

    2009-01-01

    Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process

  10. A descriptive model of information problem solving while using internet

    NARCIS (Netherlands)

    Brand-Gruwel, Saskia; Wopereis, Iwan; Walraven, Amber

    2009-01-01

    This paper presents the IPS-I-model: a model that describes the process of information problem solving (IPS) in which the Internet (I) is used to search information. The IPS-I-model is based on three studies, in which students in secondary and (post) higher education were asked to solve information

  11. Conditioning from an information processing perspective.

    Science.gov (United States)

    Gallistel, C R.

    2003-04-28

    The framework provided by Claude Shannon's [Bell Syst. Technol. J. 27 (1948) 623] theory of information leads to a quantitatively oriented reconceptualization of the processes that mediate conditioning. The focus shifts from processes set in motion by individual events to processes sensitive to the information carried by the flow of events. The conception of what properties of the conditioned and unconditioned stimuli are important shifts from the tangible properties to the intangible properties of number, duration, frequency and contingency. In this view, a stimulus becomes a CS if its onset substantially reduces the subject's uncertainty about the time of occurrence of the next US. One way to represent the subject's knowledge of that time of occurrence is by the cumulative probability function, which has two limiting forms: (1) The state of maximal uncertainty (minimal knowledge) is represented by the inverse exponential function for the random rate condition, in which the US is equally likely at any moment. (2) The limit to the subject's attainable certainty is represented by the cumulative normal function, whose momentary expectation is the CS-US latency minus the time elapsed since CS onset. Its standard deviation is the Weber fraction times the CS-US latency.
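
    The two limiting forms described above can be written compactly as follows (a sketch in our own notation, with λ the random US rate, T the CS-US latency, t the time since CS onset and w the Weber fraction):

```latex
% Maximal uncertainty: the US occurs at a constant random rate \lambda,
% giving an inverse exponential cumulative probability.
\[
F_{\mathrm{random}}(t) = 1 - e^{-\lambda t}
\]

% Limit of attainable certainty: a fixed CS-US latency T blurred only by
% scalar timing noise (Weber fraction w); momentary expectation T - t,
% standard deviation wT.
\[
F_{\mathrm{fixed}}(t) = \Phi\!\left(\frac{t - T}{w\,T}\right)
\]
```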

  12. Information processing in decision-making systems.

    Science.gov (United States)

    van der Meer, Matthijs; Kurth-Nelson, Zeb; Redish, A David

    2012-08-01

    Decisions result from an interaction between multiple functional systems acting in parallel to process information in very different ways, each with strengths and weaknesses. In this review, the authors address three action-selection components of decision-making: The Pavlovian system releases an action from a limited repertoire of potential actions, such as approaching learned stimuli. Like the Pavlovian system, the habit system is computationally fast but, unlike the Pavlovian system, permits arbitrary stimulus-action pairings. These associations are a "forward" mechanism; when a situation is recognized, the action is released. In contrast, the deliberative system is flexible but takes time to process. The deliberative system uses knowledge of the causal structure of the world to search into the future, planning actions to maximize expected rewards. Deliberation depends on the ability to imagine future possibilities, including novel situations, and it allows decisions to be taken without having previously experienced the options. Various anatomical structures have been identified that carry out the information processing of each of these systems: hippocampus constitutes a map of the world that can be used for searching/imagining the future; dorsal striatal neurons represent situation-action associations; and ventral striatum maintains value representations for all three systems. Each system presents vulnerabilities to pathologies that can manifest as psychiatric disorders. Understanding these systems and their relation to neuroanatomy opens up a deeper way to treat the structural problems underlying various disorders.

  13. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1992-01-01

    This paper reports that recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling.

  14. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
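
    As an illustration of the two model types mentioned above, a steady-state balance for one component undergoing a first-order reaction in a column might be written as follows (a generic sketch in our own notation, not the authors' exact formulation; u is the axial velocity, D the diffusivity, k the rate constant, c the concentration, with barred quantities denoting cross-sectional averages):

```latex
% Convection-diffusion type model (axial coordinate z, radial coordinate r):
\[
u(r)\,\frac{\partial c}{\partial z}
  = D\left(\frac{\partial^{2} c}{\partial z^{2}}
  + \frac{1}{r}\,\frac{\partial}{\partial r}\!\left(r\,\frac{\partial c}{\partial r}\right)\right)
  - k\,c
\]

% Average-concentration model over the column cross-section:
\[
\bar{u}\,\frac{d\bar{c}}{dz} = \bar{D}\,\frac{d^{2}\bar{c}}{dz^{2}} - k\,\bar{c}
\]
```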

  15. Quantum Information Processing using Nonlinear Optical Effects

    DEFF Research Database (Denmark)

    Andersen, Lasse Mejling

    This PhD thesis treats applications of nonlinear optical effects for quantum information processing. The two main applications are four-wave mixing in the form of Bragg scattering (BS) for quantum-state-preserving frequency conversion, and sum-frequency generation (SFG) in second-order nonlinear......-chirping the pumps. In the high-conversion regime without the effects of NPM, exact Green functions for BS are derived. In this limit, separability is possible for conversion efficiencies up to 60 %. However, the system still allows for selective frequency conversion as well as re-shaping of the output. One way...

  16. Quantum wells for optical information processing

    International Nuclear Information System (INIS)

    Miller, D.A.B.

    1989-01-01

    Quantum wells, alternate thin layers of two different semiconductor materials, show an exceptional electric field dependence of the optical absorption, called the quantum-confined Stark effect (QCSE), for electric fields perpendicular to the layers. This enables electrically controlled optical modulators and optically controlled self-electro-optic-effect devices that can operate at high speed and low energy density. Recent developments in these QCSE devices are summarized, including new device materials and novel device structures. The variety of sophisticated devices now demonstrated is promising for applications to information processing

  17. Evolutionary relevance facilitates visual information processing.

    Science.gov (United States)

    Jackson, Russell E; Calvillo, Dusti P

    2013-11-03

    Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component to evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  18. Evolutionary Relevance Facilitates Visual Information Processing

    Directory of Open Access Journals (Sweden)

    Russell E. Jackson

    2013-07-01

    Full Text Available Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component to evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  19. Modelling the Replication Management in Information Systems

    Directory of Open Access Journals (Sweden)

    Cezar TOADER

    2017-01-01

    Full Text Available In the modern economy, the benefits of Web services are significant because they facilitate the automation of activities in Internet-distributed businesses as well as cooperation between organizations through interconnection processes running in their computer systems. This paper presents the development stages of a model for a reliable information system. It describes the communication between the processes within a distributed system, based on message exchange, and also presents the problem of distributed agreement among processes. A list of objectives for fault-tolerant systems is defined and a framework model for distributed systems is proposed. This framework makes a distinction between management operations and execution operations. The proposed model promotes the use of a central process especially designed for the coordination and control of other application processes. The execution phases and the protocols for the management and the execution components are presented. This model of a reliable system could be a foundation for an entire class of distributed system models based on the management of the replication process.

  20. Group creativity and innovation: a motivated information processing perspective

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Nijstad, B.A.; Bechtoldt, M.N.; Baas, M.

    2011-01-01

    The authors review the Motivated Information Processing in Groups Model (De Dreu, Nijstad, & Van Knippenberg, 2008) to understand group creativity and innovation. Although distinct phenomena, group creativity and innovation are both considered a function of epistemic motivation (EM; the degree to

  1. Information Processing in Adolescents with Bipolar I Disorder

    Science.gov (United States)

    Whitney, Jane; Joormann, Jutta; Gotlib, Ian H.; Kelley, Ryan G.; Acquaye, Tenah; Howe, Meghan; Chang, Kiki D.; Singh, Manpreet K.

    2012-01-01

    Background: Cognitive models of bipolar I disorder (BD) may aid in identification of children who are especially vulnerable to chronic mood dysregulation. Information-processing biases related to memory and attention likely play a role in the development and persistence of BD among adolescents; however, these biases have not been extensively…

  2. Review of "Conceptual Structures: Information Processing in Mind and Machine."

    Science.gov (United States)

    Smoliar, Stephen W.

    This review of the book, "Conceptual Structures: Information Processing in Mind and Machine," by John F. Sowa, argues that anyone who plans to get involved with issues of knowledge representation should have at least a passing acquaintance with Sowa's conceptual graphs for a database interface. (Used to model the underlying semantics of…

  3. Interactivity, Information Processing, and Learning on the World Wide Web.

    Science.gov (United States)

    Tremayne, Mark; Dunwoody, Sharon

    2001-01-01

    Examines the role of interactivity in the presentation of science news on the World Wide Web. Proposes and tests a model of interactive information processing that suggests that characteristics of users and Web sites influence interactivity, which influences knowledge acquisition. Describes use of a think-aloud method to study participants' mental…

  4. The Evolution Process on Information Technology Outsourcing Relationship

    OpenAIRE

    Duan Weihua

    2017-01-01

    Information technology outsourcing relationship is one of the key issues to IT outsourcing success. To explore how to manage and promote IT outsourcing relationship, it is necessary to understand its evolution process. Firstly, the types of IT outsourcing based on relationship quality and IT outsourcing project level will be analyzed; Secondly, two evolution process models of IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT ou...

  5. Real-time information and processing system for radiation protection

    International Nuclear Information System (INIS)

    Oprea, I.; Oprea, M.; Stoica, M.; Badea, E.; Guta, V.

    1999-01-01

    The real-time information and processing system has as its main task to record, collect, process and transmit radiation level and weather data; it is proposed for radiation protection, environmental monitoring around nuclear facilities and for civil defence. Such a system can provide mapping, databases, modelling and communication and can assess the consequences of nuclear accidents. The system incorporates a number of stationary or mobile radiation monitoring devices, a weather parameter measuring station, a GIS-based information processing center and the communication network, all running on a real-time operating system. It provides automatic on-line and off-line data collection, remote diagnostics and advanced presentation techniques, including a graphically oriented executive support, which has the ability to respond to an emergency by geographical representation of the hazard zones on the map. The system can be integrated into national or international environmental monitoring systems, being based on local intelligent measuring and transmission units, simultaneous processing and data presentation using a real-time operating system for PC and a geographical information system (GIS). Such an integrated system is composed of independent applications operating on the same computer, which is capable of improving the protection of the population and supporting decision makers' efforts, updating the remote GIS database. All information can be managed directly from the map through multilevel data retrieval and presentation, using the on-line dynamic evolution of events, environmental information, evacuation optimization, and image and voice processing.

  6. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts of the study of the behaviour of dynamic systems. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
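
    The describing equations behind one of those examples, a radionuclide decay chain, can just as well be integrated outside Simulink; the sketch below does so in Python with illustrative decay constants (not nuclide data):

```python
# Minimal sketch (plain Python rather than Simulink) of the describing
# equations for a two-member radionuclide decay chain:
#   dN1/dt = -lambda1*N1
#   dN2/dt =  lambda1*N1 - lambda2*N2
# The decay constants below are illustrative placeholders, not nuclide data.
import numpy as np
from scipy.integrate import solve_ivp

LAM1 = np.log(2) / 8.0   # parent decay constant   (1/day, illustrative)
LAM2 = np.log(2) / 2.0   # daughter decay constant (1/day, illustrative)

def decay_chain(t, n):
    n1, n2 = n
    return [-LAM1 * n1, LAM1 * n1 - LAM2 * n2]

sol = solve_ivp(decay_chain, (0.0, 30.0), [1.0e6, 0.0], dense_output=True)
for ti in np.linspace(0.0, 30.0, 7):
    n1, n2 = sol.sol(ti)
    print(f"t = {ti:5.1f} d   parent = {n1:12.1f}   daughter = {n2:12.1f}")
```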

  7. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

    Modelling and simulation are essential parts of the study of the behaviour of dynamic systems. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  8. Quantum teleportation for continuous variables and related quantum information processing

    International Nuclear Information System (INIS)

    Furusawa, Akira; Takei, Nobuyuki

    2007-01-01

    Quantum teleportation is one of the most important subjects in quantum information science. This is because quantum teleportation can be regarded as not only quantum information transfer but also a building block for universal quantum information processing. Furthermore, deterministic quantum information processing is very important for efficient processing and it can be realized with continuous-variable quantum information processing. In this review, quantum teleportation for continuous variables and related quantum information processing are reviewed from these points of view

  9. Utility-based early modulation of processing distracting stimulus information.

    Science.gov (United States)

    Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas

    2014-12-10

    Humans are selective information processors who efficiently prevent goal-inappropriate stimulus information to gain control over their actions. Nonetheless, stimuli, which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors"), frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases we applied EEG recording in a stimulus identification task, involving successive distractor-target presentation, and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information. Copyright © 2014 the authors 0270-6474/14/3416720-06$15.00/0.

  10. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts......, and messages. We outline a run-time environment for the processing of events with multiple participants....
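
    A rough object-oriented rendering of the idea that an event is both an information object and a change agent might look like the sketch below (our illustration with hypothetical names, not the authors' notation):

```python
# Illustrative sketch: an event modeled both as an information object
# (attributes such as kind and time) and as a change agent whose dispatch
# has consequences for its participants' states. Names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Order:
    state: str = "open"                      # tiny state chart: open -> paid
    def on_event(self, event: "Event") -> None:
        if event.kind == "payment_received" and self.state == "open":
            self.state = "paid"              # consequence of the event

@dataclass
class Event:
    kind: str                                # attribute
    occurred_at: datetime                    # attribute
    participants: List[Order] = field(default_factory=list)   # associations
    def dispatch(self) -> None:              # operation: message each participant
        for participant in self.participants:
            participant.on_event(self)

order = Order()
Event("payment_received", datetime.now(), [order]).dispatch()
print(order.state)                           # -> paid
```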

  11. Learning to rank for information retrieval and natural language processing

    CERN Document Server

    Li, Hang

    2014-01-01

    Learning to rank refers to machine learning techniques for training a model in a ranking task. Learning to rank is useful for many applications in information retrieval, natural language processing, and data mining. Intensive studies have been conducted on its problems recently, and significant progress has been made. This lecture gives an introduction to the area including the fundamental problems, major approaches, theories, applications, and future work.The author begins by showing that various ranking problems in information retrieval and natural language processing can be formalized as tw
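
    A toy version of the pairwise reduction hinted at above, with synthetic data and our own illustrative weights rather than anything from the book:

```python
# Toy pairwise learning-to-rank sketch: ranking is reduced to binary
# classification on feature differences of document pairs (RankNet-style
# logistic loss, plain gradient descent). Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                       # 100 documents, 5 features
true_w = np.array([1.0, 0.5, 0.0, -0.3, 2.0])       # hidden relevance weights
relevance = X @ true_w

# Build document pairs (i, j); label 1 if document i should rank above j.
diffs, labels = [], []
for i in range(0, 100, 2):
    j = i + 1
    diffs.append(X[i] - X[j])
    labels.append(1.0 if relevance[i] > relevance[j] else 0.0)
diffs, labels = np.array(diffs), np.array(labels)

w = np.zeros(5)
for _ in range(2000):                               # gradient descent on logistic loss
    p = 1.0 / (1.0 + np.exp(-diffs @ w))
    w -= 0.1 * diffs.T @ (p - labels) / len(labels)

ranking = np.argsort(-(X @ w))                      # rank documents by learned score
print("top five documents:", ranking[:5])
```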

  12. New certification process starts. Pt. 2. Application of Building Information Modeling (BIM) for the certification according to DGNB; Zertifizierungsprozess auf ''neuen Beinen''. T. 2. Anwendung von Building Informationen (BIM) zur Zertifizierung nach DGNB

    Energy Technology Data Exchange (ETDEWEB)

    Essig, Bernd [SCHOLZE Consulting GmbH, Leinfelden-Echterdingen (Germany). Geschaeftsbereich Ingenieur-Beratungsleistungen in Facility-, Informations-, Qualitaets-, Nachhaltigkeitsmanagement und Energieberatung; Ernst, Tatjana [SCHOLZE Consulting GmbH, Leinfelden-Echterdingen (Germany)

    2013-04-01

    The overall life cycle plays a significant role in certification by the German Sustainable Building Council (Stuttgart, Federal Republic of Germany). So that certification becomes much more than mere documentation, comprehensive data are requested already in an early phase of planning. Early investigation of alternatives may have a significant influence on the life cycle of a building. But where do the auditors obtain these amounts of data without overburdening the partners involved in the planning process with questions? Building Information Modelling contributes to the future enhancement of process quality through innovative information management.

  13. Role of Information Anxiety and Information Load on Processing of Prescription Drug Information Leaflets.

    Science.gov (United States)

    Bapat, Shweta S; Patel, Harshali K; Sansgiry, Sujit S

    2017-10-16

    In this study, we evaluate the role of information anxiety and information load on the intention to read information from prescription drug information leaflets (PILs). These PILs were developed based on the principles of information load and consumer information processing. This was an experimental, prospective, repeated-measures study conducted in the United States in which 360 university students (>18 years old) participated (62% response rate). Participants were presented with a scenario followed by exposure to three drug product information sources used to operationalize information load. The three sources were: (i) current practice; (ii) pre-existing one-page text only; and (iii) interventional one-page prototype PILs designed for the study. Information anxiety was measured as the anxiety experienced by the individual when encountering information. The outcome variable of intention to read PILs was defined as the likelihood that the patient will read the information provided in the leaflets. A survey questionnaire was used to capture the data and the objectives were analyzed by performing a repeated-measures MANOVA using SAS version 9.3. When compared to current practice and one-page text-only leaflets, one-page PILs had significantly lower scores on information anxiety and information load. Information anxiety and information load significantly impacted intention to read (p < 0.001). Newly developed PILs increased patients' intention to read and can help in improving the counseling services provided by pharmacists.

  14. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    Industrial precision grinding processes are cylindrical, centerless and ... Several models have been proposed and used to study grinding ..... grinding force for the two cases were 9.07237 N/mm ..... International Journal of Machine Tools &.

  15. Working memory capacity and redundant information processing efficiency.

    Science.gov (United States)

    Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R

    2015-01-01

    Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.
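
    For readers unfamiliar with the LBA model referred to above, a single trial can be simulated in a few lines; the parameter values below are invented placeholders, not estimates from this study:

```python
# Illustrative single-trial simulator for a two-choice Linear Ballistic
# Accumulator (LBA). Parameter values are placeholders.
import numpy as np

def lba_trial(v=(1.2, 0.8), s=0.3, A=0.5, b=1.0, t0=0.2, rng=None):
    """Return (choice, response_time) for one trial.

    v  : mean drift rate of each accumulator
    s  : drift-rate standard deviation
    A  : upper bound of the uniform start-point distribution
    b  : response threshold; t0 : non-decision time
    """
    rng = rng or np.random.default_rng()
    finish_times = []
    for drift_mean in v:
        start = rng.uniform(0.0, A)               # random start point
        drift = rng.normal(drift_mean, s)
        while drift <= 0.0:                       # resample non-positive drifts
            drift = rng.normal(drift_mean, s)
        finish_times.append((b - start) / drift)  # linear deterministic rise
    choice = int(np.argmin(finish_times))         # first accumulator to hit b
    return choice, t0 + min(finish_times)

rng = np.random.default_rng(1)
print([lba_trial(rng=rng) for _ in range(5)])
```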

  16. Random Matrices for Information Processing – A Democratic Vision

    DEFF Research Database (Denmark)

    Cakmak, Burak

    The thesis studies three important applications of random matrices to information processing. Our main contribution is that we consider probabilistic systems involving more general random matrix ensembles than the classical ensembles with iid entries, i.e. models that account for statistical...... dependence between the entries. Specifically, the involved matrices are invariant or fulfill a certain asymptotic freeness condition as their dimensions grow to infinity. Informally speaking, all latent variables contribute to the system model in a democratic fashion – there are no preferred latent variables...
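
    One concrete example of the kind of non-iid, invariant ensemble meant above is a symmetric matrix with a fixed spectrum conjugated by a Haar-distributed orthogonal matrix (our illustrative sketch, not one of the thesis's system models):

```python
# Sketch of an orthogonally invariant (non-iid) random matrix: a fixed
# spectrum conjugated by a Haar-distributed orthogonal matrix, so the
# matrix entries are statistically dependent.
import numpy as np

def haar_orthogonal(n, rng):
    """Haar-distributed orthogonal matrix via QR with sign correction."""
    Z = rng.normal(size=(n, n))
    Q, R = np.linalg.qr(Z)
    return Q * np.sign(np.diag(R))   # fix column signs for exact Haar measure

rng = np.random.default_rng(0)
n = 4
spectrum = np.linspace(0.1, 1.0, n)  # arbitrary fixed eigenvalues
O = haar_orthogonal(n, rng)
M = O @ np.diag(spectrum) @ O.T      # invariant under orthogonal conjugation
print(np.round(M, 3))
```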

  17. The cognitive viewpoint on information science and processing information in cognitive psychology - a vision for interdisciplinary

    Directory of Open Access Journals (Sweden)

    Shirley Guimarães Pimenta

    2012-08-01

    Full Text Available The interaction amongst the ‘user’, ‘information’, and ‘text’ is of interest to Information Science, although it has received insufficient attention in the literature. This issue is addressed by this paper, whose main purpose is to contribute to the discussion of the theoretical affinity between the cognitive viewpoint in Information Science and the information processing approach in Cognitive Psychology. Firstly, the interdisciplinary nature of Information Science is discussed and justified as a means to deepen and strengthen its theoretical framework. Such interdisciplinarity helps to avoid stagnation and keep pace with other disciplines. Secondly, the discussion takes into consideration the cognitive paradigm, which originates the cognitive viewpoint approach in Information Science. It is highlighted that the cognitive paradigm represented a change in the Social Sciences due to the shift of focus from the object and the signal to the individual. Besides that, it sheds light on the notion of models of worlds, i.e., the systems of categories and concepts that guide the interaction between the individual and his/her environment. Thirdly, the theoretical assumptions of the cognitive viewpoint approach are discussed, with emphasis on the concept of ‘information’ as resulting from cognitive processes and as related to the notion of ‘text’. This approach points out the relevance of understanding the interaction amongst users, information, and text. However, it lacks further development. Using notions which are common to both approaches, some of the gaps can be filled. Finally, the concept of ‘text’, its constituents and structures are presented from the perspective of text comprehension models and according to the information processing approach. As a concluding remark, it is suggested that bringing together the cognitive viewpoint and the information processing approach can be enriching and fruitful to both Information

  18. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  19. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time-change of a homogeneous Levy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared...
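
    The link between the cumulative hazard and survival probabilities mentioned above, in standard reduced-form notation (a sketch; τ is the default time, λ the intensity):

```latex
\[
\Lambda(t) = \int_{0}^{t} \lambda_{s}\,\mathrm{d}s,
\qquad
S(t) = \Pr(\tau > t) = \mathbb{E}\!\left[e^{-\Lambda(t)}\right]
\]
```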

  20. Information processing and dynamics in minimally cognitive agents.

    Science.gov (United States)

    Beer, Randall D; Williams, Paul L

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.
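
    The information-theoretic side of such an analysis ultimately rests on estimating quantities like the mutual information between a task variable and an internal variable; a minimal plug-in estimate on synthetic data might look like this (our sketch, unrelated to the evolved agent itself):

```python
# Plug-in (histogram) estimate of the mutual information between a binary
# "task" variable and a discretized "internal" variable, the basic quantity
# behind information-flow analyses. Data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
stimulus = rng.integers(0, 2, size=10_000)                 # task-relevant variable
internal = stimulus + rng.normal(0.0, 0.8, size=10_000)    # noisy internal readout
internal_binned = np.digitize(internal, np.linspace(-2.0, 3.0, 8))

def mutual_information_bits(x, y):
    joint, _, _ = np.histogram2d(x, y, bins=(np.unique(x).size, np.unique(y).size))
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

print(f"I(stimulus; internal) ~ {mutual_information_bits(stimulus, internal_binned):.3f} bits")
```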

  1. Development of the operational information processing platform

    International Nuclear Information System (INIS)

    Shin, Hyun Kook; Park, Jeong Seok; Baek, Seung Min; Kim, Young Jin; Joo, Jae Yoon; Lee, Sang Mok; Jeong, Young Woo; Seo, Ho Jun; Kim, Do Youn; Lee, Tae Hoon

    1996-02-01

    The Operational Information Processing Platform (OIPP) is a platform system designed to provide development and operation environments for plant operation and plant monitoring. It is based on the Plant Computer Systems (PCS) of the Yonggwang 3 and 4, Ulchin 3 and 4, and Yonggwang 5 and 6 Nuclear Power Plants (NPP). A UNIX-based workstation, a real-time kernel and a graphics design tool were selected and installed after reviewing the functions of the PCS. In order to construct a development environment for open system architecture and distributed computer systems, an open computer system architecture was adopted both in hardware and software. For verification of the system design and evaluation of technical methodologies, the PCS running under the OIPP is being designed and implemented. In this system, the man-machine interface and system functions are being designed and implemented to evaluate the differences between the UCN 3, 4 PCS and the OIPP. 15 tabs., 32 figs., 11 refs. (Author)

  2. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    ... secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. ... by ARL modelers. 2. Development Environment: The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected ... because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In

  3. Quantum information processing with optical vortices

    Energy Technology Data Exchange (ETDEWEB)

    Khoury, Antonio Z. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil)

    2012-07-01

    Full text: In this work we discuss several proposals for quantum information processing using the transverse structure of paraxial beams. Different techniques for production and manipulation of optical vortices have been employed and combined with polarization transformations in order to investigate fundamental properties of quantum entanglement as well as to propose new tools for quantum information processing. As an example, we have recently proposed and demonstrated a controlled NOT (CNOT) gate based on a Michelson interferometer in which the photon polarization is the control bit and the first order transverse mode is the target. The device is based on a single lens design for an astigmatic mode converter that transforms the transverse mode of paraxial optical beams. In analogy with Bell's inequality for two-qubit quantum states, we propose an inequality criterion for the non-separability of the spin-orbit degrees of freedom of a laser beam. A definition of separable and non-separable spin-orbit modes is used in consonance with the one presented in Phys. Rev. Lett. 99, 2007. As the usual Bell's inequality can be violated for entangled two-qubit quantum states, we show both theoretically and experimentally that the proposed spin-orbit inequality criterion can be violated for non-separable modes. The inequality is discussed both in the classical and quantum domains. We propose a polarization to orbital angular momentum teleportation scheme using entangled photon pairs generated by spontaneous parametric down conversion. By making a joint detection of the polarization and angular momentum parity of a single photon, we are able to detect all the Bell-states and perform, in principle, perfect teleportation from a discrete to a continuous system using minimal resources. The proposed protocol implementation demands experimental resources that are currently available in quantum optics laboratories. (author)
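
    As a purely numerical illustration of the logic that the spin-orbit CNOT gate above implements (polarization as the control qubit, the first-order transverse mode as the target; the basis ordering is our own convention):

```python
# Numerical sketch of CNOT logic: control qubit = polarization, target qubit
# = first-order transverse mode. Basis order |00>, |01>, |10>, |11> is our
# own convention for this illustration.
import numpy as np

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket0 = np.array([1, 0], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # control in (|0>+|1>)/sqrt(2)

state = np.kron(plus, ket0)           # control (x) target
entangled = CNOT @ state              # -> (|00> + |11>)/sqrt(2), a Bell state
print(np.round(entangled, 3))
```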

  4. Information Support of Processes in Warehouse Logistics

    Directory of Open Access Journals (Sweden)

    Gordei Kirill

    2013-11-01

    Full Text Available In the conditions of globalization and worldwide economic relations, the role of information support of business processes increases in various branches and fields of activity, and warehouse activity is no exception. Such information support is realized in warehouse logistic systems. In relation to an administrative-territorial entity, the warehouse logistic system takes the form of a complex social and economic structure which controls the economic flows covering the intermediary, trade and transport organizations and the enterprises of other branches and spheres. The spatial movement of inventory items places new demands on the participants of merchandising. Warehousing, in the sense of storage, is one of the operations that make up logistic activity, concerned with the organization of a material stream. Understanding warehousing as the "management of spatial movement of stocks" is therefore justified. In this understanding, warehousing tries to shed its perception as the mere holding of stocks, a costly business expense. This aspiration finds reflection in logistic systems working by principles such as "just in time", "economical production" and others. Therefore, the role of warehouses as places of storage is transformed into an understanding of warehousing as an innovative logistic system.

  5. Natural language processing and advanced information management

    Science.gov (United States)

    Hoard, James E.

    1989-01-01

    Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides an natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object oriented data base (OODB). OODBs have, apparently, an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system and an inherent capacity to use them in an effective way.

  6. Analytic information processing style in epilepsy patients.

    Science.gov (United States)

    Buonfiglio, Marzia; Di Sabato, Francesco; Mandillo, Silvia; Albini, Mariarita; Di Bonaventura, Carlo; Giallonardo, Annateresa; Avanzini, Giuliano

    2017-08-01

    Relevant to the study of epileptogenesis is learning processing, given the pivotal role that neuroplasticity assumes in both mechanisms. Recently, evoked potential analyses showed a link between analytic cognitive style and altered neural excitability in both migraine and healthy subjects, regardless of cognitive impairment or psychological disorders. In this study we evaluated analytic/global and visual/auditory perceptual dimensions of cognitive style in patients with epilepsy. Twenty-five cryptogenic temporal lobe epilepsy (TLE) patients matched with 25 idiopathic generalized epilepsy (IGE) sufferers and 25 healthy volunteers were recruited and participated in three cognitive style tests: "Sternberg-Wagner Self-Assessment Inventory", the C. Cornoldi test series called AMOS, and the Mariani Learning style Questionnaire. Our results demonstrate a significant association between analytic cognitive style and both IGE and TLE and respectively a predominant auditory and visual analytic style (ANOVA: p values <0.0001). These findings should encourage further research to investigate information processing style and its neurophysiological correlates in epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  8. Towards the understanding of network information processing in biology

    Science.gov (United States)

    Singh, Vijay

    Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from the first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can be identified using coarse graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse graining approaches identify features that are essential for certain processes performed by underlying biological networks. We find that long-range connections in the brain allow for global scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with temporal scales. Our observations indicate that the rules in multivariate signal processing are quite different from traditional single unit signal processing.

  9. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. Previous experiences of the use of this notation in process modelling within Pathology, in Spain or in other countries, are not known. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  10. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

    Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information and automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  11. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  12. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords: process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurement, Regulation

  13. Key drivers of the e-waste recycling system: Assessing and modelling e-waste processing in the informal sector in Delhi

    International Nuclear Information System (INIS)

    Streicher-Porte, Martin; Widmer, Rolf; Jain, Amit; Bader, Hans-Peter; Scheidegger, Ruth; Kytzia, Susanne

    2005-01-01

    The management and recycling of waste electrical and electronic equipment (WEEE) was assessed in the city of Delhi, India. To do this, the personal computer was defined as the tracer item for which a model was designed. The model depicts the entire life cycle of the tracer, from production through sale and consumption-including reuse and refurbishment-to material recovery in the mainly informal recycling industry. The field work included interviews with the relevant stakeholders, transect walks and a literature study, followed by a software-supported material flow analysis (MFA) of the whole life-cycle chain of the tracer item. In addition to the MFA, several economic aspects of the recycling system were investigated. The study revealed that the life span of a personal computer has considerable influence upon the system, most notably in the following two aspects: (i) a prolonged life span creates value by means of refurbishing and upgrading activities, and (ii) it slows down the flow rate of the whole system. This is one of the simplest ways of preventing an uncontrolled increase in environmentally hazardous emissions by the recycling sector. The material recovery of the system is mainly driven by the precious metal content of personal computers. A first estimate showed that precious metal recovery contributes over 80% of the market value of personal computer materials, despite the small quantities found in computers.
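
    As an illustrative aside (not taken from the cited study), the core accounting step of such a tracer-based material flow analysis can be sketched as a simple mass balance over the life-cycle stages; every flow name and number below is hypothetical.

```python
# Hypothetical yearly mass balance for a tracer item (personal computers),
# following the life-cycle stages named in the record: sales, reuse and
# refurbishment, and (informal) recycling. All figures are made up.
def tracer_flow(units_sold, lifespan_years, refurb_share, recycling_share,
                metal_g_per_unit, metal_price_per_g):
    """Return the yearly end-of-life flow and the market value of the
    precious metals recovered from it; a longer lifespan slows the flow."""
    end_of_life = units_sold / lifespan_years            # steady-state outflow
    refurbished = end_of_life * refurb_share             # value kept in use
    recycled = (end_of_life - refurbished) * recycling_share
    metal_value = recycled * metal_g_per_unit * metal_price_per_g
    return {"end_of_life_units": end_of_life,
            "refurbished_units": refurbished,
            "recycled_units": recycled,
            "precious_metal_value": metal_value}

# Doubling the lifespan halves the flow entering the recycling sector.
for years in (4, 8):
    print(years, tracer_flow(100_000, years, 0.3, 0.9, 0.2, 40.0))
```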

  14. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model of raw-material supply for processing enterprises that belong to a vertically integrated structure for the production and processing of dairy raw materials is developed. The model is oriented towards achieving a cumulative effect for the integrated structure, which acts as the criterion function; this effect is maximized by optimizing capacities, the volumes and quality characteristics of raw-material deliveries, the costs of industrial processing of the raw materials, and the demand for dairy products.

  15. Information processing by networks of quantum decision makers

    Science.gov (United States)

    Yukalov, V. I.; Yukalova, E. P.; Sornette, D.

    2018-02-01

    We suggest a model of a multi-agent society of decision makers who take decisions based on two criteria: the utility of the prospects and the attractiveness of the considered prospects. The model generalizes quantum decision theory, developed earlier for single decision makers making one-step decisions, in two principal aspects. First, several decision makers are considered simultaneously, who interact with each other through information exchange. Second, a multistep procedure is treated, in which the agents exchange information many times. Several decision makers exchanging information and forming their judgment using quantum rules form a kind of quantum information network, where collective decisions develop in time as a result of information exchange. In addition to characterizing collective decisions that arise in human societies, such networks can describe dynamical processes occurring in artificial quantum intelligence composed of several parts or in a cluster of quantum computers. The practical usage of the theory is illustrated on the dynamic disjunction effect, for which three quantitative predictions are made: (i) the probabilistic behavior of decision makers at the initial stage of the process is described; (ii) the decrease of the difference between the initial prospect probabilities and the related utility factors is proved; (iii) the existence of a common consensus after multiple exchanges of information is predicted. The predicted numerical values are in very good agreement with empirical data.

  16. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative haz...

  17. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  18. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument implies that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
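
    As an illustrative aside (not part of the cited chapter), the idea of complexity metrics as candidate error determinants can be made concrete with a few generic size and connectivity indicators computed over a process graph; the metric names below are stand-ins, not Mendling's actual definitions.

```python
# Generic structural metrics for a process model given as nodes and arcs.
def process_model_metrics(nodes, arcs):
    """Return simple size/complexity indicators for a process graph."""
    out_deg = {v: 0 for v in nodes}
    in_deg = {v: 0 for v in nodes}
    for src, dst in arcs:
        out_deg[src] += 1
        in_deg[dst] += 1
    # Connectors are nodes that split or join control flow.
    connectors = [v for v in nodes if out_deg[v] > 1 or in_deg[v] > 1]
    avg_conn_degree = (sum(in_deg[v] + out_deg[v] for v in connectors) /
                       len(connectors)) if connectors else 0.0
    return {"size": len(nodes),
            "arcs": len(arcs),
            "density": len(arcs) / len(nodes) if nodes else 0.0,
            "avg_connector_degree": avg_conn_degree}

# Toy model: start -> A -> split -> (B | C) -> join -> end
nodes = ["start", "A", "split", "B", "C", "join", "end"]
arcs = [("start", "A"), ("A", "split"), ("split", "B"), ("split", "C"),
        ("B", "join"), ("C", "join"), ("join", "end")]
print(process_model_metrics(nodes, arcs))
```

    Testing the hypothesis then amounts to checking whether such indicators correlate with observed error rates across a collection of models.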

  19. Using Interaction Scenarios to Model Information Systems

    DEFF Research Database (Denmark)

    Bækgaard, Lars; Bøgh Andersen, Peter

    The purpose of this paper is to define and discuss a set of interaction primitives that can be used to model the dynamics of socio-technical activity systems, including information systems, in a way that emphasizes structural aspects of the interaction that occurs in such systems. The primitives are based on a unifying, conceptual definition of the disparate interaction types - a robust model of the types. The primitives can be combined and may thus represent mediated interaction. We present a set of visualizations that can be used to define multiple related interactions, and we present and discuss a number of case studies that indicate that interaction primitives can be useful modeling tools for supplementing conventional flow-oriented modeling of business processes.

  20. Behavioral conformance of artifact-centric process models

    NARCIS (Netherlands)

    Fahland, D.; Leoni, de M.; Dongen, van B.F.; Aalst, van der W.M.P.; Abramowicz, W.

    2011-01-01

    The use of process models in business information systems for analysis, execution, and improvement of processes assumes that the models describe reality. Conformance checking is a technique to validate how good a given process model describes recorded executions of the actual process. Recently,

  1. Parsimonious Language Models for Information Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Robertson, Stephen; Zaragoza, Hugo

    We systematically investigate a new approach to estimating the parameters of language models for information retrieval, called parsimonious language models. Parsimonious language models explicitly address the relation between levels of language models that are typically used for smoothing. As such,
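
    As an illustrative aside, the parsimony idea can be sketched with an EM-style re-estimation that concentrates a document model on terms the background (collection) model does not already explain; the exact update rules shown here are an assumption of this sketch, not quoted from the paper.

```python
from collections import Counter

def parsimonious_lm(doc_tokens, collection_tokens, lam=0.5, iters=20, prune=1e-4):
    """EM-style sketch: keep only document terms that are poorly explained by
    the background model P(t|C); negligible probabilities are pruned away."""
    tf = Counter(doc_tokens)
    cf = Counter(collection_tokens)
    c_total = sum(cf.values())
    p_c = {t: cf[t] / c_total for t in cf}                 # background model
    p_d = {t: n / len(doc_tokens) for t, n in tf.items()}  # initial MLE
    for _ in range(iters):
        # E-step: expected term counts attributed to the document model.
        e = {t: tf[t] * lam * p_d.get(t, 0.0) /
                (lam * p_d.get(t, 0.0) + (1 - lam) * p_c.get(t, 1e-12))
             for t in tf}
        total = sum(e.values()) or 1.0
        # M-step: renormalise and prune negligible terms.
        p_d = {t: v / total for t, v in e.items() if v / total > prune}
    return p_d

doc = "information retrieval with parsimonious language models".split()
background = ("the of and with information systems data retrieval models " * 50).split()
print(parsimonious_lm(doc, background))
```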

  2. Organizational restructuring in response to changes in information-processing technology

    OpenAIRE

    Andrzej Baniak; Jacek Cukrowski

    1999-01-01

    This paper examines the effects of changes in information-processing technology on the efficient organizational forms of data-processing in decision-making systems. Data-processing is modelled in the framework of the dynamic parallel processing model of associative computation with endogenous set-up costs of the processors. In such a model, the conditions for efficient organization of information-processing are defined and the architecture of the efficient structures is considered. It is s...

  3. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summed up in the article. These problems are characteristic of various areas of human activity, in particular of problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  4. Declarative modeling for process supervision

    International Nuclear Information System (INIS)

    Leyval, L.

    1989-01-01

    Our work is a contribution to computer-aided supervision of continuous processes. It is inspired by an area of Artificial Intelligence: qualitative physics. Here, supervision is based on a model which continuously provides operators with a synthetic view of the process; but this model is founded on general principles of control theory rather than on physics. It involves concepts such as high gain or small time response. It helps in linking temporally the evolution of various variables. Moreover, the model provides predictions of the future behaviour of the process, which allows action advice and alarm filtering. This should greatly reduce the notorious cognitive overload associated with any complex and dangerous evolution of the process.

  5. Models of organisation and information system design | Mohamed ...

    African Journals Online (AJOL)

    We devote this paper to the models of organisation, and see which is best suited to provide a basis for information processing and transmission. In this respect we shall be dealing with four models of organisation, namely: the classical model, the behavioural model, the systems model and the cybernetic model of ...

  6. Aligning Business Process Quality and Information System Quality

    OpenAIRE

    Heinrich, Robert

    2013-01-01

    Business processes and information systems mutually affect each other in non-trivial ways. Frequently, the business process design and the information system design are not well aligned. This means that business processes are designed without taking the information system impact into account, and vice versa. Missing alignment at design time often results in quality problems at runtime, such as large response times of information systems, large process execution times, overloaded information s...

  7. Acquisition of Computers That Process Corporate Information

    National Research Council Canada - National Science Library

    Gimble, Thomas

    1999-01-01

    The Secretary of Defense announced the Corporate Information Management initiative on November 16, 1990, to establish a DoD-wide concept for managing computer, communications, and information management functions...

  8. Refreshing Information Literacy: Learning from Recent British Information Literacy Models

    Science.gov (United States)

    Martin, Justine

    2013-01-01

    Models play an important role in helping practitioners implement and promote information literacy. Over time models can lose relevance with the advances in technology, society, and learning theory. Practitioners and scholars often call for adaptations or transformations of these frameworks to articulate the learning needs in information literacy…

  9. On the correlation between process model metrics and errors

    NARCIS (Netherlands)

    Mendling, J.; Neumann, G.; Aalst, van der W.M.P.; Grundy, J.; Hartmann, S.; Laender, S.; Maciaszek, L.; Roddick, J.F.

    2007-01-01

    Business process models play an important role for the management, design, and improvement of process organizations and process-aware information systems. Despite the extensive application of process modeling in practice there are hardly empirical results available on quality aspects of process

  10. On the Intensification of Information Protection Processes

    Directory of Open Access Journals (Sweden)

    A. A. Malyuk

    2011-03-01

    Full Text Available The article discusses the features of the information protection task in its modern formulation as a complex problem that encompasses all aspects of information technology development. Such an interpretation inevitably increases the role of systemic problems, the solution of which relies on an advanced scientific and methodological basis: the so-called intensification of information protection processes.

  11. Building information modelling (BIM): now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2015-10-01

    Full Text Available Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with focus on its core concepts, applications in the project life cycle and benefits for project stakeholders with the help of case studies. The paper also elaborates risks and barriers to BIM implementation and future trends.

  12. Building information modelling (BIM): now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2012-12-01

    Full Text Available Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with focus on its core concepts, applications in the project life cycle and benefits for project stakeholders with the help of case studies. The paper also elaborates risks and barriers to BIM implementation and future trends.

  13. Item Information in the Rasch Model

    NARCIS (Netherlands)

    Engelen, Ron J.H.; van der Linden, Willem J.; Oosterloo, Sebe J.

    1988-01-01

    Fisher's information measure for the item difficulty parameter in the Rasch model and its marginal and conditional formulations are investigated. It is shown that expected item information in the unconditional model equals information in the marginal model, provided the assumption of sampling
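
    As a brief aside (standard Rasch-model algebra rather than material from the record itself, with notation assumed here), the quantities involved can be written out explicitly:

```latex
% P_i is the probability that a person with ability \theta answers an item of
% difficulty b_i correctly; I_i is the Fisher information a single response
% contributes about the difficulty parameter b_i.
\[
  P_i(\theta) = \frac{\exp(\theta - b_i)}{1 + \exp(\theta - b_i)},
  \qquad
  I_i(b_i \mid \theta) = P_i(\theta)\,\bigl(1 - P_i(\theta)\bigr),
\]
% A response is therefore most informative about b_i when P_i(\theta) \approx 0.5,
% i.e. when the item difficulty is close to the respondent's ability.
```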

  14. Promoting information diffusion through interlayer recovery processes in multiplex networks

    Science.gov (United States)

    Wang, Xin; Li, Weihua; Liu, Longzhao; Pei, Sen; Tang, Shaoting; Zheng, Zhiming

    2017-09-01

    For information diffusion in multiplex networks, the effect of interlayer contagion on spreading dynamics has been explored in different settings. Nevertheless, the impact of interlayer recovery processes, i.e., the transition of nodes to stiflers in all layers after they become stiflers in any layer, still remains unclear. In this paper, we propose a modified ignorant-spreader-stifler model of rumor spreading equipped with an interlayer recovery mechanism. We find that information diffusion can be effectively promoted for a range of interlayer recovery rates. By combining the mean-field approximation and the Markov chain approach, we derive the evolution equations of the diffusion process in two-layer homogeneous multiplex networks. The optimal interlayer recovery rate that achieves the maximal enhancement can be calculated by solving the equations numerically. In addition, we find that the promoting effect on a certain layer can be strengthened if information spreads more extensively within the counterpart layer. When applying the model to two-layer scale-free multiplex networks, with or without degree correlation, a similar promoting effect is also observed in simulations. Our work indicates that the interlayer recovery process is beneficial to information diffusion in multiplex networks, which may have implications for designing efficient spreading strategies.
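
    As an illustrative aside (a generic mean-field-style sketch, not the evolution equations derived in the paper), the qualitative setup can be mimicked with a toy two-layer ignorant-spreader-stifler model in which becoming a stifler in one layer accelerates recovery in the other; the coupling form and all parameter values are assumptions, and this toy model need not reproduce the promoting effect reported by the authors.

```python
import numpy as np

def simulate(lmbda=0.3, delta=0.1, mu=0.05, k=6, T=200.0, dt=0.1):
    """Euler integration of a toy two-layer ignorant(i)/spreader(s)/stifler(r)
    model. mu is the assumed interlayer recovery rate: spreaders in one layer
    become stiflers at an extra rate proportional to the stifler density in
    the counterpart layer."""
    i = np.array([0.99, 0.99])
    s = np.array([0.01, 0.01])
    r = np.array([0.00, 0.00])
    for _ in range(int(T / dt)):
        other_r = r[::-1]                        # stiflers in the other layer
        spread = lmbda * k * i * s               # ignorants reached by spreaders
        recover = delta * s + mu * s * other_r   # intra- plus interlayer recovery
        i = i - dt * spread
        s = s + dt * (spread - recover)
        r = r + dt * recover
    return 1.0 - i                               # fraction ever informed per layer

for mu in (0.0, 0.05, 0.2):
    print(mu, simulate(mu=mu))
```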

  15. Development of technical information processing system (VII)

    International Nuclear Information System (INIS)

    Kim, Tae Whan; Choi, Kwang; Oh, Jeong Hoon; Jeong, Hyun Sook; Keum, Jong Yong

    1995-12-01

    The goal of this project is to establish an integrated environment focused on enhanced information services to researchers, by providing acquisition information, a key-phrase retrieval function, and journal content information linked with the various subsystems already developed. The results of the project are as follows. 1. Information on unreceivable materials among required materials can be served throughout the system. 2. Retrieval efficiency is increased by the addition of a key-phrase retrieval function. 3. The rapidity of the information service is enhanced by providing the journal contents of each issue received, and the performance of the contents service is improved. 4. Technical information needed in R and D can be acquired, stored and served synthetically and systematically through the development of a total system linked with the various subsystems required for technical information management and service. 21 refs. (Author)

  16. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...

  17. Levy Random Bridges and the Modelling of Financial Information

    OpenAIRE

    Hoyle, Edward; Hughston, Lane P.; Macrina, Andrea

    2009-01-01

    The information-based asset-pricing framework of Brody, Hughston and Macrina (BHM) is extended to include a wider class of models for market information. In the BHM framework, each asset is associated with a collection of random cash flows. The price of the asset is the sum of the discounted conditional expectations of the cash flows. The conditional expectations are taken with respect to a filtration generated by a set of "information processes". The information processes carry imperfect inf...

  18. Neuroscientific Model of Motivational Process

    Science.gov (United States)

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  19. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  20. Model Information Exchange System (MIXS).

    Science.gov (United States)

    2013-08-01

    Many travel demand forecast models operate at state, regional, and local levels. While they share the same physical network in overlapping geographic areas, they use different and uncoordinated modeling networks. This creates difficulties for models ...

  1. Introduction to spiking neural networks: Information processing, learning and applications.

    Science.gov (United States)

    Ponulak, Filip; Kasinski, Andrzej

    2011-01-01

    The concept that neural information is encoded in the firing rate of neurons has been the dominant paradigm in neurobiology for many years. This paradigm has also been adopted by the theory of artificial neural networks. Recent physiological experiments demonstrate, however, that in many parts of the nervous system, neural code is founded on the timing of individual action potentials. This finding has given rise to the emergence of a new class of neural models, called spiking neural networks. In this paper we summarize basic properties of spiking neurons and spiking networks. Our focus is, specifically, on models of spike-based information coding, synaptic plasticity and learning. We also survey real-life applications of spiking models. The paper is meant to be an introduction to spiking neural networks for scientists from various disciplines interested in spike-based neural processing.
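
    As an illustrative aside (not drawn from the cited survey), the elementary building block of such models can be sketched with a leaky integrate-and-fire (LIF) neuron, in which information is carried by the timing of threshold crossings; all parameter values below are arbitrary.

```python
import numpy as np

def lif_neuron(input_current, dt=0.1, tau=10.0, v_rest=-65.0,
               v_reset=-65.0, v_thresh=-50.0, r_m=10.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks toward
    rest, integrates the input, and emits a spike whenever it crosses the
    threshold, after which it is reset. Parameters are illustrative only."""
    v = v_rest
    spike_times = []
    for step, i_ext in enumerate(input_current):
        v += (-(v - v_rest) + r_m * i_ext) * dt / tau
        if v >= v_thresh:                 # threshold crossing -> spike
            spike_times.append(step * dt)
            v = v_reset                   # reset after the action potential
    return spike_times

# A constant suprathreshold drive over 100 ms yields a regular spike train.
times = lif_neuron(np.full(1000, 2.0))
print(len(times), times[:3])
```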

  2. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic-mathematical model is drawn up in the following stages: formulation of the problem, analysis of the process to be modeled, production of the model and design verification, and validation and implementation of the model. This article presents an economic model whose modeling uses mathematical equations and the MatLab software package, which helps us to approximate an effective solution. The input data considered are the net cost, the direct cost and the total cost, and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphic representation and an interpretation of the results achieved in terms of our specific problem.

  3. Modelling a uranium ore bioleaching process

    International Nuclear Information System (INIS)

    Chien, D.C.H.; Douglas, P.L.; Herman, D.H.; Marchbank, A.

    1990-01-01

    A dynamic simulation model for the bioleaching of uranium ore in a stope leaching process has been developed. The model incorporates design and operating conditions, reaction kinetics enhanced by the Thiobacillus ferrooxidans present in the leaching solution, and transport properties. Model predictions agree well with experimental data, with an average deviation of about ± 3%. The model is sensitive to small errors in the estimates of fragment size and ore grade. Because accurate estimates are difficult to obtain, a parameter estimation approach was developed to update the values of fragment size and ore grade using on-line plant information.

  4. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    In this paper the basic principles of path modeling are presented. The mathematics is presented for processes having only one stage, having two stages, and having three or more stages. The methods are applied to process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. The estimation can be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multi-variate variation in data.

  5. NEA, Nuclear law and information processing

    International Nuclear Information System (INIS)

    Reyners, P.

    1977-01-01

    NEA has for many years now been collating information on, and analysing, laws and regulations on the peaceful uses of nuclear energy, and this work has resulted in a series of publications. However, as seen by the multiplication of computer-based legal information centres, both at national and international level, conventional information systems are no longer adequate to deal with the increasing volume of information and with users' needs. In view of the particular aspects of nuclear law and of its own availabilities, NEA has endeavoured to make the best possible use of existing structures by opting for participation in the IAEA International Nuclear Information System rather than by creating a specialised centre. Before becoming operational, the arrangements concluded between NEA and IAEA required that the INIS rules be altered somewhat to take account of the specific problems raised by treatment of legal literature and also to improve the quality of information provided to users. (auth.) [fr

  6. Integrating Geographical Information Systems, Fuzzy Logic and Analytical Hierarchy Process in Modelling Optimum Sites for Locating Water Reservoirs. A Case Study of the Debub District in Eritrea

    Directory of Open Access Journals (Sweden)

    Rodney G. Tsiko

    2011-03-01

    Full Text Available The aim of this study was to model water reservoir site selection for a real-world application in the administrative district of Debub, Eritrea. This is a region where scarcity of water is a fundamental problem. Erratic rainfall, drought and unfavourable hydro-geological characteristics exacerbate the region’s water supply problems. Consequently, the population of Debub is facing severe water shortages, and building reservoirs has been promoted as a possible solution to meet the future demand for water supply. This was the most powerful motivation to identify candidate sites for locating water reservoirs. A number of conflicting qualitative and quantitative criteria exist for evaluating alternative sites. Decisions regarding criteria are often accompanied by ambiguities and vagueness, which makes fuzzy logic a more natural approach to this kind of Multi-criteria Decision Analysis (MCDA) problem. This paper proposes a combined two-stage MCDA methodology. The first stage involved utilizing the most simplistic type of data aggregation technique, known as Boolean Intersection or logical AND, to identify areas restricted by environmental and hydrological constraints and therefore excluded from further study. The second stage involved integrating fuzzy logic with the Analytic Hierarchy Process (AHP) to identify optimum and back-up candidate water reservoir sites in the area designated for further study.
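
    As an illustrative aside (the AHP mechanics are standard, but the criteria and judgments below are hypothetical, not those used in the study), the weight-derivation step of the second stage can be sketched as follows; in the study such weights would then be combined with fuzzy membership layers inside the GIS, a step omitted here.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive criterion weights from an AHP pairwise-comparison matrix via the
    principal eigenvector, and report the consistency ratio (CR)."""
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()
    n = a.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # tabulated random index
    return w, ci / ri

# Hypothetical judgments for three siting criteria:
# slope vs. distance to river vs. land cover.
matrix = [[1.0, 3.0, 5.0],
          [1/3, 1.0, 2.0],
          [1/5, 1/2, 1.0]]
weights, cr = ahp_weights(matrix)
print(weights.round(3), round(cr, 3))   # CR well below ~0.1 indicates consistent judgments
```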

  7. Modeling interdependencies between business and communication processes in hospitals.

    Science.gov (United States)

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available that specialize in modeling information-driven business processes and the consequences for the communication between information processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  8. Informational and Causal Architecture of Discrete-Time Renewal Processes

    Directory of Open Access Journals (Sweden)

    Sarah E. Marzen

    2015-07-01

    Full Text Available Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use the resulting formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model, but with an infinite-state ϵ-machine presentation. All in all, the results lay the groundwork for analyzing more complex processes with infinite statistical complexity and infinite excess entropy.
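
    As a brief aside (these are the standard computational-mechanics definitions, with notation assumed here rather than copied from the article), the two central quantities are:

```latex
% Statistical complexity = Shannon entropy of the causal-state distribution;
% excess entropy = mutual information between the semi-infinite past and future.
\[
  C_\mu \;=\; -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma)\,\log_2 \Pr(\sigma),
  \qquad
  \mathbf{E} \;=\; I\bigl[\,\overleftarrow{X}\,;\,\overrightarrow{X}\,\bigr],
\]
% so a renewal process has finite statistical complexity exactly when its
% causal-state set \mathcal{S} is finite, even if the interevent count
% distribution is unbounded.
```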

  9. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality and also focus on the specific components.

  10. Imperfect Information in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.

    2007-01-01

    The process of designing high-quality software systems is one of the major issues in software engineering research. Over the years, this has resulted in numerous design methods, each with specific qualities and drawbacks. For example, the Rational Unified Process is a comprehensive design process,

  11. Information Systems’ Portfolio: Contributions of Enterprise and Process Architecture

    Directory of Open Access Journals (Sweden)

    Silvia Fernandes

    2017-09-01

    Full Text Available We are witnessing a need for quick and intelligent reactions from organizations to the level and speed of change in business processes. New information technologies and systems (IT/IS) are challenging business models and products. One of the great disruptions comes from online and/or mobile apps and platforms. These are having a tremendous impact by launching innovative and competitive services through the combination of digital and physical features. This leads organizations to actively rethink the enterprise information systems’ portfolio, its management and suitability. One relevant way for enterprises to manage their IT/IS in order to cope with those challenges is enterprise and process architecture. A decision-making culture based on processes helps to understand and define the different elements that shape an organization and how those elements inter-relate inside and outside it. IT/IS portfolio management increasingly requires the modeling of data and process flows for better discernment and action in their selection and alignment with business goals. The new generation of enterprise architecture (NGEA) helps to design intelligent processes that answer quickly and creatively to new and challenging trends. It has to be open, agile and context-aware to allow well-designed services that match users’ expectations. This study includes two real cases/problems to be solved quickly in companies, and solutions are presented in line with this architectural approach.

  12. Essays on Imperfect Information Processing in Economics

    NARCIS (Netherlands)

    S.S. Ficco (Stefano)

    2007-01-01

    Economic agents generally operate in uncertain environments and, prior to making decisions, invest time and resources to collect useful information. Consumers compare the prices charged by different firms before purchasing a product. Politicians gather information from different

  13. Analytic Hierarchy Process for Personalising Environmental Information

    Science.gov (United States)

    Kabassi, Katerina

    2014-01-01

    This paper presents how a Geographical Information System (GIS) can be incorporated in an intelligent learning software system for environmental matters. The system is called ALGIS and incorporates the GIS in order to present information about the physical and anthropogenic environment of Greece effectively and in a more interactive way. The system…

  14. Directory of Energy Information Administration Models 1994

    International Nuclear Information System (INIS)

    1994-07-01

    This directory revises and updates the 1993 directory and includes 15 models of the National Energy Modeling System (NEMS). Three other new models in use by the Energy Information Administration (EIA) have also been included: the Motor Gasoline Market Model (MGMM), Distillate Market Model (DMM), and the Propane Market Model (PPMM). This directory contains descriptions about each model, including title, acronym, purpose, followed by more detailed information on characteristics, uses and requirements. Sources for additional information are identified. Included in this directory are 37 EIA models active as of February 1, 1994

  15. Directory of Energy Information Administration Models 1994

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    This directory revises and updates the 1993 directory and includes 15 models of the National Energy Modeling System (NEMS). Three other new models in use by the Energy Information Administration (EIA) have also been included: the Motor Gasoline Market Model (MGMM), Distillate Market Model (DMM), and the Propane Market Model (PPMM). This directory contains descriptions about each model, including title, acronym, purpose, followed by more detailed information on characteristics, uses and requirements. Sources for additional information are identified. Included in this directory are 37 EIA models active as of February 1, 1994.

  16. Quantum information processing with graph states

    International Nuclear Information System (INIS)

    Schlingemann, Dirk-Michael

    2005-04-01

    Graph states are multiparticle states which are associated with graphs. Each vertex of the graph corresponds to a single system or particle. The links describe quantum correlations (entanglement) between pairs of connected particles. Graph states were initiated independently by two research groups: On the one hand, graph states were introduced by Briegel and Raussendorf as a resource for a new model of one-way quantum computing, where algorithms are implemented by a sequence of measurements at single particles. On the other hand, graph states were developed by the author of this thesis and Reinhard Werner in Braunschweig, as a tool to build quantum error correcting codes, called graph codes. The connection between the two approaches was fully realized in close cooperation of both research groups. This habilitation thesis provides a survey of the theory of graph codes, focussing mainly, but not exclusively, on the author's own research work. We present the theoretical and mathematical background for the analysis of graph codes. The concept of one-way quantum computing for general graph states is discussed. We explicitly show how to realize the encoding and decoding device of a graph code on a one-way quantum computer. This kind of implementation is to be seen as a mathematical description of a quantum memory device. In addition to that, we investigate interaction processes, which enable the creation of graph states on very large systems. Particular graph states can be created, for instance, by an Ising type interaction between next neighbor particles which sit at the points of an infinitely extended cubic lattice. Based on the theory of quantum cellular automata, we give a constructive characterization of general interactions which create a translationally invariant graph state. (orig.)

  17. New Product Development (Npd) Process In Subsidiary: Information Perspectives

    OpenAIRE

    Firmanzah

    2008-01-01

    Information is an important resource for the new product development (NPD) process in subsidiaries. However, there is still a lack of research analyzing the NPD process from an information perspective in the subsidiary context. This research is exploratory and draws on 8 cases of NPD processes in consumer goods subsidiaries operating in the Indonesian market. Three types of information have been identified and analyzed in the NPD process: global, regional and local information. The result of this research ...

  18. Influence Business Process On The Quality Of Accounting Information System

    OpenAIRE

    Meiryani; Muhammad Syaifullah

    2015-01-01

    Abstract The purpose of this study was to determine the influence of business processes on the quality of the accounting information system. The study was theoretical research which considered the role of business processes in the quality of the accounting information system, using secondary data collection. The results showed that business processes have a signifi...

  19. Information Systems to Support a Decision Process at Stanford.

    Science.gov (United States)

    Chaffee, Ellen Earle

    1982-01-01

    When a rational decision process is desired, information specialists can contribute information and also contribute to the process in which that information is used, thereby promoting rational decision-making. The contribution of Stanford's information specialists to rational decision-making is described. (MLW)

  20. Information dissemination model for social media with constant updates

    Science.gov (United States)

    Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui

    2018-07-01

    With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and updated information, and then propose the priority of related information. To more effectively evaluate the effectiveness of the proposed model, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.

  1. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available This article considers a number of methods for the mathematical modelling of economic processes and the possibilities of using Excel spreadsheets to obtain optimal solutions to problems or to perform calculations for financial operations with the help of the built-in functions.

  2. Cognitive Structures in Vocational Information Processing and Decision Making.

    Science.gov (United States)

    Nevill, Dorothy D.; And Others

    1986-01-01

    Tested the assumptions that the structural features of vocational schemas affect vocational information processing and career self-efficacy. Results indicated that effective vocational information processing was facilitated by well-integrated systems that processed information along fewer dimensions. The importance of schematic organization on the…

  3. Open Standards for Sensor Information Processing

    Energy Technology Data Exchange (ETDEWEB)

    Pouchard, Line Catherine [ORNL; Poole, Stephen W [ORNL; Lothian, Josh [ORNL

    2009-07-01

    This document explores sensor standards, sensor data models, and computer sensor software in order to determine the specifications and data representation best suited for analyzing and monitoring computer system health using embedded sensor data. We review IEEE 1451, OGC Sensor Model Language and Transducer Model Language (TML), lm-sensors and the Intelligent Platform Management Initiative (IPMI).

  4. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  5. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  6. Neuroscientific Model of Motivational Process

    Directory of Open Access Journals (Sweden)

    Sung-Il Kim

    2013-03-01

    Full Text Available Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  7. PUBLIC RELATIONS AS AN INFORMATION PROCESS PHENOMENON

    Directory of Open Access Journals (Sweden)

    TKACH L. M.

    2016-06-01

    Full Text Available Formulation of the problem. When public relations are examined as a phenomenon of information management, we deal with questions of the content of knowledge, the nature of the relationship of PR with its environment, the ability to manage people's perception of and attitude towards events in the environment, and ensuring the priority of information over other resources. Goal. To investigate the concept of "public relations" as seen by foreign and domestic experts; to consider the typology of the public and the "laws" of public opinion; to define the basic principles according to which relations with the public should be built; and to identify PR activities as a kind of social communication. Conclusions. Public relations, on the basis of advanced information and communication technologies, create fundamentally new opportunities for managing information and influencing public consciousness.

  8. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  9. Acceptance model of a Hospital Information System.

    Science.gov (United States)

    Handayani, P W; Hidayanto, A N; Pinem, A A; Hapsari, I C; Sandhyaduhita, P I; Budi, I

    2017-03-01

    The purpose of this study is to develop a model of Hospital Information System (HIS) user acceptance focusing on human, technological, and organizational characteristics for supporting government eHealth programs. This model was then tested to see which hospital type in Indonesia would benefit from the model to resolve problems related to HIS user acceptance. This study used qualitative and quantitative approaches with case studies at four privately owned hospitals and three government-owned hospitals, which are general hospitals in Indonesia. The respondents involved in this study are low-level and mid-level hospital management officers, doctors, nurses, and administrative staff who work at medical record, inpatient, outpatient, emergency, pharmacy, and information technology units. Data was processed using Structural Equation Modeling (SEM) and AMOS 21.0. The study concludes that non-technological factors, such as human characteristics (i.e. compatibility, information security expectancy, and self-efficacy), and organizational characteristics (i.e. management support, facilitating conditions, and user involvement) which have level of significance of p<0.05, significantly influenced users' opinions of both the ease of use and the benefits of the HIS. This study found that different factors may affect the acceptance of each user in each type of hospital regarding the use of HIS. Finally, this model is best suited for government-owned hospitals. Based on the results of this study, hospital management and IT developers should have more understanding on the non-technological factors to better plan for HIS implementation. Support from management is critical to the sustainability of HIS implementation to ensure HIS is easy to use and provides benefits to the users as well as hospitals. Finally, this study could assist hospital management and IT developers, as well as researchers, to understand the obstacles faced by hospitals in implementing HIS. Copyright © 2016

  10. COMPLEMENTARITY OF HISTORIC BUILDING INFORMATION MODELLING AND GEOGRAPHIC INFORMATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    X. Yang

    2016-06-01

    Full Text Available In this paper, we discuss the potential of integrating semantically rich models from Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build detailed 3D historic models. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time, and non-architectural information that are necessary for the construction and management of buildings. GIS has potential in handling and managing spatial data, especially exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create an enriched model according to its complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in BIM and GIS environments, how to build the enriched historic model, and why to construct different levels of detail? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.

  11. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  12. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for use in data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  13. User-guided discovery of declarative process models

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, van der W.M.P.; Chawla, N.; King, I.; Sperduti, A.

    2011-01-01

    Process mining techniques can be used to effectively discover process models from logs with example behaviour. Cross-correlating a discovered model with information in the log can be used to improve the underlying process. However, existing process discovery techniques have two important drawbacks.

  14. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2015-01-01

    The main objective of this tutorial article is first to review the main inference tools based on the Bayesian approach, entropy, information theory and their corresponding geometries. The review focuses mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the quantities involved, namely the Bayes rule, entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information, we study their use in different fields of data and signal processing: entropy in source separation, Fisher information in model order selection, various Maximum Entropy based methods in time series spectral estimation and, finally, general linear inverse problems.
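
    As an illustration of the quantities this tutorial reviews (not code from the article itself), the short Python sketch below computes the Shannon entropy of a discrete distribution and the Kullback-Leibler divergence between two distributions; the example distributions are arbitrary.

        import numpy as np

        def entropy(p):
            """Shannon entropy H(p) = -sum p_i log p_i (natural log), with 0 log 0 taken as 0."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        def kl_divergence(p, q):
            """Kullback-Leibler divergence (relative entropy) D(p||q) = sum p_i log(p_i / q_i)."""
            p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
            mask = p > 0
            return np.sum(p[mask] * np.log(p[mask] / q[mask]))

        # Arbitrary example: a biased three-outcome distribution against the uniform one
        p = [0.7, 0.2, 0.1]
        q = [1 / 3, 1 / 3, 1 / 3]
        print(entropy(p), kl_divergence(p, q))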

  15. On the fragmentation of process information : challenges, solutions, and outlook

    NARCIS (Netherlands)

    Aa, van der J.H.; Leopold, H.; Mannhardt, F.; Reijers, H.A.; Gaaloul, K.; Schmidt, R.; Nurcan, S.; Guerreiro, S.; Ma, Q.

    2015-01-01

    An organization’s knowledge of its business processes is a valuable corporate asset because it can be used to enhance the performance of these processes. In many organizations, documentation of process knowledge is scattered across various process information sources. Such information

  16. 40 CFR 68.65 - Process safety information.

    Science.gov (United States)

    2010-07-01

    40 CFR Part 68, Chemical Accident Prevention Provisions, Program 3 Prevention Program, § 68.65 Process safety information (2010-07-01 edition of Title 40, Protection of Environment): the rule requires a compilation of written process safety information to be completed before conducting any process hazard analysis.

  17. Development of technical information processing systems

    International Nuclear Information System (INIS)

    Lee, Ji Ho; Kim, Tae Whan; Kim, Sun Ja; Kim, Young Min; Choi, Kwang; Oh, Joung Hun; Choung, Hyun Suk; Keum, Jong Yong; Yoo, An Na; Harn, Deuck Haing; Choun, Young Chun

    1993-12-01

    The major goal of this project is to develop a more efficient information management system by connecting the KAERI serials database so that users can access it from their own laboratory facilities through KAERI-NET. The importance of this project lies in making the serials information of KAERI easily accessible to users as a valuable resource for R and D activities. The results of the project are as follows. 1) Development of the serials database and retrieval system enabled access to the serials holding information through KAERI-NET. 2) The database construction establishes a foundation for the management of 1,600 serials held in KAERI. 3) The system can be applied not only to KAERI but also to similar medium-level libraries. (Author)

  18. The Information Warfare Life Cycle Model

    Directory of Open Access Journals (Sweden)

    Brett van Niekerk

    2011-11-01

    Full Text Available Information warfare (IW) is a dynamic and developing concept which comprises a number of disciplines. This paper aims to develop a life cycle model for information warfare that is applicable to all of its constituent disciplines. The model aims to be scalable and applicable to civilian and military incidents where information warfare tactics are employed. Existing information warfare models are discussed, and a new model is developed from the common aspects of these existing models. The proposed model is then applied to a variety of incidents to test its applicability and scalability. The proposed model is shown to be applicable to multiple disciplines of information warfare and to be scalable, thus meeting the objectives of the model.

  20. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  1. Agent oriented modeling of business information systems

    OpenAIRE

    Vymetal, Dominik

    2009-01-01

    Enterprise modeling is an abstract definition of the processes running in an enterprise using process, value, data and resource models. There are two perspectives of business modeling: the process perspective and the value chain perspective. Both have advantages and disadvantages. This paper proposes a combination of both perspectives into one generic model. The model also takes the social part of the enterprise system into consideration and pays attention to disturbances influencing the enterprise system....

  2. Bayesian inference for Markov jump processes with informative observations.

    Science.gov (United States)

    Golightly, Andrew; Wilkinson, Darren J

    2015-04-01

    In this paper we consider the problem of parameter inference for Markov jump process (MJP) representations of stochastic kinetic models. Since transition probabilities are intractable for most processes of interest yet forward simulation is straightforward, Bayesian inference typically proceeds through computationally intensive methods such as (particle) MCMC. Such methods ostensibly require the ability to simulate trajectories from the conditioned jump process. When observations are highly informative, use of the forward simulator is likely to be inefficient and may even preclude an exact (simulation based) analysis. We therefore propose three methods for improving the efficiency of simulating conditioned jump processes. A conditioned hazard is derived based on an approximation to the jump process, and used to generate end-point conditioned trajectories for use inside an importance sampling algorithm. We also adapt a recently proposed sequential Monte Carlo scheme to our problem. Essentially, trajectories are reweighted at a set of intermediate time points, with more weight assigned to trajectories that are consistent with the next observation. We consider two implementations of this approach, based on two continuous approximations of the MJP. We compare these constructs for a simple tractable jump process before using them to perform inference for a Lotka-Volterra system. The best performing construct is used to infer the parameters governing a simple model of motility regulation in Bacillus subtilis.
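
    The following Python sketch is only a schematic illustration of the reweighting idea behind the sequential Monte Carlo variant described above, applied to a Lotka-Volterra jump process: trajectories are simulated forward with Gillespie's direct method and resampled according to their agreement with a hypothetical noisy prey observation. It is not the authors' conditioned-hazard construct, and the rate constants, observation model, and particle count are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def gillespie_lv(state, c, t_end):
            """Forward-simulate a Lotka-Volterra MJP (prey birth, predation, predator death)."""
            prey, pred = int(state[0]), int(state[1])
            t = 0.0
            while True:
                hazards = np.array([c[0] * prey, c[1] * prey * pred, c[2] * pred])
                total = hazards.sum()
                if total == 0.0:
                    return prey, pred
                t += rng.exponential(1.0 / total)
                if t > t_end:
                    return prey, pred
                reaction = rng.choice(3, p=hazards / total)
                if reaction == 0:
                    prey += 1          # prey birth
                elif reaction == 1:
                    prey -= 1          # predation
                    pred += 1
                else:
                    pred -= 1          # predator death

        def particle_step(particles, c, dt, y_obs, obs_sd):
            """Propagate particles over one inter-observation interval and resample them
            according to how consistent each trajectory is with the next noisy prey count."""
            propagated = np.array([gillespie_lv(p, c, dt) for p in particles])
            weights = np.exp(-0.5 * ((propagated[:, 0] - y_obs) / obs_sd) ** 2)
            weights /= weights.sum()
            keep = rng.choice(len(propagated), size=len(propagated), p=weights)
            return propagated[keep]

        # Illustrative run: 200 particles, one observation interval, assumed rate constants
        particles = np.tile([50, 100], (200, 1))
        particles = particle_step(particles, c=(0.5, 0.0025, 0.3), dt=1.0, y_obs=60, obs_sd=5.0)
        print(particles.mean(axis=0))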

  3. INFORMATIONAL MODEL OF MENTAL ROTATION OF FIGURES

    Directory of Open Access Journals (Sweden)

    V. A. Lyakhovetskiy

    2016-01-01

    Full Text Available Subject of Study. The subject of research is the information structure of objects' internal representations and the operations over them that are used by humans to solve the problem of mental rotation of figures. To analyze this informational structure we considered not only the classical dependencies of correct answers on the angle of rotation, but also other dependencies obtained recently in cognitive psychology. Method. The technical computing language Matlab R2010b was used to develop the information model of mental rotation of figures. Model parameters such as the number of bits in the internal representation, the error probability in a single bit, the discrete rotation angle, the comparison threshold, and the degree of difference during rotation can be changed. Main Results. The model qualitatively reproduces psychological dependencies such as the linear increase of the time of correct answers and of the number of errors with the angle of rotation for identical figures, and the "flat" dependence of the time of correct answers and the number of errors on the angle of rotation for mirror-like figures. The simulation results suggest that mental rotation is an iterative process of finding a match between the two figures, each step of which can lead to a significant distortion of the internal representation of the stored objects. Matching is carried out within internal representations that are not highly invariant to rotation angle. Practical Significance. The results may be useful for understanding the role of learning (including learning with a teacher) in the development of effective information representations and operations on them in artificial intelligence systems.
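
    A minimal Python sketch of the kind of iterative matching process the abstract describes is given below; it is not the authors' Matlab implementation, and the parameter values (number of bits, per-bit error probability, discrete rotation step, comparison threshold) are arbitrary placeholders for the model parameters listed above.

        import numpy as np

        rng = np.random.default_rng(0)

        def mental_rotation_trial(angle_deg, n_bits=64, step_deg=30, p_err=0.02, threshold=4):
            """Simulate one trial with identical figures: rotate the internal code in discrete
            steps, each of which may flip bits, then compare against a Hamming threshold."""
            target = rng.integers(0, 2, n_bits)          # internal representation of the target
            probe = target.copy()                        # identical figure presented at angle_deg
            steps = int(np.ceil(angle_deg / step_deg))   # rotation steps needed to align the codes
            for _ in range(steps):
                flips = rng.random(n_bits) < p_err       # each step can distort the stored code
                probe = np.where(flips, 1 - probe, probe)
            correct = np.sum(probe != target) <= threshold
            return steps, correct                        # steps ~ response time, correct ~ accuracy

        # Response time grows linearly with the angle, and errors accumulate with the angle
        for angle in (30, 90, 180):
            trials = [mental_rotation_trial(angle) for _ in range(1000)]
            print(angle, np.mean([t[0] for t in trials]), np.mean([t[1] for t in trials]))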

  4. Internet-based intelligent information processing systems

    CERN Document Server

    Tonfoni, G; Ichalkaranje, N S

    2003-01-01

    The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap

  5. Spinal Cord Injury Model System Information Network

    Science.gov (United States)

    The University of Alabama at Birmingham Spinal Cord Injury Model System (UAB-SCIMS) maintains this Information Network as a resource to promote knowledge in the ...

  6. Processing data base information having nonwhite noise

    Science.gov (United States)

    Gross, Kenneth C.; Morreale, Patricia

    1995-01-01

    A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can include processing data from either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
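
    The Python sketch below is a schematic rendering of the processing chain described in this abstract: form the difference between two series for the same variable, keep the dominant Fourier modes as the composite (serially correlated) part, and apply a probability ratio test to the whitened residual. The number of retained modes, the test thresholds, and the decision labels are illustrative assumptions rather than the patented procedure.

        import numpy as np

        def whiten_and_test(signal, reference, n_modes=5, shift=0.5, alpha=0.01, beta=0.01):
            """Difference two series, remove the dominant Fourier modes (composite function),
            and run a sequential probability ratio test on the whitened residual."""
            diff = np.asarray(signal, float) - np.asarray(reference, float)   # difference function
            spectrum = np.fft.rfft(diff)
            minor = np.argsort(np.abs(spectrum))[:-n_modes]    # indices of all but the largest modes
            spectrum[minor] = 0.0
            composite = np.fft.irfft(spectrum, n=len(diff))    # serially correlated part
            residual = diff - composite                        # approximately white if data are valid
            sigma = residual.std() + 1e-12
            mean_shift = shift * sigma                         # H1: residual mean shifted by this amount

            # Log-likelihood ratio for N(0, sigma^2) vs N(mean_shift, sigma^2), accumulated over time
            llr = np.cumsum((mean_shift * residual - 0.5 * mean_shift**2) / sigma**2)
            lower, upper = np.log(beta / (1 - alpha)), np.log((1 - beta) / alpha)
            hit_upper = np.argmax(llr > upper) if np.any(llr > upper) else len(llr)
            hit_lower = np.argmax(llr < lower) if np.any(llr < lower) else len(llr)
            if hit_upper == hit_lower == len(llr):
                return "undecided"
            return "degraded" if hit_upper < hit_lower else "validated"

        # Illustrative usage with two synthetic sensor readings of the same variable
        t = np.linspace(0.0, 10.0, 1000)
        rng = np.random.default_rng(7)
        sensor = np.sin(t) + 0.1 * rng.normal(size=1000)
        reference = np.sin(t) + 0.1 * rng.normal(size=1000)
        print(whiten_and_test(sensor, reference))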

  7. Information Processing Approaches to Cognitive Development

    Science.gov (United States)

    1988-07-01

    Craik . F.I.M., & Lockhart , R.S. (1972). Levels of processing : A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11...task at both levels of performance, then one would, in both cases, postulate systems that had the ability to process symbols at the microscopic level ...821760s and early 70s. (cf. Atkinson & Shiffrin. 1968: Craik & Lockhart . 1972: Norman, Rumelhart, & LNR, 1975). This architecture is comprised of several

  8. Numerical support, information processing and attitude change

    OpenAIRE

    de Dreu, C.K.W.; de Vries, N.K.

    1993-01-01

    In two experiments we studied the prediction that majority support induces stronger convergent processing than minority support for a persuasive message, the more so when recipients are explicitly forced to pay attention to the source's point of view; this in turn affects the amount of attitude change on related issues. Convergent processing is the systematic elaboration on the source's position, but with a stronger focus on verification and justification rather than falsification. In Exp 1 wi...

  9. Visual Motion Perception and Visual Information Processing

    Science.gov (United States)

    1993-12-31

    Indexed text fragments only (no abstract available); the report discusses what is traditionally called the "span of apprehension" (Külpe, 1904; Wundt, 1899) and the partial-report procedure, and cites Gehrig, P. (1992), On the time course of perceptual information that results from a ..., and Wundt, W. (1899), Zur Kritik tachistoskopischer Versuche [A critique of tachistoscopic experiments].

  10. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

    Modelling AOPs implies considering all the steps included in the process, that is, the mass transfer, kinetic (reaction) and radiation absorption steps. Accordingly, recent works develop models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing which step controls the overall process. In this paper, a simple method is explained which allows the controlling step to be determined. The reactor is assumed to be divided into two hypothetical zones (dark and illuminated), and according to the experimental results, obtained by varying only the reaction volume, it can be decided whether reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, using Degussa P-25 titania as catalyst, is studied as a model reaction. The preliminary results obtained are presented here, suggesting that, in this case, reaction occurs only in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)

  11. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the amorphous aggregation data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  12. Theory Creation, Modification, and Testing: An Information-Processing Model and Theory of the Anticipated and Unanticipated Consequences of Research and Development

    Science.gov (United States)

    Perla, Rocco J.; Carifio, James

    2011-01-01

    Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory, and taxonomy outlined here incorporate and formalize both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…

  13. Understanding the Information Research Process of Experienced Online Information Researchers to Inform Development of a Scholars Portal

    Directory of Open Access Journals (Sweden)

    Martha Whitehead

    2009-06-01

    Full Text Available Objective - The main purpose of this study was to understand the information research process of experienced online information researchers in a variety of disciplines, gather their ideas for improvement and, as part of this, to validate a proposed research framework for use in future development of Ontario’s Scholars Portal. Methods - This was a qualitative research study in which sixty experienced online information researchers participated in face-to-face workshops that included a collaborative design component. The sessions were conducted and recorded by usability specialists who subsequently analyzed the data and identified patterns and themes. Results - Key themes included the similarities of the information research process across all disciplines, the impact of interdisciplinarity, the social aspect of research and opportunities for process improvement. There were many specific observations regarding current and ideal processes. Implications for portal development and further research included: supporting a common process while accommodating user-defined differences; supporting citation chaining practices with new opportunities for data linkage and granularity; enhancing keyword searching with various types of intervention; exploring trusted social networks; exploring new mental models for data manipulation while retaining traditional objects; improving citation and document management. Conclusion - The majority of researchers in the study had almost no routine in their information research processes, had developed few techniques to assist themselves and had very little awareness of the tools available to help them. There are many opportunities to aid researchers in the research process that can be explored when developing scholarly research portals. That development will be well guided by the framework ‘discover, gather, synthesize, create, share.’

  14. Repairing business process models as retrieved from source code

    NARCIS (Netherlands)

    Fernández-Ropero, M.; Reijers, H.A.; Pérez-Castillo, R.; Piattini, M.; Nurcan, S.; Proper, H.A.; Soffer, P.; Krogstie, J.; Schmidt, R.; Halpin, T.; Bider, I.

    2013-01-01

    The static analysis of source code has become a feasible solution to obtain underlying business process models from existing information systems. Due to the fact that not all information can be automatically derived from source code (e.g., consider manual activities), such business process models

  15. Can the Analytical Hierarchy Process Model Be Effectively Applied in the Prioritization of Information Assurance Defense In-Depth Measures? --A Quantitative Study

    Science.gov (United States)

    Alexander, Rodney T.

    2017-01-01

    Organizational computing devices are increasingly becoming targets of cyber-attacks, and organizations have become dependent on the safety and security of their computer networks and their organizational computing devices. Business and government often use defense in-depth information assurance measures such as firewalls, intrusion detection…
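
    Since the study builds on the Analytical Hierarchy Process, the following Python sketch shows the standard AHP calculation of priority weights and a consistency ratio from a pairwise comparison matrix; the three example measures and their comparison values are hypothetical and are not taken from the study.

        import numpy as np

        def ahp_priorities(pairwise):
            """Priority weights (principal eigenvector) and consistency ratio for an AHP matrix."""
            A = np.asarray(pairwise, dtype=float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)                      # principal eigenvalue of a positive matrix
            weights = np.abs(eigvecs[:, k].real)
            weights /= weights.sum()                         # normalised priority vector
            random_index = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
            consistency_index = (eigvals[k].real - n) / (n - 1)
            consistency_ratio = consistency_index / random_index if random_index else 0.0
            return weights, consistency_ratio                # CR below ~0.10 is conventionally acceptable

        # Hypothetical pairwise comparisons (Saaty 1-9 scale) of three defence-in-depth measures:
        # firewalls vs intrusion detection vs user training
        A = [[1.0, 3.0, 5.0],
             [1/3, 1.0, 2.0],
             [1/5, 1/2, 1.0]]
        weights, cr = ahp_priorities(A)
        print(weights.round(3), round(cr, 3))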

  16. Information-based models for finance and insurance

    Science.gov (United States)

    Hoyle, Edward

    2010-10-01

    In financial markets, the information that traders have about an asset is reflected in its price. The arrival of new information then leads to price changes. The `information-based framework' of Brody, Hughston and Macrina (BHM) isolates the emergence of information, and examines its role as a driver of price dynamics. This approach has led to the development of new models that capture a broad range of price behaviour. This thesis extends the work of BHM by introducing a wider class of processes for the generation of the market filtration. In the BHM framework, each asset is associated with a collection of random cash flows. The asset price is the sum of the discounted expectations of the cash flows. Expectations are taken with respect to (i) an appropriate measure, and (ii) the filtration generated by a set of so-called information processes that carry noisy or imperfect market information about the cash flows. To model the flow of information, we introduce a class of processes termed Lévy random bridges (LRBs), generalising the Brownian and gamma information processes of BHM. Conditioned on its terminal value, an LRB is identical in law to a Lévy bridge. We consider in detail the case where the asset generates a single cash flow X_T at a fixed date T. The flow of information about X_T is modelled by an LRB with random terminal value X_T. An explicit expression for the price process is found by working out the discounted conditional expectation of X_T with respect to the natural filtration of the LRB. New models are constructed using information processes related to the Poisson process, the Cauchy process, the stable-1/2 subordinator, the variance-gamma process, and the normal inverse-Gaussian process. These are applied to the valuation of credit-risky bonds, vanilla and exotic options, and non-life insurance liabilities.
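
    As a concrete illustration of the single-cash-flow case described above (the Brownian special case of the BHM framework, not the Lévy random bridge extension developed in the thesis), the Python sketch below simulates an information process for a binary cash flow X_T in {0, 1} and computes the resulting price path as the discounted conditional expectation; the prior probability, information flow rate sigma, and discount rate are illustrative.

        import numpy as np

        rng = np.random.default_rng(42)

        def bhm_binary_price_path(T=1.0, sigma=1.0, p1=0.8, r=0.02, n_steps=250):
            """Price of a cash flow X_T in {0, 1} driven by the Brownian information process
            xi_t = sigma * t * X_T + beta_tT, where beta_tT is a Brownian bridge on [0, T]."""
            dt = T / n_steps
            t = np.linspace(0.0, T, n_steps + 1)
            x_outcome = float(rng.random() < p1)                  # the cash flow the market learns about
            w = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_steps))])
            bridge = w - (t / T) * w[-1]                          # standard Brownian bridge on [0, T]
            xi = sigma * t * x_outcome + bridge                   # market information process

            s = t[:-1]                                            # exclude t = T (the bridge pins to zero there)
            u = T / (T - s)
            # Conditional probability P(X_T = 1 | xi_s) from the Brownian-bridge filtering formula
            log_odds = np.log(p1 / (1.0 - p1)) + u * (sigma * xi[:-1] - 0.5 * sigma**2 * s)
            prob_one = 1.0 / (1.0 + np.exp(-log_odds))
            price = np.exp(-r * (T - s)) * prob_one               # discounted conditional expectation
            return s, price, bool(x_outcome)

        # Illustrative run: the price starts near the discounted prior and converges to the outcome
        s, price, outcome = bhm_binary_price_path()
        print(outcome, round(price[0], 4), round(price[-1], 4))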

  17. Symposium on Information Processing in Organizations.

    Science.gov (United States)

    1982-04-01

    Indexed text fragments only (no abstract available); the references include Bourdieu, Pierre (1977), Outline of a Theory of Practice (Richard Nice, trans., Cambridge: Cambridge University Press), and the fragments describe a hospital that posed a unique evacuation problem: when a fire occurs in a hospital, information is typically communicated to doctors, nurses, and other staff in a form that bore no relation whatsoever to the emergencies announced and that differed from institution to institution.

  18. Motivated information processing, strategic choice, and the quality of negotiated agreement

    NARCIS (Netherlands)

    De Dreu, Carsten K W; Beersma, Bianca; Stroebe, Katherine; Euwema, Martin C.

    The authors tested a motivated information-processing model of negotiation: To reach high joint outcomes, negotiators need a deep understanding of the task, which requires them to exchange information and to process new information systematically. All this depends on social motivation, epistemic

  20. An Information Processing Perspective on Divergence and Convergence in Collaborative Learning

    Science.gov (United States)

    Jorczak, Robert L.

    2011-01-01

    This paper presents a model of collaborative learning that takes an information processing perspective of learning by social interaction. The collaborative information processing model provides a theoretical basis for understanding learning principles associated with social interaction and explains why peer-to-peer discussion is potentially more…