WorldWideScience

Sample records for information processing model

  1. TUNS/TCIS information model/process model

    Science.gov (United States)

    Wilson, James

    1992-01-01

An Information Model comprises graphical and textual notation suitable for describing and defining the problem domain - in our case, TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provides an easy-to-understand methodology for expressing the entities in the problem space, the relationships between entities, and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.

  2. A Process Approach to Information Services: The Information Search Process (ISP) Model

    Directory of Open Access Journals (Sweden)

    Hamid Keshavarz

    2010-12-01

    Full Text Available Information seeking is a behavior emerging out of the interaction between the information seeker and the information system; it should be regarded as an episodic process so as to meet users' information needs and to accommodate the different roles they take at its different stages. The present article introduces a process approach to information services in libraries using Carol Collier Kuhlthau's model. In this model, information seeking is regarded as a process consisting of six stages, in each of which users have different thoughts, feelings and actions, and librarians correspondingly take different roles at each stage. These six stages are derived from constructivist learning theory and the uncertainty principle. Despite some shortcomings, this model may be regarded as a new solution for rendering modern information services in libraries, especially in relation to new information environments and media.

  3. An information theory-based approach to modeling the information processing of NPP operators

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute, Taejon (Korea, Republic of)]

    2002-08-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e. the problems of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory.
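The information-theoretic quantities such an approach rests on can be illustrated with a small sketch. This is not Conant's multi-stage decomposition itself (the paper's specifics are not given here); it computes Shannon entropy and mutual information for a hypothetical single-stage operator task, the basic building blocks of any such model. All names and numbers below are illustrative assumptions.

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Hypothetical stage: plant indication X -> operator action Y.
# A noiseless stage would transmit all H(X) bits; noise reduces I(X;Y).
joint = {("high", "trip"): 0.45, ("high", "hold"): 0.05,
         ("low", "trip"): 0.10, ("low", "hold"): 0.40}
hx = entropy([0.5, 0.5])         # H(X): indication is high/low equally often
ixy = mutual_information(joint)  # information actually transmitted
print(f"H(X) = {hx:.3f} bits, I(X;Y) = {ixy:.3f} bits")
```

In a multi-stage model, each processing stage gets its own transmitted-information term of this form, and the stage quantities are summed or compared to locate where information is lost.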

  4. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-01

    This paper proposes a computational model for memory from the view of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to express the neuron or brain cortices based on biology and graph theories, and we develop an intra-modular network with the modeling algorithm by mapping nodes and edges; then the bi-modular network is delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from the information processing view.

  5. Technology for Developing Information Models of Unstructured Processes

    CERN Document Server

    Samojlov, V N

    2000-01-01

    In the paper, a multi-level algorithm for forming information models of unstructured processes is proposed. Basic components of the information model are classified in the form of structure-functional constituents and structure-functional types of a developing composite system. Systematic requirements and criteria for constructing the knowledge base and the data bank of the information model are formulated with the use of the proposed algorithm. As examples, results are shown for applying the systematic analysis to matching computational techniques and mathematical simulation in research studies of the structure-functional type "input-process-output", and for constructing the structure-functional model of the knowledge base for one of the adopted technological processes.

  6. Information Search Process Model: How Freshmen Begin Research.

    Science.gov (United States)

    Swain, Deborah E.

    1996-01-01

    Investigates Kuhlthau's Search Process Model for information seeking using two Freshmen English classes. Data showed that students followed the six stages Kuhlthau proposed and suggest extensions to the model, including changing the order of the tasks, iterating and combining steps, and revising search goals based on social and interpersonal…

  7. Model choice considerations and information integration using analytical hierarchy process

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory]; Hemez, Francois M. [Los Alamos National Laboratory]; Booker, Jane M. [Booker Scientific]; Ross, Timothy J. [UNM]

    2010-10-15

    Using the theory of information-gap for decision-making under severe uncertainty, it has been shown that model output compared to experimental data contains irrevocable trade-offs between fidelity-to-data, robustness-to-uncertainty and confidence-in-prediction. We illustrate a strategy for information integration by gathering and aggregating all available data, knowledge, theory, experience, and similar applications. Such integration of information becomes important when the physics is difficult to model, when observational data are sparse or difficult to measure, or both. To aggregate the available information, we take an inference perspective. Models are not rejected or wasted, but can be integrated into a final result. We show an example of information integration using Saaty's Analytic Hierarchy Process (AHP), integrating theory, simulation output and experimental data. We used expert elicitation to determine weights for two models and two experimental data sets, by forming pair-wise comparisons between model output and experimental data. In this way we transfer epistemic and/or statistical strength from one field of study to another branch of physical application. The price to pay for utilizing all available knowledge is that the inferences drawn from the integrated information must be accounted for, and the costs can be considerable. Focusing on inferences and inference uncertainty (IU) is one way to understand complex information.
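The AHP weighting step described above can be sketched generically. This is not the authors' actual elicitation: the pairwise comparison values and source names are hypothetical, and the row geometric mean is used as a standard approximation to Saaty's principal-eigenvector priorities.

```python
from math import prod

def ahp_weights(matrix):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    using the row geometric mean (a common approximation to the
    principal-eigenvector method)."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 1-9 Saaty-scale comparisons among four information sources.
comparisons = [
    [1,   3,   1/2, 2],    # model A
    [1/3, 1,   1/4, 1/2],  # model B
    [2,   4,   1,   3],    # experiment 1
    [1/2, 2,   1/3, 1],    # experiment 2
]
weights = ahp_weights(comparisons)
for name, w in zip(["model A", "model B", "exp 1", "exp 2"], weights):
    print(f"{name}: {w:.3f}")
```

Each entry states how strongly one source is preferred over another; the resulting weights then scale each source's contribution in the integrated result.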

  8. An information theoretic approach for combining neural network process models.

    Science.gov (United States)

    Sridhar, D. V.; Bartlett, E. B.; Seagrave, R. C.

    1999-07-01

    Typically, neural network modelers in chemical engineering focus on identifying and using a single, hopefully optimal, neural network model. Using a single optimal model implicitly assumes that one neural network model can extract all the information available in a given data set and that the other candidate models are redundant. In general, there is no assurance that any individual model has extracted all relevant information from the data set. Recently, Wolpert (Neural Networks, 5(2), 241 (1992)) proposed the idea of stacked generalization to combine multiple models. Sridhar, Seagrave and Bartlett (AIChE J., 42, 2529 (1996)) implemented stacked generalization for neural network models by integrating multiple neural networks into an architecture known as stacked neural networks (SNNs). SNNs consist of a combination of the candidate neural networks and were shown to provide improved modeling of chemical processes. However, in Sridhar's work SNNs were limited to using a linear combination of artificial neural networks. While a linear combination is simple and easy to use, it can utilize only those model outputs that have a high linear correlation to the output. Models that are useful in a nonlinear sense are wasted if a linear combination is used. In this work we propose an information theoretic stacking (ITS) algorithm for combining neural network models. The ITS algorithm identifies and combines useful models regardless of the nature of their relationship to the actual output. The power of the ITS algorithm is demonstrated through three examples, including application to a dynamic process modeling problem. The results obtained demonstrate that SNNs developed using the ITS algorithm can achieve substantially improved performance compared to selecting and using a single, hopefully optimal network or using SNNs based on a linear combination of neural networks.
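The linear stacking the abstract contrasts with ITS can be sketched directly (the ITS algorithm itself is not reproduced here). The base-model predictions below are invented, and the combiner weights are fit by ordinary least squares via the 2x2 normal equations.

```python
def stack_two_models(p1, p2, y):
    """Least-squares weights (w1, w2) so that w1*p1 + w2*p2
    approximates the target y. Solves the 2x2 normal equations."""
    a11 = sum(a * a for a in p1)
    a22 = sum(b * b for b in p2)
    a12 = sum(a * b for a, b in zip(p1, p2))
    b1 = sum(a * t for a, t in zip(p1, y))
    b2 = sum(b * t for b, t in zip(p2, y))
    det = a11 * a22 - a12 * a12
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det)

# Hypothetical held-out predictions from two imperfect base models.
truth = [1.0, 2.0, 3.0, 4.0, 5.0]
model1 = [1.1, 2.2, 2.9, 4.1, 5.2]   # slight over-prediction
model2 = [0.8, 1.9, 3.1, 3.8, 4.9]   # slight under-prediction
w1, w2 = stack_two_models(model1, model2, truth)
stacked = [w1 * a + w2 * b for a, b in zip(model1, model2)]
print(f"w1 = {w1:.3f}, w2 = {w2:.3f}")
```

Because the least-squares search includes the weight pairs (1, 0) and (0, 1), the stacked fit can never be worse than the better single model on the fitting data; but, as the abstract notes, such a combiner only exploits outputs that are linearly correlated with the target.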

  9. Modeling Units of Assessment for Sharing Assessment Process Information: towards an Assessment Process Specification

    NARCIS (Netherlands)

    Miao, Yongwu; Sloep, Peter; Koper, Rob

    2009-01-01

    Miao, Y., Sloep, P. B., & Koper, R. (2008). Modeling Units of Assessment for Sharing Assessment Process Information: towards an Assessment Process Specification. Presentation at the ICWL 2008 conference. August, 20, 2008, Jinhua, China.

  10. Polymodal information processing via temporal cortex Area 37 modeling

    Science.gov (United States)

    Peterson, James K.

    2004-04-01

    A model of biological information processing is presented that consists of auditory and visual subsystems linked to temporal cortex and limbic processing. A biologically based algorithm is presented for fusing information sources of fundamentally different modalities. Proof of this concept is outlined by a system which combines auditory input (musical sequences) and visual input (illustrations such as paintings) via a model of cortical processing for Area 37 of the temporal cortex. The training data can be used to construct both a connectionist model, whose biological relevance is suspect yet which is still useful, and a biologically based model which achieves the same input-to-output map through biologically relevant means. The constructed models are able to create, from a set of auditory and visual clues, a combined musical/illustration output which shares many of the properties of the original training data. These algorithms are not dependent on these particular auditory/visual modalities and hence are of general use in the intelligent computation of outputs that require sensor fusion.

  11. Advanced modeling of management processes in information technology

    CERN Document Server

    Kowalczuk, Zdzislaw

    2014-01-01

    This book deals with the issues of modelling management processes of information technology and IT projects. Its core is the model of information technology management and its component models (contextual, local) describing initial processing and the maturity capsule, as well as a decision-making system represented by a multi-level sequential model of IT technology selection, which acquires a fuzzy rule-based implementation in this work. In terms of applicability, this work may also be useful for diagnosing the applicability of IT standards in the evaluation of IT organizations. The results of this diagnosis might prove valid for those preparing new standards so that - apart from their own visions - they could, to an even greater extent, take into account the capabilities and needs of the leaders of project and manufacturing teams. The book is intended for IT professionals using the ITIL, COBIT and TOGAF standards in their work. Students of computer science and management who are interested in the issue of IT...

  12. Metadata and their impact on processes in Building Information Modeling

    Directory of Open Access Journals (Sweden)

    Vladimir Nyvlt

    2014-04-01

    Full Text Available Building Information Modeling (BIM) itself contains huge potential for increasing the effectiveness of every project across its whole life cycle - from the initial investment plan, through design and construction activities, to long-term usage and property maintenance, and finally demolition. Knowledge Management, or better, Knowledge Sharing, covers two sets of tools: managerial and technological. Managers' needs are the real expectations and desires of final users in terms of how they could benefit from managing long-term projects covering the whole life cycle, in terms of saving investment money and other resources. The technology employed can help BIM processes support and deliver these benefits to users. We discuss how to use this technology for data and metadata collection, storage and sharing, and which processes these new technologies may support. We also touch on an optimized process proposal for better and smoother support of knowledge sharing within the project time-scale and across its whole life cycle.

  13. Approaching the Affective Factors of Information Seeking: The Viewpoint of the Information Search Process Model

    Science.gov (United States)

    Savolainen, Reijo

    2015-01-01

    Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows-up a previous interview study…

  14. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as the basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
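A minimal place/transition Petri net (the classical base that XML nets extend with structured, XML-typed tokens) can be sketched in a few lines. The order-handling process and all of its names are hypothetical.

```python
class PetriNet:
    """Minimal place/transition net: a transition is enabled when every
    input place holds at least one token; firing moves one token from
    each input place to each output place."""
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical order-handling fragment of a POIS business process.
net = PetriNet({"order received": 1, "order checked": 0, "order shipped": 0})
net.add_transition("check", ["order received"], ["order checked"])
net.add_transition("ship", ["order checked"], ["order shipped"])
net.fire("check")
net.fire("ship")
print(net.marking)   # the token has moved to 'order shipped'
```

The firing rule is what gives Petri-net process models their precise execution semantics; XML nets additionally type the tokens and attach conditions on their content.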

  15. Modeling Units of Assessment for Sharing Assessment Process Information: towards an Assessment Process Specification

    NARCIS (Netherlands)

    Miao, Yongwu; Sloep, Peter; Koper, Rob

    2008-01-01

    Miao, Y., Sloep, P. B., & Koper, R. (2008). Modeling Units of Assessment for Sharing Assessment Process Information: towards an Assessment Process Specification. In F. W. B. Li, J. Zhao, T. K. Shih, R. W. H. Lau, Q. Li & D. McLeod (Eds.), Advances in Web Based Learning - Proceedings of the 7th

  16. Temporal Expectation and Information Processing: A Model-Based Analysis

    Science.gov (United States)

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  17. Computational spectrotemporal auditory model with applications to acoustical information processing

    Science.gov (United States)

    Chi, Tai-Shih

    A computational spectrotemporal auditory model based on neurophysiological findings in the early auditory and cortical stages is described. The model provides a unified multiresolution representation of the spectral and temporal features of sound likely critical in the perception of timbre. Several types of complex stimuli are used to demonstrate the spectrotemporal information preserved by the model. As shown by these examples, this two-stage model reflects the apparent progressive loss of temporal dynamics along the auditory pathway, from rapid phase-locking (several kHz in the auditory nerve), to moderate rates of synchrony (several hundred Hz in the midbrain), to much lower rates of modulation in the cortex (around 30 Hz). To complete this model, several projection-based reconstruction algorithms are implemented to resynthesize the sound from the representations with reduced dynamics. One particular application of this model is to assess speech intelligibility. The spectro-temporal Modulation Transfer Functions (MTF) of this model are investigated and shown to be consistent with the salient trends in the human MTFs (derived from human detection thresholds), which exhibit a lowpass function with respect to both spectral and temporal dimensions, with 50% bandwidths of about 16 Hz and 2 cycles/octave. Therefore, the model is used to demonstrate the potential relevance of these MTFs to the assessment of speech intelligibility in noise and reverberant conditions. Another useful feature is the phase singularity that emerges in the scale space generated by this multiscale auditory model. The singularity is shown to have certain robust properties and to carry crucial information about the spectral profile. Such a claim is justified by perceptually tolerable resynthesized sounds from the nonconvex singularity set. In addition, the singularity set is demonstrated to encode the pitch and formants at different scales. These properties make the singularity set very suitable for traditional

  18. Constructing topic models of Internet of Things for information processing.

    Science.gov (United States)

    Xin, Jie; Cui, Zhiming; Zhang, Shukui; He, Tianxu; Li, Chunhua; Huang, Haojing

    2014-01-01

    Internet of Things (IoT) is regarded as a remarkable development of modern information technology. There is abundant digital product data on the IoT, linked with multiple types of objects/entities. Those associated entities carry rich information, usually in the form of query records. Therefore, constructing high-quality topic hierarchies that can capture the term distribution of each product record enables us to better understand users' search intent and benefits tasks such as taxonomy construction, recommendation systems, and other communications solutions for the future IoT. In this paper, we propose a novel record entity topic model (RETM) for the IoT environment that is associated with a set of entities and records, and a Gibbs sampling-based algorithm is proposed to learn the model. We conduct extensive experiments on real-world datasets and compare our approach with existing methods to demonstrate the advantage of our approach.
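RETM itself is specified in the paper; as a hedged stand-in, the sketch below implements collapsed Gibbs sampling for plain LDA, the model family that record/entity-aware variants such as RETM extend. The corpus, hyperparameters, and sizes are toy assumptions.

```python
import random

def lda_gibbs(docs, n_topics, vocab_size, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for plain LDA. docs: list of lists of
    word ids. Returns topic assignments and count tables."""
    rng = random.Random(seed)
    z = [[rng.randrange(n_topics) for _ in doc] for doc in docs]
    ndk = [[0] * n_topics for _ in docs]                # doc-topic counts
    nkw = [[0] * vocab_size for _ in range(n_topics)]   # topic-word counts
    nk = [0] * n_topics                                 # topic totals
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # full conditional p(z = k | all other assignments)
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta)
                           / (nk[t] + vocab_size * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return z, ndk, nkw

# Two hypothetical "product record" groups with disjoint vocabularies.
docs = [[0, 1, 0, 1, 2]] * 3 + [[3, 4, 3, 4, 5]] * 3
z, ndk, nkw = lda_gibbs(docs, n_topics=2, vocab_size=6)
print(ndk)
```

RETM augments this kind of sampler with per-record entity variables, so each draw also conditions on which entities a record is linked to.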

  19. Process and building information modelling in the construction industry by using information delivery manuals and model view definitions

    DEFF Research Database (Denmark)

    Karlshøj, Jan

    2012-01-01

    The construction industry is gradually increasing its use of structured information and building information modelling. To date, the industry has suffered from the disadvantages of a project-based organizational structure and ad hoc solutions. Furthermore, it is not accustomed to formalizing the flow of information and specifying exactly which objects and properties are needed for each process and which information is produced by the processes. The present study is based on reviewing the existing methodology of Information Delivery Manuals (IDM) from buildingSMART, which is also an ISO standard (ISO 29481 Part 1), and the Model View Definition (MVD) methodology developed by buildingSMART and BLIS. The research also includes a review of concrete IDM development projects that have been developed over the last five years. Although the study has identified interest in the IDM methodology in a number...

  20. Testing the Causal Mediation Component of Dodge's Social Information Processing Model of Social Competence and Depression

    Science.gov (United States)

    Possel, Patrick; Seemann, Simone; Ahrens, Stefanie; Hautzinger, Martin

    2006-01-01

    In Dodge's model of "social information processing" depression is the result of a linear sequence of five stages of information processing ("Annu Rev Psychol" 44: 559-584, 1993). These stages follow a person's reaction to situational stimuli, such that each stage of information processing mediates the relationship between earlier and later stages.…

  1. Kuhlthau's Information Search Process.

    Science.gov (United States)

    Shannon, Donna

    2002-01-01

    Explains Kuhlthau's Information Search Process (ISP) model which is based on a constructivist view of learning and provides a framework for school library media specialists for the design of information services and instruction. Highlights include a shift from library skills to information skills; attitudes; process approach; and an interview with…

  2. The importance of information goods abstraction levels for information commerce process models

    NARCIS (Netherlands)

    Wijnhoven, Fons

    2002-01-01

    A process model, in the context of e-commerce, is an organized set of activities for the creation, (re-)production, trade and delivery of goods. Electronic commerce studies have created important process models for the trade of physical goods via Internet. These models are not easily suitable for th

  3. Information transfer model of natural processes: from the ideal gas law to the distance dependent redshift

    CERN Document Server

    Fielitz, P

    2009-01-01

    We consider a general information transfer model which comprises any natural process which is able to transfer information and which can be characterised by only two independent process variables. We further postulate that these independent process variables serve as source and destination of information during a natural process. To define information which is directly related to the process variables, we apply the definition of information originally formulated by Hartley. We demonstrate that the proposed information transfer model yields well known laws, which, as yet, have not been directly related to information theory, such as the ideal gas law, the radioactive decay law, the formation law of vacancies in single crystals, and Fick's first law. Further, for the propagation of photons from a point source the information transfer model shows that any detector device, if at rest relative to the point source, will measure a redshift relative to the photon wavelength which is emitted from the point source. Tha...
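Hartley's definition of information, on which the model builds, is simply H = log2(N) for a variable with N equally likely states. Its additivity over independent variables is the property that lets counts of states multiply while information adds, and it can be shown in a few lines:

```python
from math import log2

def hartley(n_states):
    """Hartley information (in bits) of a variable with
    n_states equally likely states."""
    return log2(n_states)

# Additivity: two independent variables with N and M states jointly
# have N*M states, and their Hartley information adds.
h_n = hartley(8)        # source variable with 8 states
h_m = hartley(4)        # destination variable with 4 states
h_joint = hartley(8 * 4)
print(h_n, h_m, h_joint)   # 3.0 2.0 5.0
```

In the paper's framework, the two independent process variables play the roles of information source and destination, and laws such as the ideal gas law emerge from balancing such state counts.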

  4. Modeling market information processing in new product development: An empirical analysis

    NARCIS (Netherlands)

    Veldhuizen, A; Hultink E.J; Griffin, A.

    2006-01-01

    This research explores the antecedents and consequences of market information processing during the development process of new high-tech products. To this end, we develop and test a conceptual model for market information processing in three generic stages of the new product development (NPD) proces

  5. Information Processing and Collective Behavior in a Model Neuronal System

    Science.gov (United States)

    2014-03-28

    We simulate the classical Hodgkin-Huxley (HH) model, with stochastic channel dynamics (potassium and sodium channels) and deterministic stimuli.
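A deterministic version of the Hodgkin-Huxley simulation mentioned above can be sketched as follows. The report's stochastic channel dynamics are omitted; the parameters are the standard squid-axon values, the constant 10 uA/cm^2 stimulus is an illustrative choice, and forward Euler is used for simplicity.

```python
from math import exp

def hh_simulate(i_ext=10.0, t_ms=50.0, dt=0.01):
    """Deterministic Hodgkin-Huxley membrane under constant current
    i_ext (uA/cm^2), integrated with forward Euler.
    Returns the voltage trace in mV."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3      # conductances, mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.4    # reversal potentials, mV
    c_m = 1.0                              # membrane capacitance, uF/cm^2
    v, m, h, n = -65.0, 0.053, 0.596, 0.317  # resting steady state
    trace = []
    for _ in range(int(t_ms / dt)):
        # standard rate functions (1/ms) for gates m, h, n
        a_m = 0.1 * (v + 40) / (1 - exp(-(v + 40) / 10))
        b_m = 4.0 * exp(-(v + 65) / 18)
        a_h = 0.07 * exp(-(v + 65) / 20)
        b_h = 1.0 / (1 + exp(-(v + 35) / 10))
        a_n = 0.01 * (v + 55) / (1 - exp(-(v + 55) / 10))
        b_n = 0.125 * exp(-(v + 65) / 80)
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_ext - i_ion) / c_m
        m += dt * (a_m * (1 - m) - b_m * m)
        h += dt * (a_h * (1 - h) - b_h * h)
        n += dt * (a_n * (1 - n) - b_n * n)
        trace.append(v)
    return trace

trace = hh_simulate()
spikes = sum(1 for a, b in zip(trace, trace[1:]) if a < 0 <= b)
print(f"{spikes} spikes in 50 ms")
```

The stochastic variants studied in such work replace the deterministic gate equations with random opening and closing of finite channel populations, which adds channel noise to the spike train.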

  6. Quantum-like model of processing of information in the brain based on classical electromagnetic field

    CERN Document Server

    Khrennikov, Andrei

    2010-01-01

    We propose a model of quantum-like (QL) processing of mental information. This model is based on quantum information theory. However, in contrast to models of the "quantum physical brain" reducing mental activity (at least at the highest level) to quantum physical phenomena in the brain, our model matches well with the basic neuronal paradigm of cognitive science. QL information processing is based (surprisingly) on classical electromagnetic signals induced by the joint activity of neurons. This novel approach to quantum information is based on a representation of quantum mechanics as a version of classical signal theory, which was recently elaborated by the author. The brain uses the QL representation (QLR) for working with abstract concepts; concrete images are described by classical information theory. The two processes, classical and QL, are performed in parallel. Moreover, information is actively transmitted from one representation to another. A QL concept, given in our model by a density operator, can generate a var...
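The central move, treating quantum states as objects of classical signal theory, can be illustrated with a small sketch: the covariance matrix of a classical two-channel signal, normalized to unit trace, has the algebraic properties of a (real) density operator. The synthetic signals below are an assumption for illustration, not the author's construction.

```python
import random

def covariance(x, y):
    """2x2 covariance matrix of two classical signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    xs = [a - mx for a in x]
    ys = [b - my for b in y]
    cxx = sum(a * a for a in xs) / n
    cyy = sum(b * b for b in ys) / n
    cxy = sum(a * b for a, b in zip(xs, ys)) / n
    return [[cxx, cxy], [cxy, cyy]]

def density_from_signal(x, y):
    """Trace-normalized covariance matrix: unit trace, symmetric,
    positive semidefinite - i.e. a real density matrix."""
    c = covariance(x, y)
    tr = c[0][0] + c[1][1]
    return [[c[i][j] / tr for j in range(2)] for i in range(2)]

# Hypothetical two-channel "neuronal field" samples.
rng = random.Random(1)
x = [rng.gauss(0, 1.0) for _ in range(2000)]
y = [0.6 * a + 0.8 * rng.gauss(0, 1.0) for a in x]
rho = density_from_signal(x, y)
print(rho)
```

The off-diagonal entries encode correlations between the channels, which is where the "quantum-like" interference-style effects enter such models.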

  7. Econobiophysics - game of choosing. Model of selection or election process with diverse accessible information

    Science.gov (United States)

    2011-01-01

    We propose several models applicable to both selection and election processes when each selecting or electing subject has access to different information about the objects to choose from. We wrote special software to simulate these processes. We consider both the cases when the environment is neutral (natural process) as well as when the environment is involved (controlled process). PMID:21892959

  8. Development of the Instructional Model by Integrating Information Literacy in the Class Learning and Teaching Processes

    Science.gov (United States)

    Maitaouthong, Therdsak; Tuamsuk, Kulthida; Techamanee, Yupin

    2011-01-01

    This study was aimed at developing an instructional model by integrating information literacy in the instructional process of general education courses at an undergraduate level. The research query, "What is the teaching methodology that integrates information literacy in the instructional process of general education courses at an undergraduate…

  9. Applying Catastrophe Theory to an Information-Processing Model of Problem Solving in Science Education

    Science.gov (United States)

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2012-01-01

    In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…
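The cusp catastrophe used in such studies can be made concrete numerically. For the canonical cusp potential V(x) = x^4/4 + a*x^2/2 + b*x, the equilibria are the real roots of x^3 + a*x + b = 0; when the controls lie inside the bifurcation set (a < 0 and 27*b^2 < -4*a^3), two stable states coexist, which is what allows achievement to change discontinuously. The control values below are illustrative, not taken from the study.

```python
def equilibria(a, b, lo=-3.0, hi=3.0, steps=6000):
    """Real roots of dV/dx = x**3 + a*x + b = 0 for the cusp potential
    V(x) = x**4/4 + a*x**2/2 + b*x, found by scanning for sign changes
    and refining each bracket by bisection."""
    f = lambda x: x**3 + a * x + b
    roots = []
    xs = [lo + i * (hi - lo) / steps for i in range(steps + 1)]
    for x0, x1 in zip(xs, xs[1:]):
        if f(x0) == 0:
            roots.append(x0)
        elif f(x0) * f(x1) < 0:
            for _ in range(60):          # bisection refinement
                mid = (x0 + x1) / 2
                if f(x0) * f(mid) <= 0:
                    x1 = mid
                else:
                    x0 = mid
            roots.append((x0 + x1) / 2)
    return roots

# Outside the cusp region: one equilibrium, smooth change in behavior.
# Inside it: three equilibria (two stable, one unstable), so a small
# change in the asymmetry control b can trigger a discontinuous jump.
print(len(equilibria(1.0, 0.0)))    # 1
print(len(equilibria(-3.0, 0.0)))   # 3
```

In the study's terms, working memory capacity plays the role of the asymmetry control and the degree of field dependence the bifurcation control, so the jump corresponds to a sudden collapse in problem-solving achievement.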

  11. Junior high school students' cognitive process in solving the developed algebraic problems based on information processing taxonomy model

    Science.gov (United States)

    Purwoko, Saad, Noor Shah; Tajudin, Nor'ain Mohd

    2017-05-01

    This study aims to: i) develop problem-solving questions on the Linear Equations System of Two Variables (LESTV) based on the levels of the IPT Model; ii) explain the level of students' information processing skill in solving LESTV problems; iii) explain students' information processing skill in solving LESTV problems; and iv) explain students' cognitive process in solving LESTV problems. This study involves three phases: i) development of LESTV problem questions based on the Tessmer Model; ii) a quantitative survey method analyzing students' skill level of information processing; and iii) a qualitative case study method analyzing students' cognitive process. The population of the study was 545 eighth-grade students, represented by a sample of 170 students from five Junior High Schools in the Hilir Barat Zone, Palembang (Indonesia), chosen using cluster sampling. Fifteen students among them were drawn as a sample for the interview sessions, which continued until information saturation was reached. The data were collected using the LESTV problem-solving test and the interview protocol. The quantitative data were analyzed using descriptive statistics, while the qualitative data were analyzed using content analysis. The findings of this study indicated that students' cognitive process was just at the step of identifying external sources and fluently executing algorithms in short-term memory. Only 15.29% of students could retrieve type A information and 5.88% could retrieve type B information from long-term memory. The implication is that the developed LESTV problems validated the IPT Model for modelling students' assessment at different levels of the hierarchy.

  12. Business Process Modelling Languages in Designing Integrated Information System for Supply Chain Management

    Directory of Open Access Journals (Sweden)

    Mohsen Mohammadi

    2012-01-01

    Full Text Available A business process model is very germane to the formation of an appropriate information system. For a marked infusion of business processes in the supply chain, the status quo regarding the processes must be totally understood and well secured. Business activities and their sequence have to be well kept and properly coordinated by modeling business procedures from diverse views. This study examines seven BPMLs: Data Flow Diagram (DFD), Unified Modelling Language (UML), Business Process Modelling Notation (BPMN), Event Driven Process Chain (EPC), IDEF, Petri Net, and Role Activity Diagram (RAD). The contributions of this study are the treatment of Business Process Modelling Languages (BPMLs) in developing an integrated dissemination mechanism and a classification of modelling tools.

  13. Implications of Building Information Modeling on Interior Design Education: The Impact on Teaching Design Processes

    Directory of Open Access Journals (Sweden)

    Amy Roehl, MFA

    2013-06-01

    Full Text Available Currently, major shifts are occurring in the design processes affecting business practices for industries involved with designing and delivering the built environment. These changing conditions are a direct result of industry adoption of relatively new technologies called BIM, or Building Information Modeling. This review of literature examines the implications of these changing processes for interior design education.

  15. Extending the 4I Organizational Learning Model: Information Sources, Foraging Processes and Tools

    Directory of Open Access Journals (Sweden)

    Tracy A. Jenkin

    2013-08-01

    Full Text Available The continued importance of organizational learning has recently led to several calls for further developing the theory. This article addresses these calls by extending Crossan, Lane and White's (1999) 4I model to include a fifth process, information foraging, and a fourth level, the tool. The resulting 5I organizational learning model can be generalized to a number of learning contexts, especially those that involve understanding and making sense of data and information. Given the need for organizations both to innovate and to increase productivity, and the volumes of data and information available to support both, the 5I model addresses an important organizational issue.

  16. [DESCRIPTION AND PRESENTATION OF THE RESULTS OF ELECTROENCEPHALOGRAM PROCESSING USING AN INFORMATION MODEL].

    Science.gov (United States)

    Myznikov, I L; Nabokov, N L; Rogovanov, D Yu; Khankevich, Yu R

    2016-01-01

    The paper proposes to apply the informational modeling of correlation matrices developed by I.L. Myznikov in the early 1990s to neurophysiological investigations, such as electroencephalogram recording and analysis and the coherence description of signals from electrodes on the head surface. The authors demonstrate information models built using data from studies of inert gas inhalation by healthy human subjects. In the opinion of the authors, information models provide an opportunity to describe physiological processes with a high level of generalization. The procedure for presenting EEG results holds great promise for broad application.

  17. Age and Creative Productivity: Nonlinear Estimation of an Information-Processing Model.

    Science.gov (United States)

    Simonton, Dean Keith

    1989-01-01

    Applied two-step cognitive model to relationship between age and creative productivity. Selected ideation and elaboration rates as information-processing parameters that define mathematical function which describes age curves and specifies their variance across disciplines. Applied non-linear estimation program to further validate model. Despite…

  18. CONSERVATION PROCESS MODEL (CPM: A TWOFOLD SCIENTIFIC RESEARCH SCOPE IN THE INFORMATION MODELLING FOR CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    D. Fiorani

    2017-05-01

    Full Text Available The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, applications developed so far have been shown to adapt BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed on decay analysis and the surface conservation project.

  19. Conservation Process Model (cpm): a Twofold Scientific Research Scope in the Information Modelling for Cultural Heritage

    Science.gov (United States)

    Fiorani, D.; Acierno, M.

    2017-05-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, applications developed so far have been shown to adapt BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed on decay analysis and the surface conservation project.

  20. A model of designing as the intersection between uncertainty perception, information processing, and coevolution

    DEFF Research Database (Denmark)

    Lasso, Sarah Venturim; Cash, Philip; Daalhuizen, Jaap

    2016-01-01

    A number of fundamental perspectives on designing have been described in the literature, in particular problem/solution coevolution and information use. However, these different perspectives have to-date been modelled separately, making holistic description of design activity difficult. This paper...... takes the first steps towards linking these disparate perspectives in a model of designing that synthesises coevolution and information processing. How designers act has been shown to play an important role in the process of New Product Development (NPD) (See e.g. Badke-Schaub and Frankenberger, 2012......). Modelling design activity in NPD is typically done in one of three ways; object-, subject- or process oriented. First, it can be modelled by focusing on the object of design: the product. Second, it can be modelled by describing the social interaction and knowledge exchange between actors. Third, design...

  1. Models for Photogrammetric Processing of Information from Resource-P Satellites

    Science.gov (United States)

    Poshekhonov, V.; Eremeev, V.; Kuznetcov, A.; Kochergin, A.

    2016-06-01

    The present paper provides information about the imagery and navigation systems of the Russian high-resolution "Resource-P" satellites. Models of image geolocation used for photogrammetric processing of information from all types of imagery systems are designed. The design of these models is based on the solution of two tasks: correct processing of the measurement information and geometric calibration of the imagery systems. It is shown that for high-precision adjustment of the interior orientation parameters of the high-resolution "Geoton" instrument, the method of self-calibration should be used. The technology of calibration activities is considered, and distinctive features of the calibration of the hyperspectral and wide-swath imagery systems are noted. After calibration, the root mean square error (RMSE) of the measured geodetic coordinates of objects on images does not exceed 10 m. Examples of the practical application of the obtained models for photogrammetric processing of images from the "Resource-P" satellites are shown.

  2. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    Science.gov (United States)

    Lyu, Zhe; Whitman, William B

    2017-01-01

    Current evolutionary models suggest that the Eukaryotes originated from within the Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be of a kind with the archaeal systems and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies of the complex information processing system. Therefore, the Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  3. Food Safety Information Processing and Teaching Behavior of Dietitians: A Mental Model Approach

    Directory of Open Access Journals (Sweden)

    Lydia C. Medeiros

    2015-03-01

    Full Text Available Health professionals play an important role in educating the public about food safety risks. However, the ways this important group of educators remains up-to-date on these topics are not well defined. In this study, a national sample of dietitians employed in direct teaching of patients (n = 327 were recruited to complete a web-delivered survey designed to develop a model of factors that promote information processing and teaching in practice about food safety related to fresh vegetables. The resulting mental model demonstrates that dietitians teach fresh vegetable safety using systematic information processing to intellectually understand new information, but this is also associated with a gap in the dietitian’s knowledge of food safety. The juxtaposition of an information processing model with a behavioral model provides valuable new insights about how dietitians seek, acquire and translate/transfer important information to move patients toward a higher goal of food safety. The study also informs food safety educators as they formulate teaching strategies that are more effective than other approaches at promoting behavior change.

  4. Parametric models to relate spike train and LFP dynamics with neural information processing.

    Science.gov (United States)

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial
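
    The "unified model" structure described in this abstract, a firing rate combining ongoing background activity with a time-varying stimulus-driven component whose latency is a trial parameter, can be illustrated with a toy NumPy sketch. Everything below (rates, bump shape, the grid-search estimator) is an illustrative assumption, not the authors' actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

def rate(t, latency, amp=100.0, width=0.05, background=5.0):
    """Firing rate (Hz): ongoing background plus a stimulus-driven
    Gaussian bump shifted by the trial's response latency."""
    return background + amp * np.exp(-0.5 * ((t - latency) / width) ** 2)

dt = 0.001
t = np.arange(0.0, 1.0, dt)
true_latency = 0.35
# One simulated trial: Bernoulli approximation of an inhomogeneous Poisson process.
spikes = rng.random(t.size) < rate(t, true_latency) * dt

def neg_log_likelihood(latency):
    lam = rate(t, latency) * dt
    # Discrete-bin Poisson point-process likelihood of the observed spike train.
    return -(np.sum(np.log(lam[spikes])) - np.sum(lam))

# Recover the trial-by-trial latency parameter by grid-search maximum likelihood.
grid = np.arange(0.1, 0.9, 0.005)
est = grid[np.argmin([neg_log_likelihood(g) for g in grid])]
print(f"true latency {true_latency:.3f} s, estimated {est:.3f} s")
```

    Estimating the latency parameter trial by trial, as above, is the kind of decoding the abstract reports as robust to the presence of the background term.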

  5. Information Processing: A Review of Implications of Johnstone's Model for Science Education

    Science.gov (United States)

    St Clair-Thompson, Helen; Overton, Tina; Botton, Chris

    2010-01-01

    The current review is concerned with an information processing model used in science education. The purpose is to summarise the current theoretical understanding, in published research, of a number of factors that are known to influence learning and achievement. These include field independence, working memory, long-term memory, and the use of…

  6. An Information Theoretic Model of Information Processing in the Drosophila Olfactory System: the Role of Inhibitory Neurons for System Efficiency

    Directory of Open Access Journals (Sweden)

    Faramarz eFaghihi

    2013-12-01

    Full Text Available Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study how several parameters of the fly's olfactory system and the environment influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show an expected linear relation between the connectivity rate from the antennal lobe to the mushroom body and the firing threshold of the Kenyon cells required to obtain maximum mutual information for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentrations. When inhibition of the mushroom body is included, however, mutual information remains at high levels independent of the other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which the system's efficiency would be substantially reduced.
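
    The central quantity in this abstract, mutual information between discrete odor concentrations and a Kenyon-cell population response, can be estimated from a joint histogram. The toy antennal-lobe/mushroom-body wiring below (connectivity rate, firing rates, threshold) is entirely hypothetical and only illustrates the measurement:

```python
import numpy as np

rng = np.random.default_rng(1)

def mutual_information(x, y):
    """Mutual information (bits) between two discrete sequences,
    estimated from their joint histogram."""
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xv.size, yv.size))
    np.add.at(joint, (xi, yi), 1)
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Toy antennal lobe -> mushroom body: 20 projection neurons (PNs) drive
# 100 Kenyon cells (KCs) through sparse random connections.
n_pn, n_kc, n_trials = 20, 100, 2000
connections = rng.random((n_kc, n_pn)) < 0.3          # connectivity rate 0.3
odor = rng.integers(0, 4, n_trials)                   # four odor concentration levels
pn_activity = rng.poisson((odor[:, None] + 1) * 2.0, (n_trials, n_pn))
drive = pn_activity @ connections.T
kc_count = (drive > np.quantile(drive, 0.9)).sum(1)   # high KC threshold -> sparse code

mi = mutual_information(odor, kc_count)
print(f"I(odor concentration; KC population count) = {mi:.2f} bits")
```

    With four equiprobable concentration levels, the mutual information is bounded by 2 bits; sweeping the connectivity rate or the threshold in such a sketch reproduces the kind of parameter study the abstract describes.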

  7. Information transfer with rate-modulated Poisson processes: a simple model for nonstationary stochastic resonance.

    Science.gov (United States)

    Goychuk, I

    2001-08-01

    Stochastic resonance in a simple model of information transfer is studied for sensory neurons and ensembles of ion channels. An exact expression for the information gain is obtained for the Poisson process with the signal-modulated spiking rate. This result allows one to generalize the conventional stochastic resonance (SR) problem (with periodic input signal) to the arbitrary signals of finite duration (nonstationary SR). Moreover, in the case of a periodic signal, the rate of information gain is compared with the conventional signal-to-noise ratio. The paper establishes the general nonequivalence between both measures notwithstanding their apparent similarity in the limit of weak signals.
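
    A rate-modulated (inhomogeneous) Poisson spike train of the kind analyzed here can be sampled by the standard thinning method; the base rate, modulation depth and signal below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def thinning(rate_fn, rate_max, duration):
    """Sample an inhomogeneous Poisson spike train by thinning: draw
    candidate events at the constant rate rate_max, then keep each
    candidate at time t with probability rate_fn(t) / rate_max."""
    t, spikes = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > duration:
            return np.array(spikes)
        if rng.random() < rate_fn(t) / rate_max:
            spikes.append(t)

# Spiking rate modulated by a finite-duration signal on top of a base rate.
base, amp, duration = 20.0, 15.0, 10.0
signal = lambda t: amp * np.sin(np.pi * t)            # zero-mean modulation
spike_times = thinning(lambda t: base + signal(t), base + amp, duration)
print(f"{spike_times.size} spikes in {duration:.0f} s "
      f"(about {base * duration:.0f} expected, since the signal is zero-mean)")
```

    Because the modulating signal here need not be periodic, the same sampler covers the paper's generalization from periodic inputs to arbitrary signals of finite duration.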

  9. A cascade model of information processing and encoding for retinal prosthesis

    Directory of Open Access Journals (Sweden)

    Zhi-jun Pei

    2016-01-01

    Full Text Available Retinal prosthesis offers a potential treatment for individuals suffering from photoreceptor degeneration diseases. Establishing biological retinal models and simulating how the biological retina converts incoming light signals into spike trains that can be properly decoded by the brain is a key issue. Several retinal models have been presented, ranging from structural models inspired by the layered architecture to functional models originating from a set of specific physiological phenomena. However, most of these focus on stimulus image compression, edge detection and reconstruction, and do not generate spike trains corresponding to the visual image. In this study, based on state-of-the-art retinal physiological mechanisms, including effective visual information extraction, static nonlinear rectification of biological systems and Poisson coding by neurons, a cascade model of the retina, comprising the outer plexiform layer for information processing and the inner plexiform layer for information encoding, is put forward, integrating both the anatomic connections and the functional computations of the retina. Using MATLAB software, spike trains corresponding to a stimulus image were numerically computed in four steps: linear spatiotemporal filtering, static nonlinear rectification, radial sampling and then Poisson spike generation. The simulation results suggest that such a cascade model can recreate the visual information processing and encoding functionalities of the retina, which is helpful in developing an artificial retina for the retinally blind.
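
    The four computational steps named in the abstract (linear spatiotemporal filtering, static nonlinear rectification, radial sampling, Poisson spike generation) translate naturally into array code. The sketch below is a NumPy analogue of such a cascade, not the authors' MATLAB implementation; the kernel widths, sigmoid gain and sampling geometry are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

size = 32
image = np.zeros((size, size))
image[:, size // 2:] = 1.0          # toy stimulus: a vertical luminance edge

# Step 1: linear spatial filtering with a centre-surround difference-of-Gaussians
# kernel (a stand-in for outer-plexiform processing); circular convolution via FFT.
ax = np.arange(size) - size // 2
xx, yy = np.meshgrid(ax, ax)
gauss = lambda s: np.exp(-(xx**2 + yy**2) / (2 * s**2)) / (2 * np.pi * s**2)
dog = gauss(1.0) - gauss(3.0)
filtered = np.real(np.fft.ifft2(np.fft.fft2(image) *
                                np.fft.fft2(np.fft.ifftshift(dog))))

# Step 2: static nonlinear rectification of the filter output (sigmoid).
rectified = 1.0 / (1.0 + np.exp(-10.0 * filtered))

# Step 3: radial sampling -- read the rectified map out on concentric rings,
# mimicking the non-uniform placement of ganglion cells around the fovea.
cy = cx = size // 2
radii = np.arange(2, 14, 2)
angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
rates = 50.0 * np.array([[rectified[int(cy + r * np.sin(a)) % size,
                                    int(cx + r * np.cos(a)) % size]
                          for a in angles] for r in radii])   # rates in Hz

# Step 4: Poisson spike generation over a nominal 1 s window.
spike_counts = rng.poisson(rates)
print("sampled cells:", spike_counts.size,
      "mean spike count:", round(float(spike_counts.mean()), 2))
```

    Adding a temporal filter stage in step 1 would make the cascade spatiotemporal in the full sense the abstract intends; the spatial-only version above keeps the sketch short.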

  10. A cascade model of information processing and encoding for retinal prosthesis

    Institute of Scientific and Technical Information of China (English)

    Zhi-jun Pei; Guan-xin Gao; Bo Hao; Qing-li Qiao; Hui-jian Ai

    2016-01-01

    Retinal prosthesis offers a potential treatment for individuals suffering from photoreceptor degeneration diseases. Establishing biological retinal models and simulating how the biological retina converts incoming light signals into spike trains that can be properly decoded by the brain is a key issue. Several retinal models have been presented, ranging from structural models inspired by the layered architecture to functional models originating from a set of specific physiological phenomena. However, most of these focus on stimulus image compression, edge detection and reconstruction, and do not generate spike trains corresponding to the visual image. In this study, based on state-of-the-art retinal physiological mechanisms, including effective visual information extraction, static nonlinear rectification of biological systems and Poisson coding by neurons, a cascade model of the retina, comprising the outer plexiform layer for information processing and the inner plexiform layer for information encoding, is put forward, integrating both the anatomic connections and the functional computations of the retina. Using MATLAB software, spike trains corresponding to a stimulus image were numerically computed in four steps: linear spatiotemporal filtering, static nonlinear rectification, radial sampling and then Poisson spike generation. The simulation results suggest that such a cascade model can recreate the visual information processing and encoding functionalities of the retina, which is helpful in developing an artificial retina for the retinally blind.

  11. An investigative model evaluating how consumers process pictorial information on nonprescription medication labels.

    Science.gov (United States)

    Sansgiry, S S; Cady, P S

    1997-01-01

    Currently marketed over-the-counter (OTC) medication labels were simulated and tested in a controlled environment to understand consumer evaluation of OTC label information. Two factors, consumers' age (younger and older adults) and label design (picture-only, verbal-only, congruent picture-verbal, and noncongruent picture-verbal), were controlled and tested to evaluate consumer information processing. The effects of comprehension of label information (understanding) and product evaluations (satisfaction, certainty, and perceived confusion) on the dependent variable, purchase intention, were evaluated. Intention, measured as purchase recommendation, was significantly related to product evaluations and affected by the label-design factor. Participants' level of perceived confusion was more important than their actual understanding of the information on OTC medication labels. A Label Evaluation Process Model was developed that could be used for future testing of OTC medication labels.

  12. A model of designing as the intersection between uncertainty perception, information processing, and coevolution

    DEFF Research Database (Denmark)

    Lasso, Sarah Venturim; Cash, Philip; Daalhuizen, Jaap

    2016-01-01

    ). In order to resolve uncertainty, both individuals and teams need to engage in decision making. In the case of decision making in a team, there is also greater scope for uncertainty, since personality and cognitive style influence decision making (Dewberry, Juanchich & Narendran, 2013) and every person has......), and the transformation of information (Aurisicchio et al., 2013). However, prior research has traditionally modelled these perspectives separately; making holistic description of designer activity difficult. Thus, the aim of this paper is to propose a model that links design coevolution and information processing via......A number of fundamental perspectives on designing have been described in the literature, in particular problem/solution coevolution and information use. However, these different perspectives have to-date been modelled separately, making holistic description of design activity difficult. This paper...

  13. Parametric models to relate spike train and LFP dynamics with neural information processing

    Directory of Open Access Journals (Sweden)

    Arpan eBanerjee

    2012-07-01

    Full Text Available Spike trains and local field potentials resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task- or stimulus-specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves the goodness of fit for predicting individual spiking events. Trial-by-trial spike-field correlations in visual response onset times are higher when the unified model is used, matching the corresponding values obtained using earlier trial-averaged measures on a previously published data set. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior.

  14. Formalize clinical processes into electronic health information systems: Modelling a screening service for diabetic retinopathy.

    Science.gov (United States)

    Eguzkiza, Aitor; Trigo, Jesús Daniel; Martínez-Espronceda, Miguel; Serrano, Luis; Andonegui, José

    2015-08-01

    Most healthcare services use information and communication technologies to reduce and redistribute the workload associated with the follow-up of chronic conditions. However, the lack of normalization of the information handled in and exchanged between such services hinders scalability and extensibility. The use of medical standards for modelling and exchanging information, especially dual-model based approaches, can enhance the features of screening services. Hence, the approach of this paper is twofold. First, this article presents a generic methodology to model patient-centered clinical processes. Second, a proof of concept of the proposed methodology was conducted within the diabetic retinopathy (DR) screening service of the Health Service of Navarre (Spain) in compliance with a specific dual-model norm (openEHR). As a result, a set of elements required for deploying a model-driven DR screening service has been established, namely: clinical concepts, archetypes, termsets, templates, guideline definition rules, and user interface definitions. This model fosters reusability, because those elements are available to be downloaded and integrated into any healthcare service, and interoperability, since such services can thenceforth share information seamlessly.

  15. Hierarchical network model for the analysis of human spatio-temporal information processing

    Science.gov (United States)

    Schill, Kerstin; Baier, Volker; Roehrbein, Florian; Brauer, Wilfried

    2001-06-01

    The perception of spatio-temporal patterns is a fundamental part of visual cognition. In order to understand more about the principles behind these biological processes, we are analyzing and modeling the representation of spatio-temporal structures on different levels of abstraction. For the low-level processing of motion information we have argued for the existence of a spatio-temporal memory in early vision. The basic properties of this structure are reflected in a neural network model that is currently being developed. Here we discuss the major architectural features of this network, which is based on Kohonen's self-organizing maps (SOMs). In order to enable the representation, processing and prediction of spatio-temporal patterns on different levels of granularity and abstraction, the SOMs are organized in a hierarchical manner. The model has the advantage of a 'self-teaching' learning algorithm and stores temporal information via local feedback in each computational layer. The constraints for the neural modeling and the data sets for training the neural network are obtained from psychophysical experiments in which human subjects' abilities for dealing with spatio-temporal information are investigated.
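
    A single layer of the kind of Kohonen SOM hierarchy described here can be sketched in a few lines; the grid size, learning schedule and toy "motion trajectory" inputs below are illustrative assumptions, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(6)

def train_som(data, grid=8, epochs=20, lr0=0.5, sigma0=3.0):
    """Minimal 1-D Kohonen SOM: the best-matching unit and its grid
    neighbours are pulled toward each input; learning rate and
    neighbourhood width shrink over the epochs."""
    w = 0.1 * rng.standard_normal((grid, data.shape[1]))
    pos = np.arange(grid)
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)
        sigma = sigma0 * (1.0 - e / epochs) + 0.5
        for x in rng.permutation(data):
            winner = np.argmin(np.linalg.norm(w - x, axis=1))
            h = np.exp(-((pos - winner) ** 2) / (2.0 * sigma**2))
            w += lr * h[:, None] * (x - w)
    return w

# Toy spatio-temporal patterns: 5-step (x, y) trajectories moving left or right.
steps = np.linspace(0.0, 1.0, 5)
left_proto = np.stack([-steps, np.zeros(5)], axis=1).ravel()
right_proto = np.stack([steps, np.zeros(5)], axis=1).ravel()
left = np.tile(left_proto, (50, 1)) + 0.05 * rng.standard_normal((50, 10))
right = np.tile(right_proto, (50, 1)) + 0.05 * rng.standard_normal((50, 10))

som = train_som(np.vstack([left, right]))
bmu = lambda v: int(np.argmin(np.linalg.norm(som - v, axis=1)))
# Opposite motion directions should end up on well-separated map units.
print("left-motion BMU:", bmu(left_proto), " right-motion BMU:", bmu(right_proto))
```

    In the hierarchical version the abstract describes, the winner indices of one layer over a time window would become the input vectors of the next, coarser layer.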

  16. Model-free stochastic processes studied with q-wavelet-based informational tools

    Energy Technology Data Exchange (ETDEWEB)

    Perez, D.G. [Instituto de Fisica, Pontificia Universidad Catolica de Valparaiso (PUCV), 23-40025 Valparaiso (Chile)]. E-mail: dario.perez@ucv.cl; Zunino, L. [Centro de Investigaciones Opticas, C.C. 124 Correo Central, 1900 La Plata (Argentina) and Departamento de Ciencias Basicas, Facultad de Ingenieria, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina) and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina)]. E-mail: lucianoz@ciop.unlp.edu.ar; Martin, M.T. [Instituto de Fisica (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata and Argentina's National Council (CONICET), C.C. 727, 1900 La Plata (Argentina)]. E-mail: mtmartin@venus.unlp.edu.ar; Garavaglia, M. [Centro de Investigaciones Opticas, C.C. 124 Correo Central, 1900 La Plata (Argentina) and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina)]. E-mail: garavagliam@ciop.unlp.edu.ar; Plastino, A. [Instituto de Fisica (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata and Argentina's National Council (CONICET), C.C. 727, 1900 La Plata (Argentina)]. E-mail: plastino@venus.unlp.edu.ar; Rosso, O.A. [Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina)]. E-mail: oarosso@fibertel.com.ar

    2007-04-30

    We undertake a model-free investigation of stochastic processes employing q-wavelet-based quantifiers, which constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in this way, (ii) for special q values the quantifiers are more sensitive than the Shannon ones, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework.
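
    A minimal version of such a quantifier replaces the Shannon entropy of the relative wavelet energies with its Tsallis (q) generalization. The sketch below uses a plain Haar decomposition and a normalized Tsallis entropy; the signals and q values are arbitrary examples, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(4)

def haar_level_energies(x, levels):
    """Relative wavelet energy per resolution level from a plain Haar
    decomposition (energies of the detail coefficients at each scale)."""
    a = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)   # approximation passed down
        energies.append(float(np.sum(d**2)))
    p = np.array(energies)
    return p / p.sum()

def q_entropy(p, q):
    """Normalized Tsallis q-entropy of a distribution; tends to the
    normalized Shannon entropy as q -> 1."""
    if q == 1.0:
        return float(-np.sum(p * np.log(p)) / np.log(p.size))
    return float((1.0 - np.sum(p**q)) / (1.0 - p.size ** (1.0 - q)))

# Compare white noise with a correlated random walk (a crude H ~ 0.5 stand-in).
noise = rng.standard_normal(4096)
walk = np.cumsum(rng.standard_normal(4096))
results = {q: (q_entropy(haar_level_energies(noise, 8), q),
               q_entropy(haar_level_energies(walk, 8), q))
           for q in (0.5, 1.0, 2.0)}
for q, (h_noise, h_walk) in results.items():
    print(f"q={q}: noise {h_noise:.3f}, walk {h_walk:.3f}")
```

    Sweeping q, as in the loop above, is what lets the q-quantifiers be tuned to be more sensitive than the single Shannon value.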

  17. Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence

    Directory of Open Access Journals (Sweden)

    Massimo Materassi

    2014-02-01

    Full Text Available The use of transfer entropy has proven helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the direction of information transfer. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer–Ohkitani–Yamada shell model. This is a well-known model of fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so that applying the information-theoretic analysis to its output tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in the space of wavenumbers come out as expected, indicating the validity of this data-analysis tool. In this context, the use of a normalized version of transfer entropy, able to account for the difference in the intrinsic randomness of the interacting processes, appears to perform better, being able to avoid the wrong conclusions to which the "traditional" transfer entropy would lead.
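
    Transfer entropy, as used here, measures how much the past of one process reduces uncertainty about the next step of the other. Below is a minimal histogram-based estimator (history length 1 and quantile binning are simplifying assumptions) applied to a toy unidirectionally coupled pair, not to the shell-model data of the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

def transfer_entropy(x, y, bins=4):
    """TE(Y -> X) in bits: histogram estimate with one step of history,
    after quantile-discretizing both series into `bins` symbols."""
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    joint = np.zeros((bins, bins, bins))
    np.add.at(joint, (xd[1:], xd[:-1], yd[:-1]), 1)   # p(x_t+1, x_t, y_t)
    p = joint / joint.sum()
    p_xy = p.sum(axis=0, keepdims=True)               # p(x_t, y_t)
    p_xx = p.sum(axis=2, keepdims=True)               # p(x_t+1, x_t)
    p_x = p.sum(axis=(0, 2), keepdims=True)           # p(x_t)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2((p * p_x)[nz] / (p_xy * p_xx)[nz])))

# X is driven by the past of Y; Y evolves independently.
n = 20000
y = rng.standard_normal(n)
x = np.empty(n)
x[0] = rng.standard_normal()
x[1:] = y[:-1] + 0.05 * rng.standard_normal(n - 1)

te_yx = transfer_entropy(x, y)   # should be large: Y drives X
te_xy = transfer_entropy(y, x)   # should be near zero
print(f"TE(Y->X) = {te_yx:.3f} bits, TE(X->Y) = {te_xy:.3f} bits")
```

    The asymmetry between the two estimates is what identifies the direction of driving; the paper's contribution is a normalization that keeps this comparison fair when the two processes differ in intrinsic randomness.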

  18. The Impact of Building Information Modeling on the Architectural Design Process

    Science.gov (United States)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

    Many benefits of Building Information Modeling, BIM, have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One of the groups used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Other promises, such as changing the design process itself, also appeared to hold, but more research is needed to determine the depth of such changes.

  19. Information Processing in Single Cells and Small Networks: Insights from Compartmental Models

    Science.gov (United States)

    Poirazi, Panayiota

    2009-03-01

    The goal of this paper is to present a set of predictions generated by detailed compartmental models regarding the ways in which information may be processed, encoded and propagated by single cells and neural assemblies. Towards this goal, I will review a number of modelling studies from our lab that investigate how single pyramidal neurons and small neural networks in different brain regions process incoming signals that are associated with learning and memory. I will first discuss the computational capabilities of individual pyramidal neurons in the hippocampus [1-3] and how these properties may allow a single cell to discriminate between different memories [4]. I will then present biophysical models of prefrontal layer V neurons and small networks that exhibit sustained activity under realistic synaptic stimulation and discuss their potential role in working memory [5].

  20. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    Science.gov (United States)

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler.

  1. A Novel Information-Theoretic Approach for Variable Clustering and Predictive Modeling Using Dirichlet Process Mixtures

    Science.gov (United States)

    Chen, Yun; Yang, Hui

    2016-12-01

    In the era of big data, there is increasing interest in clustering variables to minimize data redundancy and maximize variable relevancy. Existing clustering methods, however, depend on nontrivial assumptions about the data structure. Note that nonlinear interdependence among variables poses significant challenges to the traditional framework of predictive modeling. In the present work, we reformulate the problem of variable clustering from an information-theoretic perspective that does not require assumptions about the data structure for the identification of nonlinear interdependence among variables. Specifically, we propose the use of mutual information to characterize and measure nonlinear correlation structures among variables. Further, we develop Dirichlet process (DP) models to cluster variables based on the mutual-information measures among variables. Finally, orthonormalized variables in each cluster are integrated with a group elastic-net model to improve the performance of predictive modeling. Both simulation and real-world case studies showed that the proposed methodology not only effectively reveals the nonlinear interdependence structures among variables but also outperforms traditional variable clustering algorithms such as hierarchical clustering.
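    The mutual-information measure on which the clustering is built can be sketched with a simple histogram estimator (the Dirichlet-process mixture step itself is omitted; the bin count and data below are illustrative):

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram (plug-in) estimate of I(X; Y) in bits for two 1-D samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
a = rng.normal(size=5000)
b = np.sin(a) + 0.1 * rng.normal(size=5000)   # nonlinearly tied to a
c = rng.normal(size=5000)                     # independent of a
print(mutual_information(a, b) > mutual_information(a, c))  # True
```

    Unlike Pearson correlation, the mutual-information estimate flags the nonlinear dependence between a and b while staying near zero for the independent pair, which is the property the clustering step exploits.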

  3. Financial information processing

    Institute of Scientific and Technical Information of China (English)

    Shuo BAI; Shouyang WANG; Lean YU; Aoying ZHOU

    2009-01-01

    The rapid growth in financial data volume has made financial information processing more and more difficult due to the increase in complexity, which has forced businesses and academics alike to turn to sophisticated information processing technologies for better solutions. A typical feature is that high-performance computers and advanced computational techniques play increasingly important roles in giving businesses and industries a competitive advantage. Accordingly, financial information processing has emerged as a new cross-disciplinary field integrating computer science, mathematics, financial economics, intelligent techniques, and computer simulations to make different decisions based on processed financial information.

  4. Emotional voices in context: A neurobiological model of multimodal affective information processing

    Science.gov (United States)

    Brück, Carolin; Kreifelts, Benjamin; Wildgruber, Dirk

    2011-12-01

    Just as eyes are often considered a gateway to the soul, the human voice offers a window through which we gain access to our fellow human beings' minds - their attitudes, intentions and feelings. Whether in talking or singing, crying or laughing, sighing or screaming, the sheer sound of a voice communicates a wealth of information that, in turn, may serve the observant listener as a valuable guidepost in social interaction. But how do human beings extract information from the tone of a voice? In an attempt to answer this question, the present article reviews empirical evidence detailing the cerebral processes that underlie our ability to decode emotional information from vocal signals. The review focuses primarily on two prominent classes of vocal emotion cues: laughter and speech prosody (i.e., the tone of voice while speaking). Following a brief introduction, behavioral as well as neuroimaging data are summarized that allow us to outline the cerebral mechanisms associated with the decoding of emotional voice cues, as well as the influence of various context variables (e.g., co-occurring facial and verbal emotional signals, attention focus, person-specific parameters such as gender and personality) on the respective processes. Building on the presented evidence, a cerebral network model is introduced that proposes a differential contribution of various cortical and subcortical brain structures to the processing of emotional voice signals, both in isolation and in the context of accompanying (facial and verbal) emotional cues.

  5. Algorithm-structured computer arrays and networks architectures and processes for images, percepts, models, information

    CERN Document Server

    Uhr, Leonard

    1984-01-01

    Computer Science and Applied Mathematics: Algorithm-Structured Computer Arrays and Networks: Architectures and Processes for Images, Percepts, Models, Information examines the parallel-array, pipeline, and other network multi-computers. This book describes and explores arrays and networks: those built, being designed, or proposed. The problems of developing higher-level languages for systems and of designing algorithm, program, data flow, and computer structures are also discussed. This text likewise describes several sequences of successively more general attempts to combine the power of arrays with...

  6. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets for both qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  7. Pyphant – A Python Framework for Modelling Reusable Information Processing Tasks

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available We are presenting the Python framework “Pyphant” for the creation and application of information flow models. The central idea of this approach is to encapsulate each data processing step in one unit which we call a worker. A worker receives input via sockets and provides the results of its data processing via plugs. These can be connected to other workers' sockets. The resulting directed graph is called a recipe. Classes for these objects comprise the Pyphant core. To implement the actual processing steps, Pyphant relies on third-party plug-ins which extend the basic worker class and can be distributed as Python eggs. On top of the core, Pyphant offers an information exchange layer which facilitates the interoperability of the workers, using Numpy objects. A third layer comprises textual and graphical user interfaces. The former allows for the batch processing of data and the latter allows for the interactive construction of recipes.

    This paper discusses the Pyphant framework and presents an example recipe for determining the length scale of aggregated polymeric phases building an amphiphilic conetwork, from an Atomic Force Microscopy (AFM) phase mode image.
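    The worker/socket/plug pattern described above can be illustrated with a toy recipe (the class and method names here are invented for illustration and are not Pyphant's actual API):

```python
class Worker:
    """Toy version of the worker/socket/plug pattern: each worker wraps one
    processing step, pulls inputs from upstream workers via its sockets, and
    exposes its result via plug()."""

    def __init__(self, func):
        self.func = func
        self.sockets = []          # upstream workers feeding this one

    def connect(self, upstream):
        self.sockets.append(upstream)
        return self                # allow chaining when building a recipe

    def plug(self):
        """Pull results from all sockets, then apply this worker's step."""
        inputs = [w.plug() for w in self.sockets]
        return self.func(*inputs)

# A three-node recipe (directed graph): two sources feeding one combiner.
source_a = Worker(lambda: [1.0, 2.0, 3.0])
source_b = Worker(lambda: [10.0, 20.0, 30.0])
adder = Worker(lambda xs, ys: [x + y for x, y in zip(xs, ys)])
adder.connect(source_a).connect(source_b)
print(adder.plug())   # [11.0, 22.0, 33.0]
```

    Evaluating the final plug recursively pulls data through the whole recipe, which mirrors how connecting plugs to sockets yields a directed processing graph.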

  8. The role of categorization and scale endpoint comparisons in numerical information processing: A two-process model.

    Science.gov (United States)

    Tao, Tao; Wyer, Robert S; Zheng, Yuhuang

    2017-03-01

    We propose a two-process conceptualization of numerical information processing to describe how people form impressions of a score that is described along a bounded scale. According to the model, people spontaneously categorize a score as high or low. Furthermore, they compare the numerical discrepancy between the score and the endpoint of the scale to which it is closer, if they are not confident of their categorization, and use implications of this comparison as a basis for judgment. As a result, their evaluation of the score is less extreme when the range of numbers along the scale is large (e.g., from 0 to 100) than when it is small (from 0 to 10). Six experiments support this two-process model and demonstrate its generalizability. Specifically, the magnitude of numbers composing the scale has less impact on judgments (a) when the score being evaluated is extreme, (b) when individuals are unmotivated to engage in endpoint comparison processes (i.e., they are low in need for cognition), and (c) when they are unable to do so (i.e., they are under cognitive load). Moreover, the endpoint to which individuals compare the score can depend on their regulatory focus.

  9. System Model Bias Processing Approach for Regional Coordinated States Information Involved Filtering

    Directory of Open Access Journals (Sweden)

    Zebo Zhou

    2016-01-01

    Full Text Available In Kalman filtering applications, the conventional dynamic model, which connects the state information of two consecutive epochs by a state transition matrix, is usually predefined and assumed to be invariant. Aiming to improve the adaptability and accuracy of the dynamic model, we propose a filtering algorithm that involves multiple historical states. An autoregressive model is used as the dynamic model, which is subsequently combined with the observation model to derive the optimal window-recursive filter formulae in the sense of the minimum mean square error principle. The corresponding test statistics of the system residuals are discussed in detail. The test statistics of the regional predicted residuals are then constructed in a time window for model bias testing with two hypotheses, that is, the null and alternative hypotheses. Based on the innovation test statistics, we develop a model bias processing procedure including bias detection, location identification, and state correction. Finally, the minimum detectable bias and the bias-to-noise ratio are both computed for evaluating the internal and external reliability of the overall system, respectively.
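    The innovation-based bias testing described above can be illustrated with a one-dimensional Kalman filter (a deliberately simplified single-epoch test with an illustrative 3-sigma threshold, not the paper's window-recursive formulae; all parameter values are invented):

```python
import random

def kalman_step(x, P, z, q, r):
    """One predict/update cycle of a 1-D random-walk Kalman filter.

    Returns the updated state, its variance, and the normalized innovation,
    which is approximately N(0, 1) when the dynamic model is unbiased.
    """
    x_pred, P_pred = x, P + q          # predict (identity state transition)
    innov = z - x_pred                 # innovation (predicted residual)
    S = P_pred + r                     # innovation variance
    K = P_pred / S                     # Kalman gain
    return x_pred + K * innov, (1 - K) * P_pred, innov / S ** 0.5

random.seed(1)
x, P = 0.0, 1.0
hits = 0
for t in range(400):
    bias = 5.0 if t >= 200 else 0.0    # inject a model bias halfway through
    z = bias + random.gauss(0.0, 1.0)  # observation of a zero truth, plus bias
    x, P, nu = kalman_step(x, P, z, q=0.01, r=1.0)
    if t >= 200 and abs(nu) > 3.0:     # simple single-epoch bias test
        hits += 1
print(hits > 0)   # True: the bias shows up in the innovation statistics
```

    Right after the bias onset the normalized innovations jump far beyond the threshold before the filter re-converges, which is the detection signal a bias-processing procedure acts on.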

  10. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical viewpoints on the skill of scientific information processing and decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction, and document analysis, were used to build up a theoretical framework. Interviews and surveys of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  11. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Full Text Available Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  12. Parametric Accuracy: Building Information Modeling Process Applied to the Cultural Heritage Preservation

    Science.gov (United States)

    Garagnani, S.; Manferdini, A. M.

    2013-02-01

    Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e., self-aware of what kind of element they are and with whom they can interact), representing in this way the basics of Building Information Modeling (BIM), a coordinated, consistent and always up-to-date workflow improved in order to reach higher quality, reliability and cost reductions all over the design process. Even if BIM was originally intended for new architectures, its ability to store semantically inter-related information can be successfully applied to existing buildings as well, especially if they deserve particular care, such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships; however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology to process point cloud data in a BIM environment with high accuracy, this paper describes some experiences in monumental site documentation generated through a plug-in written for Autodesk Revit and codenamed GreenSpider, after its capability to lay out points in space as if they were nodes of an ideal cobweb.

  13. Construction Process Simulation and Safety Analysis Based on Building Information Model and 4D Technology

    Institute of Scientific and Technical Information of China (English)

    HU Zhenzhong; ZHANG Jianping; DENG Ziyin

    2008-01-01

    Time-dependent structure analysis theory has been proved to be more accurate and reliable compared to commonly used methods during construction. However, so far applications are limited to a partial period and part of the structure because of immeasurable artificial intervention. Based on the building information model (BIM) and four-dimensional (4D) technology, this paper proposes an improved structure analysis method, which can generate structural geometry, resistance models, and loading conditions automatically through a close interlink of the schedule information, architectural model, and material properties. The method was applied to a safety analysis during a continuous and dynamic simulation of the entire construction process. The results show that the organic combination of the BIM, 4D technology, construction simulation, and safety analysis of time-dependent structures is feasible and practical. This research also lays a foundation for further research on building lifecycle management by combining architectural design, structure analysis, and construction management.

  14. Sparsity and Information Processing

    OpenAIRE

    Ikeda, Shiro

    2015-01-01

    Recently, many information processing methods utilizing the sparsity of the information source have been studied. We have reported some results along this line of research. Here we pick up two results from our own work. One is an image reconstruction method for radio interferometry and the other is a motor command computation method for a two-joint arm.

  15. Progress in bionic information processing techniques for an electronic nose based on olfactory models

    Institute of Scientific and Technical Information of China (English)

    LI Guang; FU Jun; ZHANG Jia; ZHENG JunBao

    2009-01-01

    As a novel bionic analytical technique, an electronic nose, inspired by the mechanism of the biological olfactory system and integrated with modern sensing technology, electronic technology and pattern recognition technology, has been widely used in many areas. Moreover, recent basic research findings in biological olfaction combined with computational neuroscience promote its development both in methodology and application. In this review, the basic information processing principles of biological olfaction and artificial olfaction are summarized and compared, and four olfactory models and their applications to electronic noses are presented. Finally, a chaotic olfactory neural network is detailed, and the utilization of several biologically oriented learning rules and its spatiotemporal dynamic properties for electronic noses are discussed. Integrating the various phenomena of biological olfaction and their mechanisms into an electronic nose context for information processing will not only make electronic noses more bionic, but also let them perform better than conventional methods. However, many problems still remain, which should be solved by further cooperation between theorists and engineers.

  16. CONSERVATION PROCESS MODEL (CPM): A TWOFOLD SCIENTIFIC RESEARCH SCOPE IN THE INFORMATION MODELLING FOR CULTURAL HERITAGE

    National Research Council Canada - National Science Library

    D. Fiorani; M. Acierno

    2017-01-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both BIM environment and ontology formalisation...

  17. Towards Predictive Modeling of Information Processing in Microbial Ecosystems With Quorum-Sensing Interactions

    Science.gov (United States)

    Yusufaly, Tahir; Boedicker, James

    Bacteria communicate using external chemical signals in a process known as quorum sensing. However, the efficiency of this communication is reduced by both limitations on the rate of diffusion over long distances and potential interference from neighboring strains. Therefore, having a framework to quantitatively predict how spatial structure and biodiversity shape information processing in bacterial colonies is important, both for understanding the evolutionary dynamics of natural microbial ecosystems, and for the rational design of synthetic ecosystems with desired computational properties. As a first step towards these goals, we implement a reaction-diffusion model to study the dynamics of a LuxI/LuxR quorum sensing circuit in a growing bacterial population. The spatiotemporal concentration profile of acyl-homoserine lactone (AHL) signaling molecules is analyzed, and used to define a measure of physical and functional signaling network connectivity. From this, we systematically investigate how different initial distributions of bacterial populations influence the subsequent efficiency of collective long-range signal propagation in the population. We compare our results with known experimental data, and discuss limitations and extensions to our modeling framework.
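    The reaction-diffusion setting can be sketched in one dimension (the parameter values and geometry below are illustrative, not those of the paper's LuxI/LuxR model): AHL is produced where cells sit, diffuses, and decays everywhere.

```python
import numpy as np

# Minimal explicit finite-difference scheme: dA/dt = D * d2A/dx2 + k_p * cells - k_d * A
n, dx, dt = 200, 1.0, 0.1
D, production, decay = 1.0, 1.0, 0.01
ahl = np.zeros(n)                      # AHL concentration on a periodic 1-D grid
cells = np.zeros(n)
cells[90:110] = 1.0                    # a central colony of producer cells

for _ in range(5000):
    lap = (np.roll(ahl, 1) - 2 * ahl + np.roll(ahl, -1)) / dx**2
    ahl += dt * (D * lap + production * cells - decay * ahl)

print(ahl[100] > ahl[50] > ahl[0])     # True: signal decays away from the colony
```

    The balance of diffusion and decay sets a signalling length scale of roughly sqrt(D / decay) grid units, which is the kind of quantity a connectivity measure over the concentration profile would capture.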

  18. Information Model for Product Modeling

    Institute of Scientific and Technical Information of China (English)

    焦国方; 刘慎权

    1992-01-01

    The key problems in product modeling for integrated CAD/CAM systems are the information structures and representations of products, which are taking on more and more important roles in engineering applications. Based on an investigation of engineering product information, and from the viewpoint of the industrial process, this paper proposes information models and defines the framework of product information. The integration and consistency of product information are then discussed by introducing the entity and its instances. In summary, the information structures described in this paper have many advantages and characteristics helpful in engineering design.

  19. Models of neural networks temporal aspects of coding and information processing in biological systems

    CERN Document Server

    Hemmen, J; Schulten, Klaus

    1994-01-01

    Since the appearance of Vol. 1 of Models of Neural Networks in 1991, the theory of neural nets has focused on two paradigms: information coding through coherent firing of the neurons and functional feedback. Information coding through coherent neuronal firing exploits time as a cardinal degree of freedom. This capacity of a neural network rests on the fact that the neuronal action potential is a short, say 1 ms, spike, localized in space and time. Spatial as well as temporal correlations of activity may represent different states of a network. In particular, temporal correlations of activity may express that neurons process the same "object" of, for example, a visual scene by spiking at the very same time. The traditional description of a neural network through a firing rate, the famous S-shaped curve, presupposes a wide time window of, say, at least 100 ms. It thus fails to exploit the capacity to "bind" sets of coherently firing neurons for the purpose of both scene segmentation and figure-ground segregation...

  20. Modeling Visual Information Processing in Brain: A Computer Vision Point of View and Approach

    CERN Document Server

    Diamant, Emanuel

    2007-01-01

    We live in the Information Age, and information has become a critically important component of our life. The success of the Internet made huge amounts of it easily available and accessible to everyone. To keep the flow of this information manageable, means for its faultless circulation and effective handling have become urgently required. Considerable research efforts are dedicated today to address this necessity, but they are seriously hampered by the lack of a common agreement about "What is information?" In particular, what is "visual information" - a human's primary input from the surrounding world? The problem is further aggravated by a long-lasting stance borrowed from biological vision research that assumes human-like information processing to be an enigmatic mix of perceptual and cognitive vision faculties. I am trying to find a remedy for this bizarre situation. Relying on a new definition of "information", which can be derived from Kolmogorov's complexity theory and Chaitin's notion of algorithmic information...

  1. Process Principle of Information

    Institute of Scientific and Technical Information of China (English)

    张高锋; 任君

    2006-01-01

    I. Introduction. Information structure is the organization model of Given and New information in the course of information transmission. A discourse contains a variety of information, and not all the information listed in the discourse is necessary and useful to us. When we decode a discourse, we usually do not need to read every word in the discourse or text; instead, we skim or scan it to find what we think is important or useful as quickly as possible. II. Process Principles of Information...

  2. The Practice of Information Processing Model in the Teaching of Cognitive Strategies

    Science.gov (United States)

    Ozel, Ali

    2009-01-01

    This research attempts to find out how the teaching of learning strategies differs depending on the time that first-grade primary school teachers spend forming an information-processing framework in students. This process, which includes the efforts of 260 teachers in this direction, consists of whether the adequate…

  3. Information services and information processing

    Science.gov (United States)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  4. The role of information in a lifetime process - a model of weight maintenance by women over long time periods

    Directory of Open Access Journals (Sweden)

    Judit Bar-Ilan

    2006-01-01

    Full Text Available Introduction. This paper proposes a model of the information behaviour of women during their life-long struggle to maintain normal weight. Method. The model is integrative and contextual, built on existing models in information science and several other disciplines, on the life stories of about fifty Israeli women aged 25-55, and on interviews with professionals. Analysis. The life stories of the participating women were analyzed qualitatively; major themes and phases were identified. Results. Weight loss and/or maintenance behaviour is a lifetime process in which distinctive stages were identified. In most cases the weight gain - weight loss - maintenance cycle is a recurring one. Information is a major resource during the process, and several roles of information were defined: enabling, motivating, reinforcing, providing background information related to weight problems, and creating the internal cognitive schema related to food and weight. Information behaviour and the roles of information vary with the different stages. Information needs are also influenced by the specific stage of the process. Information gathered in previous cycles is reused, and information gained through previous experience affects behaviour in the current cycle. Conclusion. The model has both theoretical and practical implications.

  5. Development of Energy Models for Production Systems and Processes to Inform Environmentally Benign Decision-Making

    Science.gov (United States)

    Diaz-Elsayed, Nancy

    Between 2008 and 2035 global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy-intensive" sectors still consumed 4×10⁹ GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but having a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the
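    The equipment-level energy characterization can be sketched with the linear power model commonly used for machine tools (the model form and coefficients here are illustrative assumptions, not the dissertation's fitted values):

```python
def machining_energy(power_idle, k, mrr_profile, dt):
    """Energy (J) for a machining job under the common linear model
    P(t) = P_idle + k * MRR(t).

    power_idle:  baseline machine-tool power draw (W)
    k:           specific cutting energy (J per mm^3 removed)
    mrr_profile: material removal rate samples (mm^3/s)
    dt:          sample spacing (s)
    """
    return sum((power_idle + k * mrr) * dt for mrr in mrr_profile)

# Roughing at a high removal rate, then finishing at a low one:
profile = [50.0] * 60 + [5.0] * 120          # mm^3/s, one sample per second
energy = machining_energy(power_idle=1200.0, k=2.5, mrr_profile=profile, dt=1.0)
print(energy)   # 1200*180 + 2.5*(50*60 + 5*120) = 225000.0 J
```

    Summing power over a variable material-removal-rate profile is what lets such a model follow a real machining job rather than assuming a constant load.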

  6. Strategy, information processing and scorecard models in the UK financial services sector

    Directory of Open Access Journals (Sweden)

    Judith Broady-Preston; Tim Hayward

    2001-01-01

    Full Text Available In the current turbulent business environment, quality information is required to ensure that companies achieve competitive advantage by using such information to make decisions more rapidly than their rivals. Scorecard models are employed increasingly to translate the mission and strategy of a company into a comprehensive set of performance measures, providing the framework for a strategic management system. Presented here are the results of a British Library Research and Innovation Centre funded project which examined the role of information in the strategic management of UK retail banks. All the banks surveyed used scorecard models of some description, principally to ease blockages in information flow. Information gathering and analysis activities were perceived as necessary elements of the work of all managers, especially when viewed in conjunction with the trend towards bottom-up strategy formulation in companies.

  7. Information Processing - Administrative Data Processing

    Science.gov (United States)

    Bubenko, Janis

    A three semester, 60-credit course package in the topic of Administrative Data Processing (ADP), offered in 1966 at Stockholm University (SU) and the Royal Institute of Technology (KTH), is described. The package had an information systems engineering orientation. The first semester focused on datalogical topics, while the second semester focused on infological topics. The third semester aimed to deepen the students’ knowledge of different parts of ADP and at writing a bachelor thesis. The concluding section of this paper discusses various aspects of the department’s first course effort. The course package led to a concretisation of our discipline and gave our discipline an identity. Our education seemed modern, “just in time”, and well adapted to practical needs. The course package formed the first concrete activity of a group of young teachers and researchers. In a forty-year perspective, these people have further developed the department and the topic to an internationally well-reputed body of knowledge and research. The department has produced more than thirty professors and more than one hundred doctoral degrees.

  8. [Inappropriateness in ionizing imaging. The central node of the informed consent: from "event" model to "process" model].

    Science.gov (United States)

    Dodaro, Antonio; Recchia, Virginia

    2011-11-01

    The phenomenon of inappropriateness in ionizing imaging and medical interventions is large-scale and increasing. This tendency causes notable damage to health and to patient autonomy, and also produces a huge increase in health expenditures, waiting lists, organizational conflicts, judicial disputes, and insurance compensation. The current practice of obtaining a passive signature on unreadable informed-consent templates in Italian hospitals constitutes, as a matter of fact, a central node of the inappropriateness problem. This way of managing informed consent - the "event" model - undermines the patient's right to decide freely and deliberately, since the patient remains unaware of the biological consequences of clinical-therapeutic interventions for his own health and that of his progeny. The physician himself may perform arbitrary clinical acts, with heavy deontological and legal consequences. Hence, informed consent in ionizing imaging requires a distinct "process" management, able to steer a series of other clinical and organisational processes towards a full realisation of the therapeutic alliance between physician and patient. This review aims to highlight, from a juridical and communicative standpoint, a range of tools applicable to contrasting the hospital overuse of ionizing radiation, defending both patients' health and patients' dignity as persons and citizens of a rule-of-law state.

  9. Introduction to information processing

    CERN Document Server

    Dietel, Harvey M

    2014-01-01

    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment. Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers of the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters…

  10. Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE

    Science.gov (United States)

    Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.

    2016-08-01

    Introduction - Business processes are gradually becoming a tool that allows employees to work at a new level and makes document management systems more efficient; most of the work, and the largest number of publications, concern these directions. However, business processes are still poorly implemented in public institutions, where it is very difficult to formalize the main existing processes. We attempt to build a system of business processes for a state agency, the Russian Social Insurance Fund (SIF), where virtually all processes, given different inputs, have the same output: a public service. The parameters of state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the SIF, the formulation of requirements for business processes, the rationale for the choice of software for modeling business processes, and the creation of models in the Runa WFE system, including the optimization of one of the main business processes of the SIF. The result of the work in Runa WFE is an optimized model of this business process.

  11. Modeling and optimization of transmission and processing of data in an information computer network

    Science.gov (United States)

    Nekrasova, A.; Boriev, Z.; Nyrkov, A.; Sokolov, S.

    2016-04-01

    The paper presents a comparative analysis of routing algorithms that allows optimizing the process of transmission and processing of data in information computer networks. Special attention is paid to multipath methods of data transmission, together with the number of operations necessary for their performance. In addition, the authors consider a linear programming method for solving the above-mentioned problem.
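The single-path baseline in such routing comparisons is typically a shortest-path search. A minimal sketch using Dijkstra's algorithm (the toy network and costs below are hypothetical, not taken from the paper):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths over a weighted digraph given as
    {node: [(neighbor, cost), ...]}. Returns {reachable node: distance}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy network with two candidate paths from A to D (A-B-D costs 6, A-C-D costs 5)
net = {"A": [("B", 1), ("C", 4)], "B": [("D", 5)], "C": [("D", 1)]}
print(dijkstra(net, "A"))  # {'A': 0, 'B': 1, 'C': 4, 'D': 5}
```

Multipath methods extend this baseline by retaining several near-optimal paths (here, A-C-D and A-B-D) and splitting traffic among them, which is where the operation-count comparison discussed in the abstract becomes relevant.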

  12. The Process of Learning from Information.

    Science.gov (United States)

    Kuhlthau, Carol Collier

    1995-01-01

    Presents the process of learning from information as the key concept for the library media center in the information age school. The Information Search Process Approach is described as a model for developing information skills fundamental to information literacy, and process learning is discussed. (Author/LRW)

  13. Mindfulness training alters emotional memory recall compared to active controls: support for an emotional information processing model of mindfulness

    OpenAIRE

    Doug Roberts-Wolfe; Matthew Sacchet; Elizabeth Hastings; Harold Roth; Willoughby Britton

    2012-01-01

    Objectives: While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e. ...

  14. Mindfulness Training Alters Emotional Memory Recall Compared to Active Controls: Support for an Emotional Information Processing Model of Mindfulness

    OpenAIRE

    Roberts-Wolfe, Douglas; Sacchet, Matthew D.; Hastings, Elizabeth; Roth, Harold; Britton, Willoughby

    2012-01-01

    Objectives: While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e., m...

  15. The use of process models to inform and improve statistical models of nitrate occurrence, Great Miami River Basin, southwestern Ohio

    Science.gov (United States)

    Walter, Donald A.; Starn, J. Jeffrey

    2013-01-01

    Statistical models of nitrate occurrence in the glacial aquifer system of the northern United States, developed by the U.S. Geological Survey, use observed relations between nitrate concentrations and sets of explanatory variables—representing well-construction, environmental, and source characteristics—to predict the probability that nitrate, as nitrogen, will exceed a threshold concentration. However, the models do not explicitly account for the processes that control the transport of nitrogen from surface sources to a pumped well and use area-weighted mean spatial variables computed from within a circular buffer around the well as a simplified source-area conceptualization. The use of models that explicitly represent physical-transport processes can inform and, potentially, improve these statistical models. Specifically, groundwater-flow models simulate advective transport—predominant in many surficial aquifers—and can contribute to the refinement of the statistical models by (1) providing for improved, physically based representations of a source area to a well, and (2) allowing for more detailed estimates of environmental variables. A source area to a well, known as a contributing recharge area, represents the area at the water table that contributes recharge to a pumped well; a well pumped at a volumetric rate equal to the amount of recharge through a circular buffer will result in a contributing recharge area that is the same size as the buffer but has a shape that is a function of the hydrologic setting. These volume-equivalent contributing recharge areas will approximate circular buffers in areas of relatively flat hydraulic gradients, such as near groundwater divides, but in areas with steep hydraulic gradients will be elongated in the upgradient direction and agree less with the corresponding circular buffers. The degree to which process-model-estimated contributing recharge areas, which simulate advective transport and therefore account for

  16. Simulation Models for Analyzing the Dynamic Costs of Process-aware Information Systems

    NARCIS (Netherlands)

    Mutschler, B.B.; Reichert, M.U.

    2007-01-01

    Introducing process-aware information systems (PAIS) in enterprises (e.g., workflow management systems, case handling systems) is associated with high costs. Though cost estimation has received considerable attention in software engineering for many years, it is difficult to apply existing

  17. Cultural and Linguistic Considerations in Psychodiagnosis with Hispanics: The Need for an Empirically Informed Process Model.

    Science.gov (United States)

    Malgady, Robert G.; Zayas, Luis H.

    2001-01-01

    Nearly half the population who seeks mental health services in the century ahead will be ethnic minorities. Evidence suggests that Hispanics, who are the fastest growing minority group, present higher levels of psychiatric symptomatology and prevalence rates of disorders. A discussion is included on process variables that can inform social…

  18. A Multidirectional Model for Assessing Learning Disabled Students' Intelligence: An Information-Processing Framework.

    Science.gov (United States)

    Swanson, H. Lee

    1988-01-01

    The information processing approach to the assessment of learning disabled students' intellectual performance includes decisions about: (1) relationship between hypothesis testing and overall performance, (2) the knowledge base for strategy development, (3) coordination of search strategies, (4) metacognition, and (5) problem solving strategies.…

  19. Bridging the Operational Divide: An Information-Processing Model of Internal Supply Chain Integration

    Science.gov (United States)

    Rosado Feger, Ana L.

    2009-01-01

    Supply Chain Management, the coordination of upstream and downstream flows of product, services, finances, and information from a source to a customer, has risen in prominence over the past fifteen years. The delivery of a product to the consumer is a complex process requiring action from several independent entities. An individual firm consists…

  20. A method for tailoring the information content of a software process model

    Science.gov (United States)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is given in the following steps: (1) during the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) select the information products which match or support the accepted processes and products of step 4; and (6) select the design methodology which produces the information products selected in step 5.
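Steps 2 through 5 of the method amount to a chain of lookups from quality needs to criteria to processes to information products. A minimal sketch; all table contents below are hypothetical illustrations, not the method's actual profiles:

```python
# Hypothetical lookup tables standing in for steps 2-5 of the tailoring method
quality_needs_to_criteria = {
    "high reliability": ["fault tolerance", "testability"],
    "ease of change": ["modularity"],
}
criteria_to_processes = {
    "fault tolerance": ["formal inspection"],
    "testability": ["unit testing"],
    "modularity": ["design review"],
}
process_to_products = {
    "formal inspection": ["inspection report"],
    "unit testing": ["test plan", "test report"],
    "design review": ["design document"],
}

def tailor(quality_needs):
    """Map a quality-needs profile to the information products that support it,
    preserving discovery order and dropping duplicates."""
    products = []
    for need in quality_needs:
        for criterion in quality_needs_to_criteria.get(need, []):
            for process in criteria_to_processes.get(criterion, []):
                for product in process_to_products.get(process, []):
                    if product not in products:
                        products.append(product)
    return products

print(tailor(["high reliability"]))  # ['inspection report', 'test plan', 'test report']
```

The chain makes the "necessary and sufficient subset" idea concrete: only products reachable from the stated quality needs survive the tailoring.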

  1. Bridging the Operational Divide: An Information-Processing Model of Internal Supply Chain Integration

    Science.gov (United States)

    Rosado Feger, Ana L.

    2009-01-01

    Supply Chain Management, the coordination of upstream and downstream flows of product, services, finances, and information from a source to a customer, has risen in prominence over the past fifteen years. The delivery of a product to the consumer is a complex process requiring action from several independent entities. An individual firm consists…

  2. A business process modeling experience in a complex information system re-engineering.

    Science.gov (United States)

    Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis

    2013-01-01

    This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.

  3. Applications of Social Science to Management Information Systems and Evaluation Process: A Peace Corps Model.

    Science.gov (United States)

    Lassey, William R.; And Others

    This study discusses some of the central concepts, assumptions and methods used in the development and design of a Management Information and Evaluation System for the Peace Corps in Colombia. Methodological problems encountered are reviewed. The model requires explicit project or program objectives, individual staff behavioral objectives, client…

  4. A Novel Information-Theoretic Approach for Variable Clustering and Predictive Modeling Using Dirichlet Process Mixtures

    OpenAIRE

    Yun Chen; Hui Yang

    2016-01-01

    In the era of big data, there are increasing interests on clustering variables for the minimization of data redundancy and the maximization of variable relevancy. Existing clustering methods, however, depend on nontrivial assumptions about the data structure. Note that nonlinear interdependence among variables poses significant challenges on the traditional framework of predictive modeling. In the present work, we reformulate the problem of variable clustering from an information theoretic pe...

  5. Testing for a Common Volatility Process and Information Spillovers in Bivariate Financial Time Series Models

    NARCIS (Netherlands)

    J. Chen (Jinghui); M. Kobayashi (Masahito); M.J. McAleer (Michael)

    2016-01-01

    The paper considers the problem as to whether financial returns have a common volatility process in the framework of stochastic volatility models that were suggested by Harvey et al. (1994). We propose a stochastic volatility version of the ARCH test proposed by Engle and Susmel (1993),

  6. Efficiency of cellular information processing

    CERN Document Server

    Barato, Andre C; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the E. coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium i...
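The bound described in the abstract can be written schematically as follows (the notation here is assumed for illustration, not taken from the paper): the learning rate, i.e., the rate at which the internal process y reduces its conditional Shannon entropy about the external process x, cannot exceed the thermodynamic entropy production, which yields an informational efficiency of at most one.

```latex
% Learning rate of the internal process y about the external process x,
% bounded by the thermodynamic entropy production rate \sigma:
l_y \;\equiv\; -\frac{\mathrm{d}}{\mathrm{d}t}\, H(x \mid y) \;\le\; \sigma,
\qquad
\eta \;\equiv\; \frac{l_y}{\sigma} \;\le\; 1 .
```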

  7. Weather Information Processing

    Science.gov (United States)

    1991-01-01

    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  8. Resolving Radiological Waste Classification and Release Issues Using Material Process Information and Simple Measurements and Models

    Energy Technology Data Exchange (ETDEWEB)

    Hochel, R.C. [Westinghouse Savannah River Company, AIKEN, SC (United States)

    1997-11-01


  9. Information processing. [in human performance

    Science.gov (United States)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).

  10. Applying an Information Processing Model to Measure the Effectiveness of a Mailed Circular Advertisement.

    Science.gov (United States)

    1982-01-01

    [OCR fragment] ... if the direction of flow of the process is as stated, Aaker and Day (1974) came to some interesting conclusions about advertising effects. ... Nevertheless, the study did indicate positive effects of a mailed circular ad. Reference: Aaker, D. A., & Day, G. S. A dynamic model of relationships among advertising, consumer awareness, attitudes, and behavior. Journal of Applied Psychology, 1974, 39, 281-286.

  11. Towards a Strategic Process Model of Governance for Agile IT Implementation: A Healthcare Information Technology Study in China

    OpenAIRE

    Say Yen Teoh; Xi Chen

    2013-01-01

    To remain competitive in the present dynamic environment, ‘governance for agility’ has become a key solution. Past literature has paid little attention to understanding how governance for agility is achieved, particularly in regard to the delivery of Information Technology (IT) implementation. Using agile organisation and IT-governance theory as lenses to analyse data from a hospital case study, a strategic process model of governance for agility is empirically derived. This model suggests that agile hea...

  13. Common data model for natural language processing based on two existing standard information models: CDA+GrAF.

    Science.gov (United States)

    Meystre, Stéphane M; Lee, Sanghoon; Jung, Chai Young; Chevrier, Raphaël D

    2012-08-01

    An increasing need for collaboration and resources sharing in the Natural Language Processing (NLP) research and development community motivates efforts to create and share a common data model and a common terminology for all information annotated and extracted from clinical text. We have combined two existing standards: the HL7 Clinical Document Architecture (CDA), and the ISO Graph Annotation Format (GrAF; in development), to develop such a data model entitled "CDA+GrAF". We experimented with several methods to combine these existing standards, and eventually selected a method wrapping separate CDA and GrAF parts in a common standoff annotation (i.e., separate from the annotated text) XML document. Two use cases, clinical document sections, and the 2010 i2b2/VA NLP Challenge (i.e., problems, tests, and treatments, with their assertions and relations), were used to create examples of such standoff annotation documents, and were successfully validated with the XML schemata provided with both standards. We developed a tool to automatically translate annotation documents from the 2010 i2b2/VA NLP Challenge format to GrAF, and automatically generated 50 annotation documents using this tool, all successfully validated. Finally, we adapted the XSL stylesheet provided with HL7 CDA to allow viewing annotation XML documents in a web browser, and plan to adapt existing tools for translating annotation documents between CDA+GrAF and the UIMA and GATE frameworks. This common data model may ease directly comparing NLP tools and applications, combining their output, transforming and "translating" annotations between different NLP applications, and eventually "plug-and-play" of different modules in NLP applications.

  14. Model for Electromagnetic Information Leakage

    OpenAIRE

    Mao Jian; Li Yongmei; Zhang Jiemin; Liu Jinming

    2013-01-01

    Electromagnetic leakage occurs in operating information equipment and can lead to information leakage. In order to discover the nature of the information in electromagnetic leakage, this paper combines electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, interception, and reproduction based on electromagnetic radiation, and ana...

  15. Single-process versus multiple-strategy models of decision making: evidence from an information intrusion paradigm.

    Science.gov (United States)

    Söllner, Anke; Bröder, Arndt; Glöckner, Andreas; Betsch, Tilmann

    2014-02-01

    When decision makers are confronted with different problems and situations, do they use a uniform mechanism as assumed by single-process models (SPMs) or do they choose adaptively from a set of available decision strategies as multiple-strategy models (MSMs) imply? Both frameworks of decision making have gathered a lot of support, but only rarely have they been contrasted with each other. Employing an information intrusion paradigm for multi-attribute decisions from givens, SPM and MSM predictions on information search, decision outcomes, attention, and confidence judgments were derived and tested against each other in two experiments. The results consistently support the SPM view: Participants seemingly using a "take-the-best" (TTB) strategy do not ignore TTB-irrelevant information as MSMs would predict, but adapt the amount of information searched, choose alternative choice options, and show varying confidence judgments contingent on the quality of the "irrelevant" information. The uniformity of these findings underlines the adequacy of the novel information intrusion paradigm and comprehensively promotes the notion of a uniform decision making mechanism as assumed by single-process models. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Model of information diffusion

    CERN Document Server

    Lande, D V

    2008-01-01

    The system of cellular automata, which expresses the process of dissemination and publication of news among separate information resources, is described. The bell-shaped dependence of news diffusion on internet sources (web sites) coheres well with the real behavior of thematic data flows and, on local time spans, with known models, e.g., exponential and logistic ones.
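A logistic curve is one of the models mentioned above; its increments (new publications per time step) rise and then fall, giving the bell-shaped dependence the abstract describes. A minimal sketch with hypothetical parameters:

```python
import math

def logistic(t, capacity=1000.0, rate=0.8, t0=10.0):
    """Cumulative number of sources that have published the news by time t
    (hypothetical capacity, growth rate, and inflection time t0)."""
    return capacity / (1.0 + math.exp(-rate * (t - t0)))

# Discrete increments: new publications per step form a bell-shaped pulse
new_per_step = [logistic(t + 1) - logistic(t) for t in range(20)]
peak = max(range(20), key=lambda t: new_per_step[t])
print(peak)  # peak of new publications falls near the inflection time t0 = 10
```

The cumulative curve is S-shaped (exponential-looking early on, saturating late), while its increments trace the bell-shaped pulse of news diffusion.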

  17. An Investigation on Information Seeking Behaviors of Graduate Students in Birjand and Ferdowsi Universities on the basis of Kuhlthau Model of Information Search Process

    Directory of Open Access Journals (Sweden)

    Mehdi Narmenji

    2010-12-01

    Full Text Available This paper is an attempt to study information seeking behaviors of graduate students in Birjand and Ferdowsi universities based on the Kuhlthau model of the Information Search Process. The population consists of graduate students in Birjand and Ferdowsi universities whose proposals were approved by the graduate study councils of the mentioned universities up to the end of the first semester of the 2008-2009 academic year. Finally, 158 usable questionnaires (66.95% of the sample under study) were collected. This research was carried out in two parts. Data were collected by a questionnaire in the first part and through interviews in the second part. Findings of both parts (questionnaire and interview) reveal that, in the process of preparing their proposals, the graduate students of both universities went through the same stages mentioned in the Kuhlthau model of the Information Search Process, with minor differences, and they indicated feelings and thoughts approximating those in the stages of the Kuhlthau model. The results did not generally show any significant difference between men's and women's feelings and thoughts, nor among different areas of study. Therefore, it can be concluded that the Kuhlthau model is not dependent on a particular gender or study area, and it is applicable to all areas of study and both genders.

  18. Modeling and simulating causal dependencies on process-aware information systems from a cost perspective

    NARCIS (Netherlands)

    Mutschler, B.B.

    2008-01-01

    Providing effective IT support for business processes has become crucial for enterprises to stay competitive in their market. Business processes must be defined, implemented, enacted, monitored, and continuously adapted to changing situations. Process life cycle support and continuous process

  19. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    Science.gov (United States)

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  20. The relationship of three cortical regions to an information-processing model.

    Science.gov (United States)

    Anderson, John R; Qin, Yulin; Stenger, V Andrew; Carter, Cameron S

    2004-05-01

    This research tests a model of the computational role of three cortical regions in tasks like algebra equation solving. The model assumes that there is a left parietal region-of-interest (ROI) where the problem expression is represented and transformed, a left prefrontal ROI where information for solving the task is retrieved, and a motor ROI where hand movements to produce the answer are programmed. A functional magnetic resonance imaging (fMRI) study of an abstract symbol-manipulation task was performed to articulate the roles of these three regions. Participants learned to associate words with instructions for transforming strings of letters. The study manipulated the need to retrieve these instructions, the need to transform the strings, and whether there was a delay between calculation of the answer and the output of the answer. As predicted, the left parietal ROI mainly reflected the need for a transformation and the left prefrontal ROI the need for retrieval. Homologous right ROIs showed similar but weaker responses. Neither the prefrontal nor the parietal ROIs responded to delay, but the motor ROI did respond to delay, implying motor rehearsal over the delay. Except for the motor ROI, these patterns of activity did not vary with response hand. In an ACT-R model, it was shown that the activity of an imaginal buffer predicted the blood oxygen level-dependent (BOLD) response of the parietal ROI, the activity of a retrieval buffer predicted the response of the prefrontal ROI, and the activity of a manual buffer predicted the response of the motor ROI.

  1. Lateral information processing by spiking neurons: a theoretical model of the neural correlate of consciousness.

    Science.gov (United States)

    Ebner, Marc; Hameroff, Stuart

    2011-01-01

    Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on "autopilot"). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the "conscious pilot") suggests that as gap junctions open and close, a gamma-synchronized subnetwork, or zone, moves through the brain as an executive agent, converting nonconscious "auto-pilot" cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study, we implemented sideways "gap junctions" in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions forms a reconfigurable resistive grid or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities. The system demonstrates a moving sub-network zone of synchrony, within which the contents of
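The feedback rule the abstract describes (a neuron's spatially averaged output over its gap-junction-connected zone decides whether junctions stay open) can be illustrated on a 1-D layer. The layout, threshold, and synchronous update rule here are simplifying assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def update_gap_junctions(spikes, open_mask, theta=0.5):
    """One update of a sideways 'gap junction' rule.

    spikes:    array of 0/1 output spikes, one per neuron (1-D layer)
    open_mask: boolean array, True where the junction to the right neighbor is open
    Returns a new open_mask: each neuron's output is averaged over its currently
    connected zone, and a junction is open only if both endpoints' averages
    reach the threshold theta.
    """
    n = len(spikes)
    avg = np.empty(n)
    i = 0
    while i < n:
        j = i
        while j < n - 1 and open_mask[j]:  # extend to the end of the open run
            j += 1
        avg[i : j + 1] = spikes[i : j + 1].mean()  # zone-wide spatial average
        i = j + 1
    return (avg[:-1] >= theta) & (avg[1:] >= theta)

spikes = np.array([1, 1, 1, 0, 0, 1, 1, 1])
open_mask = np.ones(7, dtype=bool)  # start fully connected
for _ in range(3):                  # iterate toward a stable zoning
    open_mask = update_gap_junctions(spikes, open_mask, theta=0.8)
```

After a few iterations the open junctions track the two contiguous "figure" segments of the spike pattern, i.e., the connected sub-networks align with the active regions.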

  2. Application of the JDL data fusion process model to hard/soft information fusion in the condition monitoring of aircraft

    Science.gov (United States)

    Bernardo, Joseph T.

    2014-05-01

    Hard/soft information fusion has been proposed as a way to enhance diagnostic capability for the condition monitoring of machinery. However, there is a limited understanding of where hard/soft information fusion could and should be applied in the condition monitoring of aircraft. Condition-based maintenance refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. The addition of the multisensory capability of human cognition to electronic sensors may create a fuller picture of machinery condition. Since 1988, the Joint Directors of Laboratories (JDL) data fusion process model has served as a framework for information fusion research. Advances are described in the application of hard/soft information fusion in condition monitoring using terms that condition-based maintenance professionals in aviation will recognize. Emerging literature on hard/soft information fusion in condition monitoring is organized into the levels of the JDL data fusion process model. Gaps in the literature are identified, and the author's ongoing research is discussed. Future efforts will focus on building domain-specific frameworks and experimental design, which may provide a foundation for improving flight safety, increasing mission readiness, and reducing the cost of maintenance operations.

  3. The Complex Information Process

    Directory of Open Access Journals (Sweden)

    Edwina Taborsky

    2000-07-01

    Full Text Available Abstract: This paper examines the semiosic development of energy to information within a dyadic reality that operates within the contradictions of both classical and quantum physics. These two realities are examined within the three Peircean modal categories of Firstness, Secondness and Thirdness. The paper concludes that our world cannot operate within either of the two physical realities but instead filiates the two to permit a semiosis or information-generation of complex systems.

  4. The Complex Information Process

    Science.gov (United States)

    Taborsky, Edwina

    2000-09-01

    This paper examines the semiosic development of energy to information within a dyadic reality that operates within the contradictions of both classical and quantum physics. These two realities are examined within the three Peircean modal categories of Firstness, Secondness and Thirdness. The paper concludes that our world cannot operate within either of the two physical realities but instead filiates the two to permit a semiosis or information-generation of complex systems.

  5. How parents process child health and nutrition information: A grounded theory model.

    Science.gov (United States)

    Lovell, Jennifer L

    2016-02-01

    The aim of the present study was to investigate low-income parents' experiences receiving, making meaning of, and applying sociocultural messages about childhood health and nutrition. Semi-structured interviews were conducted with parents from 16 low-income Early Head Start families. Verbatim interview transcripts, observations, field notes, documentary evidence, and follow-up participant checks were used during grounded theory analysis of the data. Data yielded a potential theoretical model of parental movement toward action involving (a) the culture and context influencing parents, (b) parents' sources of social and cultural messages, (c) parental values and engagement, (d) parental motivation for action, (e) intervening conditions impacting motivation and application, and (f) parent action taken on the individual and social levels. Parent characteristics greatly impacted the ways in which parents understood and applied health and nutrition information. Among other implications, it is recommended that educators and providers focus on a parent's beliefs, values, and cultural preferences regarding food and health behaviors as well as his/her personal/family definition of "health" when framing recommendations and developing interventions.

  6. Towards the Significance of Decision Aid in Building Information Modeling (BIM) Software Selection Process

    Directory of Open Access Journals (Sweden)

    Omar Mohd Faizal

    2014-01-01

    Full Text Available Building Information Modeling (BIM) has been considered a solution to numerous problems in the construction industry, such as delays, increased lead times, and increased costs. This is due to the concept and characteristics of BIM, which reshape the way construction project teams work together to increase productivity and improve final project outcomes (cost, time, quality, safety, functionality, maintainability, etc.). As a result, the construction industry has witnessed numerous BIM software packages becoming available on the market, each offering different functions and features. Furthermore, the adoption of BIM requires high investment in software and hardware as well as training expenses. Thus, there is a need for a decision aid for selecting the BIM software that best fulfills project needs. However, research indicates that few studies have attempted to guide decision-making in the BIM software selection problem. This paper therefore highlights the importance of decision making and support for BIM software selection, as it is vital to increasing productivity throughout the building lifecycle.

  7. Lateral Information Processing by Spiking Neurons: A Theoretical Model of the Neural Correlate of Consciousness

    Directory of Open Access Journals (Sweden)

    Marc Ebner

    2011-01-01

    Full Text Available Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on “autopilot”). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the “conscious pilot”) suggests that as gap junctions open and close, a gamma-synchronized subnetwork, or zone, moves through the brain as an executive agent, converting nonconscious “auto-pilot” cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study, we implemented sideways “gap junctions” in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions forms a reconfigurable resistive grid or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities. The system demonstrates a moving sub-network zone of

  8. Model for Electromagnetic Information Leakage

    Directory of Open Access Journals (Sweden)

    Mao Jian

    2013-09-01

    Full Text Available Electromagnetic leakage occurs in working information equipment and can lead to information disclosure. In order to discover the nature of the information contained in electromagnetic leakage, this paper combines electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, interception, and reproduction based on electromagnetic radiation, and analyzes the amount of leaked information with formulas.
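The abstract does not reproduce the paper's formulas, but analyses of this kind typically bound the "amount of leakage information" by treating the radiated emission as a noisy channel. A generic Shannon-capacity sketch of that bound (illustrative numbers, not the paper's model):

```python
import math

def leakage_capacity(bandwidth_hz, signal_power_w, noise_power_w):
    """Upper bound (bits/s) on information recoverable from an emission,
    modeled as an AWGN channel: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1.0 + signal_power_w / noise_power_w)

# Illustrative numbers: 1 MHz of usable emission bandwidth at 0 dB SNR.
c = leakage_capacity(1e6, 1.0, 1.0)  # 1e6 * log2(2) = 1,000,000 bits/s
```

The bound shows why shielding (lower S) and bandwidth limiting matter: both directly reduce the interceptable information rate.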

  9. Building a Privacy Model in the Business Processes of the Enterprise: An Information Systems Design Science Research

    Directory of Open Access Journals (Sweden)

    Munir Majdalawieh

    2013-08-01

    Full Text Available Privacy has not been researched or investigated from a business process management perspective, and the current literature shows the lack of a well-defined methodology for integrating privacy into business processes. This paper proposes an integrated privacy model. Such a model is an integral part of the organization’s enterprise, ensuring that personal data protection is embedded in the business processes of any system involved in collecting, disseminating, and accessing an individual’s data. Passing privacy laws is essential and requires cooperation and partnership between nations based on common privacy principles. The proposed framework is built on these principles and will help organizations develop data protection in their business processes, assess the privacy issues in their organization, protect the interests of their clients, advance their value proposition, and make it easier to identify the impact of privacy on their business. The study follows the design science research process and the information systems design science research (ISDSR) methodologies by identifying relevant problems from the current literature, defining the objectives of the study, designing and developing the ABC-PDMS model, and evaluating the model.

  10. Information Search Process in Science Education.

    Science.gov (United States)

    McNally, Mary Jane; Kuhlthau, Carol C.

    1994-01-01

    Discussion of the development of an information skills curriculum focuses on science education. Topics addressed include information seeking behavior; information skills models; the search process of scientists; science education; a process approach for student activities; and future possibilities. (Contains 15 references.) (LRW)

  11. Industrial Information Processing

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

    This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order...

  12. Optoelectronic Information Processing

    Science.gov (United States)

    2012-03-07

    Fragments recovered from the report's presentation slides:
    • printing for III-V/Si heterogeneous integration
    • Single-layer Si NM photonic crystal Fano membrane reflector (MR) replaces conventional DBR...
    • Monolithic integration on focal plane arrays using standard processes
    • Wavelength & polarization tunable on a pixel-by-pixel basis
    • Collection... photonic crystals, nano-antennas, nano-emitters & modulators [Agency Reviews, National Academies input]
    • Integrated Photonics, Optical Components

  13. Scaling the Information Processing Demands of Occupations

    Science.gov (United States)

    Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin

    2011-01-01

    The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…

  14. Logical Reasoning versus Information Processing in the Dual-Strategy Model of Reasoning

    Science.gov (United States)

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2017-01-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies such as mental model theory and statistical strategies underlying probabilistic models. The dual-strategy model, proposed by Verschueren, Schaeken, & d'Ydewalle (2005a, 2005b), which suggests that people might have access to both…

  15. Mindfulness training alters emotional memory recall compared to active controls: support for an emotional information processing model of mindfulness

    Directory of Open Access Journals (Sweden)

    Doug eRoberts-Wolfe

    2012-02-01

    Full Text Available Objectives: While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e., memory) biases in relation to both clinical symptomatology and well-being in comparison to active control conditions. Methods: Fifty-eight university students (28 female, age = 20.1 ± 2.7 years) participated in either a 12-week course containing a "meditation laboratory" or an active control course with similar content or experiential practice laboratory format (music). Participants completed an emotional word recall task and self-report questionnaires of well-being and clinical symptoms before and after the 12-week course. Results: Meditators showed greater increases in positive word recall compared to controls [F(1, 56) = 6.6, p = .02]. The meditation group increased significantly more on measures of well-being [F(1, 56) = 6.6, p = .01], with a marginal decrease in depression and anxiety [F(1, 56) = 3.0, p = .09] compared to controls. Increased positive word recall was associated with increased psychological well-being [r = 0.31, p = .02] and decreased clinical symptoms [r = -0.29, p = .03]. Conclusion: Mindfulness training was associated with greater improvements in processing efficiency for positively valenced stimuli than active control conditions. This change in emotional information processing was associated with improvements in psychological well-being and less depression and anxiety. These data suggest that mindfulness training may improve well-being via changes in emotional information processing.

  16. Industrial Information Processing

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

    This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order” challenge. An outcome of this capability is that the potential market for customized products will expand, resulting in a reduction in administrative and manufacturing costs. This potential for cost reduction, simultaneous with market expansion, is a source of competitive advantage; hence manufacturers have

  17. Building Information Modeling (BIM) Primer. Report 1: Facility Life-Cycle Process and Technology Innovation

    Science.gov (United States)

    2012-08-01

    Fragments recovered from the report text:
    ...there plans to maintain the delivered BIM
    c. What format are the BIM in (Revit, AutoCAD, Bentley)
    d. What standards are in place for BIM...
    The solution is to integrate design Web format (DWF) capabilities with AutoCAD software to give customers access to spatial information contained...
    ...Structure, Autodesk Revit MEP, Autodesk Navisworks Manage, AutoCAD MEP, Autodesk Civil 3D. Project budget reasonable, able to achieve by using student...

  18. MODELING OF PROCESS OF DECISION-MAKING BY PARTICIPANTS OF INFORMAL VENTURE MARKET

    Directory of Open Access Journals (Sweden)

    Ekaterina I. Bragina

    2014-01-01

    Full Text Available A new approach for the evaluation of innovative projects by business angels, based on the concept of building color-graphic cards with their further processing by neural networks, is proposed in this article. The most important criteria for the evaluation of innovative projects in the deal flow and due diligence stages are selected. Each criterion is supposed to be evaluated by six components, giving a complete picture of the extent of its study. The criteria, as well as their components, are ranked in order of importance for a business angel. A method of processing the color-graphic cards by a neural network is also proposed in this work.

  19. Refactoring Process Models in Large Process Repositories.

    NARCIS (Netherlands)

    Weber, B.; Reichert, M.U.

    2008-01-01

    With the increasing adoption of process-aware information systems (PAIS), large process model repositories have emerged. Over time respective models have to be re-aligned to the real-world business processes through customization or adaptation. This bears the risk that model redundancies are introduced.

  20. Refactoring Process Models in Large Process Repositories.

    NARCIS (Netherlands)

    Weber, B.; Reichert, M.U.

    With the increasing adoption of process-aware information systems (PAIS), large process model repositories have emerged. Over time respective models have to be re-aligned to the real-world business processes through customization or adaptation. This bears the risk that model redundancies are introduced.

  1. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project.

    Science.gov (United States)

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per Gartner's 2012 report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered a part of business strategy. The associated project parameters should be proactively managed, and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. Instead, the focus should be on proactive management and on shifting left in the software life-cycle engineering model: identify problems upfront in the project cycle rather than waiting for lessons to be learned and taking reactive steps. This paper demonstrates the practical applicability of predictive models and illustrates their use in a project to predict system-testing defects, thus helping to reduce residual defects.
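The abstract does not name its predictive model. One classic family used for predicting system-test and residual defects is the Rayleigh curve, in which cumulative defects found by time t follow D(t) = K(1 - exp(-t^2 / 2c^2)); the sketch below uses made-up project numbers and is an illustration of that family, not the paper's method:

```python
import math

def cumulative_defects(t, total_defects, peak_time):
    """Rayleigh model: cumulative defects found by time t, with discovery
    rate peaking at peak_time and total_defects found over the whole project."""
    return total_defects * (1.0 - math.exp(-t * t / (2.0 * peak_time ** 2)))

def remaining_defects(t, total_defects, peak_time):
    """Predicted residual defects still latent at time t."""
    return total_defects - cumulative_defects(t, total_defects, peak_time)

# Illustrative project: 200 total defects expected, discovery peaking in month 4.
found_by_month_6 = cumulative_defects(6, 200, 4)   # roughly 135 of 200
```

Fitting K and c early in the cycle is what enables the "shift-left" prediction the abstract argues for: the residual-defect estimate is available before system testing ends.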

  2. Information processing occurs via critical avalanches in a model of the primary visual cortex

    Science.gov (United States)

    Bortolotto, G. S.; Girardi-Schappo, M.; Gonsalves, J. J.; Pinto, L. T.; Tragtenberg, M. H. R.

    2016-01-01

    We study a new biologically motivated model for the Macaque monkey primary visual cortex which presents power-law avalanches after a visual stimulus. The signal propagates through all the layers of the model via avalanches that depend on network structure and synaptic parameter. We identify four different avalanche profiles as a function of the excitatory postsynaptic potential. The avalanches follow a size-duration scaling relation and present critical exponents that match experiments. The structure of the network gives rise to a regime of two characteristic spatial scales, one of which vanishes in the thermodynamic limit.
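The power-law avalanche statistics described above can be illustrated with the simplest critical model: a branching process at branching ratio sigma = 1. This toy model demonstrates criticality only and is not the authors' layered V1 network:

```python
import random

def avalanche_size(sigma=1.0, max_size=10_000, rng=random):
    """Size of one avalanche in a branching process: each active unit has
    2 potential offspring, each activated with probability sigma/2, so the
    mean branching ratio is sigma (sigma = 1 is the critical point)."""
    active, size = 1, 1
    while active and size < max_size:
        offspring = sum(1 for _ in range(2 * active) if rng.random() < sigma / 2)
        active = offspring
        size += offspring
    return size

rng = random.Random(42)
sizes = [avalanche_size(rng=rng) for _ in range(2000)]
# At criticality the sizes are broadly distributed, approaching P(s) ~ s^(-3/2).
small = sum(1 for s in sizes if s <= 2)
```

Subcritical sigma < 1 yields only small avalanches and supercritical sigma > 1 yields runaway activity; only at sigma = 1 does the heavy-tailed size distribution seen in the experiments emerge.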

  3. BRICS and Quantum Information Processing

    DEFF Research Database (Denmark)

    Schmidt, Erik Meineche

    1998-01-01

    BRICS is a research centre and international PhD school in theoretical computer science, based at the University of Aarhus, Denmark. The centre has recently become engaged in quantum information processing in cooperation with the Department of Physics, also University of Aarhus. This extended abstract surveys activities at BRICS with special emphasis on the activities in quantum information processing.

  4. Models of Innate Neural Attractors and Their Applications for Neural Information Processing.

    Science.gov (United States)

    Solovyeva, Ksenia P; Karandashev, Iakov M; Zhavoronkov, Alex; Dunin-Barkowski, Witali L

    2015-01-01

    In this work we reveal and explore a new class of attractor neural networks based on inborn connections provided by model molecular markers: the molecular marker based attractor neural networks (MMBANN). Each set of markers has a metric, which is used to make connections between neurons containing the markers. We have explored conditions for the existence of attractor states, critical relations between their parameters, and the spectrum of single-neuron models which can implement the MMBANN. In addition, we describe functional models (perceptron and SOM) which obtain significant advantages over the traditional implementation of these models while using MMBANN. In particular, a perceptron based on MMBANN gains specificity by orders of magnitude in error probability, an MMBANN SOM obtains real neurophysiological meaning, and the number of possible grandma cells increases 1000-fold with MMBANN. MMBANN have sets of attractor states which can serve as finite grids for the representation of variables in computations. These grids may have dimensions d = 0, 1, 2, …. We work with static and dynamic attractor neural networks of dimensions d = 0 and 1. We also argue that the number of dimensions which can be represented by attractors of activities of neural networks with N = 10^4 elements does not exceed 8.
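A toy version of the marker idea (connections derived from a metric on per-neuron markers, yielding inborn attractor states) might look as follows; the weight values, threshold dynamics, and 1-D marker metric are illustrative assumptions, not the MMBANN equations:

```python
import numpy as np

def marker_weights(markers, radius=2):
    """Connection matrix from 'molecular markers': neurons whose marker
    distance is within radius get an excitatory link, others a weak
    inhibitory one (values chosen for illustration)."""
    m = np.asarray(markers)[:, None]
    d = np.abs(m - m.T)                       # metric on the marker set
    w = np.where(d <= radius, 1.0, -0.5)
    np.fill_diagonal(w, 0.0)                  # no self-connections
    return w

def settle(w, state, steps=10):
    """Synchronous threshold dynamics; returns the state reached (an attractor)."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        s = (w @ s > 0).astype(float)
    return s

markers = [0, 1, 2, 10, 11, 12]   # two marker clusters -> two inborn cell groups
w = marker_weights(markers)
# A cue touching only the first neuron settles onto its whole marker group.
attractor = settle(w, [1, 0, 0, 0, 0, 0])
```

The attractor here is the complete first marker cluster: connections inherited from the marker metric, not learning, determine which states are stable.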

  5. The evolution of concepts of vestibular peripheral information processing: toward the dynamic, adaptive, parallel processing macular model

    Science.gov (United States)

    Ross, Muriel D.

    2003-01-01

    In a letter to Robert Hooke, written on 5 February, 1675, Isaac Newton wrote "If I have seen further than certain other men it is by standing upon the shoulders of giants." In his context, Newton was referring to the work of Galileo and Kepler, who preceded him. However, every field has its own giants, those men and women who went before us and, often with few tools at their disposal, uncovered the facts that enabled later researchers to advance knowledge in a particular area. This review traces the history of the evolution of views from early giants in the field of vestibular research to modern concepts of vestibular organ organization and function. Emphasis will be placed on the mammalian maculae as peripheral processors of linear accelerations acting on the head. This review shows that early, correct findings were sometimes unfortunately disregarded, impeding later investigations into the structure and function of the vestibular organs. The central themes are that the macular organs are highly complex, dynamic, adaptive, distributed parallel processors of information, and that historical references can help us to understand our own place in advancing knowledge about their complicated structure and functions.

  6. The evolution of concepts of vestibular peripheral information processing: toward the dynamic, adaptive, parallel processing macular model.

    Science.gov (United States)

    Ross, Muriel D

    2003-09-01

    In a letter to Robert Hooke, written on 5 February, 1675, Isaac Newton wrote "If I have seen further than certain other men it is by standing upon the shoulders of giants." In his context, Newton was referring to the work of Galileo and Kepler, who preceded him. However, every field has its own giants, those men and women who went before us and, often with few tools at their disposal, uncovered the facts that enabled later researchers to advance knowledge in a particular area. This review traces the history of the evolution of views from early giants in the field of vestibular research to modern concepts of vestibular organ organization and function. Emphasis will be placed on the mammalian maculae as peripheral processors of linear accelerations acting on the head. This review shows that early, correct findings were sometimes unfortunately disregarded, impeding later investigations into the structure and function of the vestibular organs. The central themes are that the macular organs are highly complex, dynamic, adaptive, distributed parallel processors of information, and that historical references can help us to understand our own place in advancing knowledge about their complicated structure and functions.

  7. Processing of recognition information and additional cues: A model-based analysis of choice, confidence, and response time

    Directory of Open Access Journals (Sweden)

    Andreas Glockner

    2011-02-01

    Full Text Available Research on the processing of recognition information has focused on testing the recognition heuristic (RH). On the aggregate, the noncompensatory use of recognition information postulated by the RH was rejected in several studies, while the RH could still account for a considerable proportion of choices. These results can be explained if either (a) a part of the subjects used the RH, or (b) nobody used it but its choice predictions were accidentally in line with the predictions of the strategy actually used. In the current study, which exemplifies a new approach to model testing, we determined individuals' decision strategies based on a maximum-likelihood classification method, taking into account choices, response times, and confidence ratings simultaneously. Unlike most previous studies of the RH, our study tested the RH under conditions in which we provided information about the cue values of unrecognized objects (which we argue is fairly common and thus of some interest). For 77.5% of the subjects, overall behavior was best explained by a compensatory parallel constraint satisfaction (PCS) strategy. The proportion of subjects using an enhanced RH heuristic (RHe) was negligible (up to 7.5%); 15% of the subjects seemed to use a take-the-best strategy (TTB). A more fine-grained analysis of the supplemental behavioral parameters conditional on strategy use supports PCS but calls into question process assumptions for apparent users of RH, RHe, and TTB within our experimental context. Our results are consistent with previous literature highlighting the importance of individual strategy classification as compared to aggregated analyses.
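The maximum-likelihood classification idea (assign each subject to the strategy that best explains their responses under a constant error rate) can be sketched for the choice data alone. The strategy predictions and the error model below are hypothetical stand-ins for the paper's full choice/response-time/confidence likelihood:

```python
import math

def log_likelihood(choices, predictions, epsilon=0.2):
    """Log-likelihood of observed choices if each trial follows the
    strategy's prediction with prob 1 - epsilon and errs with prob epsilon."""
    ll = 0.0
    for c, p in zip(choices, predictions):
        ll += math.log(1 - epsilon) if c == p else math.log(epsilon)
    return ll

def classify(choices, strategy_predictions):
    """Return the strategy name with the highest likelihood for this subject."""
    return max(strategy_predictions,
               key=lambda name: log_likelihood(choices, strategy_predictions[name]))

# Hypothetical predictions of three strategies over 8 trials (1 = choose A).
strategies = {
    "RH":  [1, 1, 1, 1, 0, 0, 0, 0],
    "TTB": [1, 1, 0, 0, 1, 1, 0, 0],
    "PCS": [1, 0, 1, 0, 1, 0, 1, 0],
}
subject = [1, 1, 0, 0, 1, 1, 0, 1]   # matches TTB on 7 of 8 trials
best = classify(subject, strategies)
```

Classifying per subject rather than on the aggregate is exactly what lets the analysis separate case (a) from case (b) above: each individual's data picks out one strategy.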

  8. Information processing, computation, and cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  9. Markovian Processes for Quantitative Information Leakage

    DEFF Research Database (Denmark)

    Biondi, Fabrizio

    and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some...

  10. Mindfulness training alters emotional memory recall compared to active controls: support for an emotional information processing model of mindfulness.

    Science.gov (United States)

    Roberts-Wolfe, Douglas; Sacchet, Matthew D; Hastings, Elizabeth; Roth, Harold; Britton, Willoughby

    2012-01-01

    While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e., memory) biases in relation to both clinical symptomatology and well-being in comparison to active control conditions. Fifty-eight university students (28 female, age = 20.1 ± 2.7 years) participated in either a 12-week course containing a "meditation laboratory" or an active control course with similar content or experiential practice laboratory format (music). Participants completed an emotional word recall task and self-report questionnaires of well-being and clinical symptoms before and after the 12-week course. Meditators showed greater increases in positive word recall compared to controls [F(1, 56) = 6.6, p = 0.02]. The meditation group increased significantly more on measures of well-being [F(1, 56) = 6.6, p = 0.01], with a marginal decrease in depression and anxiety [F(1, 56) = 3.0, p = 0.09] compared to controls. Increased positive word recall was associated with increased psychological well-being (r = 0.31, p = 0.02) and decreased clinical symptoms (r = -0.29, p = 0.03). Mindfulness training was associated with greater improvements in processing efficiency for positively valenced stimuli than active control conditions. This change in emotional information processing was associated with improvements in psychological well-being and less depression and anxiety. These data suggest that mindfulness training may improve well-being via changes in emotional information processing. Future research with a fully randomized design will be

  11. Response to an abnormal ovarian cancer-screening test result: test of the social cognitive processing and cognitive social health information processing models.

    Science.gov (United States)

    Andrykowski, Michael A; Pavlik, Edward J

    2011-04-01

    All cancer screening tests produce a proportion of abnormal results requiring follow up. Consequently, the cancer-screening setting is a natural laboratory for examining psychological and behavioural response to a threatening health-related event. This study tested hypotheses derived from the social cognitive processing and cognitive-social health information processing models in trying to understand response to an abnormal ovarian cancer (OC) screening test result. Women (n = 278) receiving an abnormal screening test result a mean of 7 weeks earlier were assessed prior to a repeat screening test intended to clarify their previous abnormal result. Measures of disposition (optimism, informational coping style), social environment (social support and constraint), emotional processing, distress, and benefit finding were obtained. Regression analyses indicated greater distress was associated with greater social constraint and emotional processing and a monitoring coping style in women with a family history of OC. Distress was unrelated to social support. Greater benefit finding was associated with both greater social constraint and support and greater distress. The primacy of social constraint in accounting for both benefit finding and distress was noteworthy and warrants further research on the role of social constraint in adaptation to stressful events.

  12. Cognition: Human Information Processing. Introduction.

    Science.gov (United States)

    Griffith, Belver C.

    1981-01-01

    Summarizes the key research issues and developments in cognitive science, especially with respect to the similarities, differences, and interrelationships between human and machine information processing. Nine references are listed. (JL)

  13. Organization as Information Processing Systems. Toward a Model of the Research Factors Associated with Significant Research Outcomes.

    Science.gov (United States)

    1986-04-01

...during research projects that were related to research outcomes. The Ambidextrous model, which includes both organic and mechanistic research... to make choices with greater likelihood for innovative outcomes. A potential side benefit from better knowledge of the research process may... aspect of the research process. The models are referred to respectively as the Davis model, the Antecedents model, and the Ambidextrous model. These

  14. Research and development of models and instruments to define, measure, and improve shared information processing with government oversight agencies. An analysis of the literature, August 1990--January 1992

    Energy Technology Data Exchange (ETDEWEB)

    1992-12-31

This document identifies elements of sharing, plus key variables of each and their interrelationships. The document's model of sharing is intended to help management systems' users understand what sharing is and how to integrate it with information processing.

  15. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  16. Internet Network Resource Information Model

    Institute of Scientific and Technical Information of China (English)

    陈传峰; 李增智; 唐亚哲; 刘康平

    2002-01-01

The foundation of any network management system is a database that contains information about the network resources relevant to the management tasks. A network information model is an abstraction of network resources, including both managed resources and managing resources. In the SNMP-based management framework, management information is defined almost exclusively from a "device" viewpoint, namely, managing a network is equivalent to managing a collection of individual nodes. Aiming at making use of recent advances in distributed computing and in object-oriented analysis and design, the Internet management architecture can also be based on the Open Distributed Processing Reference Model (RM-ODP). The purpose of this article is to provide an Internet network resource information model. First, a layered management information architecture will be discussed. Then the Internet network resource information model is presented. The information model is specified using Object-Z.

  17. Combining livestock production information in a process-based vegetation model to reconstruct the history of grassland management

    Science.gov (United States)

    Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui; Peng, Shushi; Yue, Chao; Piao, Shilong; Wang, Tao; Hauglustaine, Didier A.; Soussana, Jean-Francois; Peregon, Anna; Kosykh, Natalya; Mironycheva-Tokareva, Nina

    2016-06-01

Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5° by 0.5°. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 × 10^6 km^2 in 1901 to 12.3 × 10^6 km^2 in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated augmentation in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at global scale but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus specific attention was given to the evaluation of modelled productivity against a series of observations from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS-GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and interannual variability of grassland productivity at global scale well and thus is
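The bookkeeping at the heart of the reconstruction (livestock grass-biomass demand divided by modelled supply per unit area gives the minimum managed-grassland fraction of a grid cell) can be sketched as follows; the function name, the numbers, and the assumption of uniform supply within a cell are illustrative, not taken from ORCHIDEE-GM.

```python
def managed_fraction(demand_t, supply_t_per_km2, grassland_area_km2):
    """Minimum fraction of a grid cell's grassland that must be managed
    so that forage supply on that area meets livestock demand
    (demand in tonnes dry matter per year, supply per km2 per year)."""
    area_needed_km2 = demand_t / supply_t_per_km2
    # Managed land can at most occupy the whole grassland area of the cell
    return min(area_needed_km2 / grassland_area_km2, 1.0)

# Illustrative cell: 2000 km2 of grassland yielding 150 t DM km-2 yr-1,
# with livestock demanding 90 000 t DM yr-1
frac = managed_fraction(90_000, 150.0, 2000.0)
print(frac)  # 0.3
```

Repeating this per grid cell and per year, with mown and grazed demand tracked separately, yields maps of the kind the abstract describes.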

  18. Attaining Performance with Building Information Modelling: A systematic literature review of product and process modelling in AEC

    NARCIS (Netherlands)

    Papadonikolaki, E.; Koutamanis, A.; Wamelink, J.W.F.

    2013-01-01

The paper presents the findings of a systematic literature review of approximately 200 scientific sources. It is designed with the aim of identifying the current benefits and factors of high performance in Architecture, Engineering and Construction (AEC) since the introduction of Building Information

  19. Modeling gross primary production of agro-forestry ecosystems by assimilation of satellite-derived information in a process-based model.

    Science.gov (United States)

    Migliavacca, Mirco; Meroni, Michele; Busetto, Lorenzo; Colombo, Roberto; Zenone, Terenzio; Matteucci, Giorgio; Manca, Giovanni; Seufert, Guenther

    2009-01-01

    In this paper we present results obtained in the framework of a regional-scale analysis of the carbon budget of poplar plantations in Northern Italy. We explored the ability of the process-based model BIOME-BGC to estimate the gross primary production (GPP) using an inverse modeling approach exploiting eddy covariance and satellite data. We firstly present a version of BIOME-BGC coupled with the radiative transfer models PROSPECT and SAILH (named PROSAILH-BGC) with the aims of i) improving the BIOME-BGC description of the radiative transfer regime within the canopy and ii) allowing the assimilation of remotely-sensed vegetation index time series, such as MODIS NDVI, into the model. Secondly, we present a two-step model inversion for optimization of model parameters. In the first step, some key ecophysiological parameters were optimized against data collected by an eddy covariance flux tower. In the second step, important information about phenological dates and about standing biomass were optimized against MODIS NDVI. Results obtained showed that the PROSAILH-BGC allowed simulation of MODIS NDVI with good accuracy and that we described better the canopy radiation regime. The inverse modeling approach was demonstrated to be useful for the optimization of ecophysiological model parameters, phenological dates and parameters related to the standing biomass, allowing good accuracy of daily and annual GPP predictions. In summary, this study showed that assimilation of eddy covariance and remote sensing data in a process model may provide important information for modeling gross primary production at regional scale.
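The two-step inversion can be illustrated with toy stand-in models; everything below (a light-use-efficiency parameter fitted against flux-tower GPP, then a green-up date fitted against NDVI, both by grid search on synthetic data) is a hypothetical simplification, not BIOME-BGC or PROSAILH.

```python
import numpy as np

days = np.arange(365)
par = 20 + 10 * np.sin(2 * np.pi * (days - 80) / 365)  # toy radiation

# Toy stand-ins for the coupled model: GPP depends on an ecophysiological
# parameter (light-use efficiency), NDVI on a phenological one (green-up)
def gpp_model(lue):
    return lue * par

def ndvi_model(greenup_day):
    return 0.2 + 0.6 / (1 + np.exp(-(days - greenup_day) / 10.0))

# Synthetic "observations" with known truth: lue = 0.9, green-up day 110
rng = np.random.default_rng(0)
gpp_obs = gpp_model(0.9) + rng.normal(0, 0.5, days.size)
ndvi_obs = ndvi_model(110) + rng.normal(0, 0.02, days.size)

def invert(grid, model, obs):
    """Pick the parameter on the grid minimising the squared misfit."""
    costs = [np.sum((model(p) - obs) ** 2) for p in grid]
    return grid[int(np.argmin(costs))]

# Step 1: ecophysiology against (here synthetic) eddy-covariance GPP
lue_hat = invert(np.linspace(0.5, 1.5, 101), gpp_model, gpp_obs)
# Step 2: phenology against (here synthetic) satellite NDVI
greenup_hat = invert(np.arange(90, 131), ndvi_model, ndvi_obs)
print(round(float(lue_hat), 2), int(greenup_hat))
```

The grid search stands in for whatever optimiser the study used; the point is the sequencing, with the step-1 parameter fixed before step 2 runs.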

  20. Planning Ability across Ranges of Intellectual Ability: An Examination of the Luria-Das Information-Processing Model.

    Science.gov (United States)

    McCallum, R. Steve; And Others

    1988-01-01

    Based on Luria-Das information processing theory, hypothesized that 26 educable mentally retarded children would score significantly less well on relatively pure measures of planning ability than would 13 younger average ability students after students were matched on cognitive processing ability. Hypothesis was not supported by study. (Author/NB)

  2. Chaotic signal processing: information aspects

    CERN Document Server

    Andreyev, Y V; Efremova, E V; Anagnostopoulos, A N

    2003-01-01

One of the features of chaotic signals that makes them different from other types of signals is their special information properties. In this paper, we investigate the effect of these properties on procedures of chaotic signal processing. Using the examples of cleaning chaotic signals of noise, chaotic synchronization, and separation of chaotic signals, we demonstrate the existence of basic limits imposed by information theory on chaotic signal processing, independent of concrete algorithms. The relations of these limits to the Second Law, Shannon's theorems, and the Landauer principle are discussed.

  3. INFORMATION MODEL OF SOCIAL TRANSFORMATIONS

    Directory of Open Access Journals (Sweden)

    Мария Васильевна Комова

    2013-09-01

Full Text Available Social transformation is considered as a process of qualitative change in society, creating a new level of organization in all areas of life, in social formations and societies of different types of development. The purpose of the study is to create a universal model for studying social transformations based on understanding them as the consequence of information exchange processes in society. After defining the conceptual model of the study, the author uses the following methods: description, analysis, synthesis, and comparison. Information, objectively existing in all elements and systems of the material world, is an integral attribute of social transformation as well. The information model of social transformations is based on defining social transformation as a change in the information that functions in society's information space. The study of social transformations is thus the study of information flows circulating in society, characterized by different spatial, temporal, and structural states. Social transformations are a highly integrated system of social processes and phenomena, whose nature, course, and consequences are affected by factors representing the whole complex of material objects. The integrated information model of social transformations envisages the interaction of the following components: social memory, information space, and the social ideal. To determine the dynamics and intensity of social transformations, the author uses the notions of an "information threshold of social transformations" and "information pressure". Thus, the universal nature of information leads to considering social transformations as a system of information exchange processes. Social transformations can be extended to any episteme actualized by social needs. The establishment of an information threshold makes it possible to simulate the course of social development, to predict the

  4. Information Processing Approaches to Studying Spelling Deficiencies.

    Science.gov (United States)

    Gerber, Michael M.; Hall, Robert J.

    1987-01-01

    The article explores information processing models of spelling performance and argues that an adequate theory of spelling processes must include: (1) qualitative changes in performance as a function of maturation that underlie development of automaticity; (2) transactional development of spelling-related knowledge structures and efficient…

  5. Information processing among high-performance managers

    Directory of Open Access Journals (Sweden)

    S.C. Garcia-Santos

    2010-01-01

Full Text Available The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework draws on three models: Henry Mintzberg's Theory of Managerial Roles, the Theory of Information Processing, and John Exner's Rorschach Response Process Model. The participants were evaluated with the Rorschach method. The results show that these managers are able to collect data, evaluate them, and establish rankings properly. At the same time, they are capable of being objective and accurate in assessing problems. This information processing style permits an interpretation of the surrounding world on the basis of a very personal and characteristic cognitive style.

  6. Fidelity in Archaeal Information Processing

    Directory of Open Access Journals (Sweden)

    Bart de Koning

    2010-01-01

    Full Text Available A key element during the flow of genetic information in living systems is fidelity. The accuracy of DNA replication influences the genome size as well as the rate of genome evolution. The large amount of energy invested in gene expression implies that fidelity plays a major role in fitness. On the other hand, an increase in fidelity generally coincides with a decrease in velocity. Hence, an important determinant of the evolution of life has been the establishment of a delicate balance between fidelity and variability. This paper reviews the current knowledge on quality control in archaeal information processing. While the majority of these processes are homologous in Archaea, Bacteria, and Eukaryotes, examples are provided of nonorthologous factors and processes operating in the archaeal domain. In some instances, evidence for the existence of certain fidelity mechanisms has been provided, but the factors involved still remain to be identified.

  7. Combining Livestock Production Information in a Process-Based Vegetation Model to Reconstruct the History of Grassland Management

    Science.gov (United States)

Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui; et al.

    2016-01-01

Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5° by 0.5°. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 × 10^6 km^2 in 1901 to 12.3 × 10^6 km^2 in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated augmentation in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at global scale but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus specific attention was given to the evaluation of modelled productivity against a series of observations from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS-GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and inter-annual variability of grassland productivity at global

  8. Impacts of Irrigation and Climate Change on Water Security: Using Stakeholder Engagement to Inform a Process-based Crop Model

    Science.gov (United States)

    Leonard, A.; Flores, A. N.; Han, B.; Som Castellano, R.; Steimke, A.

    2016-12-01

    Irrigation is an essential component for agricultural production in arid and semi-arid regions, accounting for a majority of global freshwater withdrawals used for human consumption. Since climate change affects both the spatiotemporal demand and availability of water in irrigated areas, agricultural productivity and water efficiency depend critically on how producers adapt and respond to climate change. It is necessary, therefore, to understand the coevolution and feedbacks between humans and agricultural systems. Integration of social and hydrologic processes can be achieved by active engagement with local stakeholders and applying their expertise to models of coupled human-environment systems. Here, we use a process based crop simulation model (EPIC) informed by stakeholder engagement to determine how both farm management and climate change influence regional agricultural water use and production in the Lower Boise River Basin (LBRB) of southwest Idaho. Specifically, we investigate how a shift from flood to sprinkler fed irrigation would impact a watershed's overall agricultural water use under RCP 4.5 and RCP 8.5 climate scenarios. The LBRB comprises about 3500 km2, of which 20% is dedicated to irrigated crops and another 40% to grass/pasture grazing land. Via interviews of stakeholders in the LBRB, we have determined that approximately 70% of irrigated lands in the region are flood irrigated. We model four common crops produced in the LBRB (alfalfa, corn, winter wheat, and sugarbeets) to investigate both hydrologic and agricultural impacts of irrigation and climatic drivers. Factors influencing farmers' decision to switch from flood to sprinkler irrigation include potential economic benefits, external financial incentives, and providing a buffer against future water shortages. These two irrigation practices are associated with significantly different surface water and energy budgets, and large-scale shifts in practice could substantially impact regional
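The water-budget arithmetic behind a flood-to-sprinkler scenario is simple: the same net crop water requirement is met with different application efficiencies. The efficiency values and seasonal requirement below are typical textbook figures, not LBRB calibration or EPIC output.

```python
def withdrawal_mm(net_requirement_mm, application_efficiency):
    """Gross irrigation withdrawal needed to deliver a net crop water
    requirement at a given field application efficiency."""
    return net_requirement_mm / application_efficiency

need = 600.0  # illustrative net seasonal crop requirement, mm
flood = withdrawal_mm(need, 0.55)      # flood irrigation, ~55% efficient
sprinkler = withdrawal_mm(need, 0.75)  # sprinkler, ~75% efficient
print(round(flood - sprinkler))  # mm of withdrawal saved per season
```

Scaled by irrigated area, differences of this size explain why a basin-wide shift in practice can substantially change surface water and energy budgets.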

  9. Principles of neural information processing

    CERN Document Server

    Seelen, Werner v

    2016-01-01

In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of Neural Information Processing as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely to organize behavior so as to ensure survival, and an understanding of the evolutionary genesis of the brain. The principles and strategies developed include the self-organization of neural systems, flexibility, and the active interpretation of the world by means of construction and prediction as well as embedding into the world, all of which form the framework of the presented description. Since, in brains, partial self-organization, lifelong adaptation, and the use of various methods of processing incoming information are all interconnected, the authors have chosen not only neurobiology and evolution theory as a basis for the elaboration of such a framework, but also syst...

  10. Spatio-temporal rectification of tower-based eddy-covariance flux measurements for consistently informing process-based models

    Science.gov (United States)

    Metzger, S.; Xu, K.; Desai, A. R.; Taylor, J. R.; Kljun, N.; Schneider, D.; Kampe, T. U.; Fox, A. M.

    2013-12-01

Process-based models, such as land surface models (LSMs), allow insight into the spatio-temporal distribution of stocks and the exchange of nutrients, trace gases, etc. among environmental compartments. More recently, LSMs have also become capable of assimilating time series of in-situ reference observations. This enables calibrating the underlying functional relationships to site-specific characteristics, or constraining the model results after each time step in an attempt to minimize drift. The spatial resolution of LSMs is typically on the order of 10^2-10^4 km2, which is suitable for linking regional to continental scales and beyond. However, continuous in-situ observations of relevant stock and exchange variables, such as tower-based eddy-covariance (EC) fluxes, represent spatial scales that are orders of magnitude smaller (10^-6-10^1 km2). During data assimilation, this significant gap in spatial representativeness is typically either neglected or side-stepped using simple tiling approaches. Moreover, at 'coarse' resolutions, a single LSM evaluation per time step implies linearity among the underlying functional relationships as well as among the sub-grid land cover fractions. This, however, is not warranted for land-atmosphere exchange processes over more complex terrain. Hence, it is desirable to explicitly consider spatial variability at LSM sub-grid scales. Here we present a procedure that determines, from a single EC tower, the spatially integrated probability density function (PDF) of the surface-atmosphere exchange for individual land covers. These PDFs allow quantifying the expected value as well as the spatial variability over a target domain, can be assimilated in tiling-capable LSMs, and mitigate linearity assumptions at 'coarse' resolutions. The procedure is based on the extraction and extrapolation of environmental response functions (ERFs), for which a technically oriented companion poster is submitted. In short, the subsequent steps are: (i) Time
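The per-land-cover flux PDFs and their expected values can be sketched with synthetic data; the histogram-based density below and the two illustrative flux distributions are assumptions, not the ERF extraction itself.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic per-footprint flux estimates attributed to two land covers,
# standing in for ERF-extrapolated surface-atmosphere exchange values
flux = {
    "grass":  rng.normal(4.0, 1.0, 5000),   # umol CO2 m-2 s-1
    "forest": rng.normal(8.0, 2.0, 5000),
}

def flux_pdf(samples, bins=40):
    """Histogram-based PDF: bin centres and densities integrating to 1."""
    density, edges = np.histogram(samples, bins=bins, density=True)
    centres = 0.5 * (edges[:-1] + edges[1:])
    return centres, density

expected = {}
for cover, samples in flux.items():
    x, p = flux_pdf(samples)
    dx = x[1] - x[0]
    expected[cover] = np.sum(x * p) * dx  # expected value from the PDF
    print(cover, round(expected[cover], 2))
```

A tiling-capable LSM could then be handed one expected value (or the full PDF) per land-cover tile rather than a single grid-cell mean.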

  11. Building information modelling (BIM)

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2009-02-01

Full Text Available The concept of a Building Information Model (BIM), also known as a Building Product Model (BPM), is nothing new. A short article on BIM can never cover the entire field, because it is a particularly complex field that is recently beginning to receive...

  12. Quantum communication and information processing

    Science.gov (United States)

    Beals, Travis Roland

    Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.

  13. Information Theory: a Multifaceted Model of Information

    Directory of Open Access Journals (Sweden)

    Mark Burgin

    2003-06-01

Full Text Available A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new information approach, which is called the general theory of information. The main achievement of the general theory of information is the explication of a relevant and adequate definition of information. This theory is built as a system of two classes of principles (ontological and sociological) and their consequences. Axiological principles, which explain how to measure and evaluate information and information processes, are presented in the second section of this paper. These principles systematize and unify different approaches, existing as well as possible, to the construction and utilization of information measures. Examples of such measures are given by Shannon's quantity of information, the algorithmic quantity of information, or the volume of information. It is demonstrated that all other known directions of information theory may be treated within the general theory of information as particular cases.
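Shannon's quantity of information, the first of the measures named in the abstract, can be made concrete in a few lines; the function below is the standard textbook definition of entropy in bits, not code from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon's quantity of information (entropy) of a discrete
    distribution, in bits per symbol."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))            # a fair coin: 1.0 bit
print(round(shannon_entropy([0.9, 0.1]), 3))  # a biased coin carries less
```

Algorithmic (Kolmogorov) quantity of information and volume of information are different measures of the same object, which is exactly the unification the general theory of information aims at.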

  14. Optical information storage and processing

    Science.gov (United States)

    Liu, Zhiwen

Optical information storage and optical information processing are the two themes of this thesis. Chapters two and three discuss the issue of storage while the final two chapters investigate the topic of optical computing. In the second chapter, we demonstrate a holographic system which is able to record phenomena at nanosecond speed. Laser-induced shock wave propagation is recorded by angularly multiplexing pulsed holograms. Five frames can be recorded with a frame interval of 12ns and a time resolution of 5.9ns. We also demonstrate a system which can record fast events holographically on a CCD camera. Carrier multiplexing is used to store 3 frames in a single CCD frame with a frame interval of 12ns. This technique can be extended to record femtosecond events. Information storage in subwavelength structures is discussed in the third chapter. A 2D simulation tool using the FDTD algorithm is developed and applied to calculate the far-field scattering from subwavelength trenches. The simulation agrees with the experimental data very well. Width, depth, and angle multiplexing are investigated to encode information in subwavelength features. An eigenfunction approach is adopted to analyze how much information can be stored given the length of the feature. Finally we study the effect of a non-linear buffer layer. We switch gears to holographic correlators in the fourth chapter. We study various properties of the defocused correlator, which can control the shift invariance conveniently. An approximate expression for the shift selectivity is derived. We demonstrate a real-time correlator with 480 templates. The cross talk of the correlators is also analyzed. Finally, in the fifth chapter we apply the optical correlator to fingerprint identification and study the performance of the correlation-based algorithms. The windowed correlation can improve the rotation and distortion tolerance.
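The correlator's core operation, cross-correlation, is conveniently expressed in the Fourier domain; the digital sketch below stands in for what the optical system computes in hardware, with an illustrative image and template rather than real fingerprint data.

```python
import numpy as np

def correlate2d(image, template):
    """Circular cross-correlation via FFT, the operation a holographic
    correlator performs optically; computed digitally here."""
    F = np.fft.fft2(image)
    H = np.conj(np.fft.fft2(template, s=image.shape))
    return np.real(np.fft.ifft2(F * H))

img = np.zeros((64, 64))
img[20:24, 30:34] = 1.0   # a small bright patch in the input scene
tpl = np.zeros((64, 64))
tpl[0:4, 0:4] = 1.0       # the same patch as a template at the origin
corr = correlate2d(img, tpl)
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)  # the correlation peak marks the template's location
```

A bank of 480 templates, as in the thesis, amounts to repeating this product for 480 filters, which the optical system evaluates in parallel.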

  15. GREAT Process Modeller user manual

    OpenAIRE

    Rueda, Urko; España, Sergio; Ruiz, Marcela

    2015-01-01

This report contains instructions to install, uninstall and use GREAT Process Modeller, a tool that supports Communication Analysis, a communication-oriented business process modelling method. GREAT allows creating communicative event diagrams (i.e. business process models), specifying message structures (which describe the messages associated with each communicative event), and automatically generating a class diagram (representing the data model of an information system that would support suc...

  16. Improved GSO Optimized ESN Soft-Sensor Model of Flotation Process Based on Multisource Heterogeneous Information Fusion

    Directory of Open Access Journals (Sweden)

    Jie-sheng Wang

    2014-01-01

    Full Text Available For predicting the key technology indicators (concentrate grade and tailings recovery rate) of flotation process, an echo state network (ESN) based fusion soft-sensor model optimized by the improved glowworm swarm optimization (GSO) algorithm is proposed. Firstly, the color features (saturation and brightness) and texture features (angular second moment, sum entropy, inertia moment, etc.) based on the grey-level co-occurrence matrix (GLCM) are adopted to describe the visual characteristics of the flotation froth image. Then the kernel principal component analysis (KPCA) method is used to reduce the dimensionality of the high-dimensional input vector composed of the flotation froth image characteristics and process data, and extracts the nonlinear principal components in order to reduce the ESN dimension and network complexity. The ESN soft-sensor model of the flotation process is optimized by the GSO algorithm with a congestion factor. Simulation results show that the model has better generalization and prediction accuracy, meeting the online soft-sensor requirements of real-time control in the flotation process.

  17. Improved GSO optimized ESN soft-sensor model of flotation process based on multisource heterogeneous information fusion.

    Science.gov (United States)

    Wang, Jie-sheng; Han, Shuang; Shen, Na-na

    2014-01-01

    For predicting the key technology indicators (concentrate grade and tailings recovery rate) of flotation process, an echo state network (ESN) based fusion soft-sensor model optimized by the improved glowworm swarm optimization (GSO) algorithm is proposed. Firstly, the color features (saturation and brightness) and texture features (angular second moment, sum entropy, inertia moment, etc.) based on the grey-level co-occurrence matrix (GLCM) are adopted to describe the visual characteristics of the flotation froth image. Then the kernel principal component analysis (KPCA) method is used to reduce the dimensionality of the high-dimensional input vector composed of the flotation froth image characteristics and process data, and extracts the nonlinear principal components in order to reduce the ESN dimension and network complexity. The ESN soft-sensor model of the flotation process is optimized by the GSO algorithm with a congestion factor. Simulation results show that the model has better generalization and prediction accuracy, meeting the online soft-sensor requirements of real-time control in the flotation process.
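
The pipeline described in this record (image features → KPCA → ESN, with GSO tuning the reservoir) can be sketched minimally. The sketch below keeps only the ESN-plus-ridge-readout core; the KPCA and GSO stages are omitted, and the reservoir size, spectral radius, and toy smoothing task are invented stand-ins for the flotation data.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9):
    """Random input and recurrent weights, rescaled for the echo-state property."""
    w_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    w = rng.uniform(-0.5, 0.5, (n_res, n_res))
    w *= spectral_radius / max(abs(np.linalg.eigvals(w)))
    return w_in, w

def run_reservoir(w_in, w, inputs):
    """Drive the reservoir with the input sequence and collect its states."""
    states = np.zeros((len(inputs), w.shape[0]))
    x = np.zeros(w.shape[0])
    for t, u in enumerate(inputs):
        x = np.tanh(w_in @ u + w @ x)
        states[t] = x
    return states

# toy soft-sensor task: predict a smoothed version of a noisy scalar input
u = rng.standard_normal((300, 1))
y = np.convolve(u[:, 0], np.ones(5) / 5, mode="same")

w_in, w = make_reservoir(1, 50)
s = run_reservoir(w_in, w, u)

# ridge-regression readout (the stage GSO would tune in the paper)
ridge = 1e-6
w_out = np.linalg.solve(s.T @ s + ridge * np.eye(50), s.T @ y)
pred = s @ w_out
```

In the paper the free parameters of this reservoir would be searched by GSO with a congestion factor, and the input `u` would be the KPCA-reduced froth-image/process vector.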

  18. Executive Information Systems' Multidimensional Models

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Executive Information Systems are designed to improve the quality of the strategic level of management in organizations through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that the organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence Tools. But in order to build analytic reports for Executive Information Systems (EIS in an organization we need to design a multidimensional model based on the business model from the organization. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.
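
A multidimensional (star-schema) model underlying such EIS reports can be illustrated with a tiny roll-up along one dimension at a time. The fact rows, dimension keys, and figures below are invented examples, not data from the paper.

```python
from collections import defaultdict

# star-schema fact rows: (time_key, region_key, measure)
facts = [
    ("2007-Q1", "EU", 120.0),
    ("2007-Q1", "US", 200.0),
    ("2007-Q2", "EU", 150.0),
    ("2007-Q2", "US", 180.0),
]

def roll_up(facts, axis):
    """Aggregate the measure along one dimension (0 = time, 1 = region)."""
    totals = defaultdict(float)
    for row in facts:
        totals[row[axis]] += row[2]
    return dict(totals)

by_time = roll_up(facts, 0)    # totals per quarter
by_region = roll_up(facts, 1)  # totals per region
```

In a real EIS the fact table lives in a database and the roll-ups are produced by OLAP queries, but the dimensional structure is the same.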

  19. Proprioceptive information processing in schizophrenia

    DEFF Research Database (Denmark)

    Arnfred, Sidse M H

    This doctoral thesis focuses on brain activity in response to proprioceptive stimulation in schizophrenia. The works encompass methodological developments substantiated by investigations of healthy volunteers and two clinical studies of schizophrenia spectrum patients. American psychiatrist Sandor Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event related potentials. Event related EEG can be analyzed as conventional time-series averages or as oscillatory averages transformed into the frequency domain. Gamma activity evoked by electricity or by another type of somatosensory stimulus has not been reported before in schizophrenia. Gamma activity is considered to be a manifestation of perceptual integration. A new load stimulus

  20. Real-time optical information processing

    CERN Document Server

    Javidi, Bahram

    1994-01-01

    Real-Time Optical Information Processing covers the most recent developments in optical information processing, pattern recognition, neural computing, and materials for devices in optical computing. Intended for researchers and graduate students in signal and information processing with some elementary background in optics, the book provides both theoretical and practical information on the latest in information processing in all its aspects. Leading researchers in the field describe the significant signal processing algorithms and architectures in optics as well as basic hardware concepts,

  1. Information Retrieval Interaction: an Analysis of Models

    Directory of Open Access Journals (Sweden)

    Farahnaz Sadoughi

    2012-03-01

    Full Text Available Information searching is an interactive process: users have control over the search process and can manage its results. In this process, the user's question matures according to the retrieved results. In addition, on the side of the information retrieval system, there are some processes that cannot be realized except by the user. In practice, this issue is most evident in “interaction” - i.e. the process of the user's connection to the other system elements - and in “relevance judgment”. This paper first looks at the existence of “interaction” in information retrieval. Then the traditional model of information retrieval and its strengths and weaknesses are reviewed. Finally, the current models of interactive information retrieval are elucidated: Belkin's episodic model, Ingwersen's cognitive model, Saracevic's stratified model, and Spink's interactive feedback model.

  2. Information model of economy

    Directory of Open Access Journals (Sweden)

    N.S.Gonchar

    2006-01-01

    Full Text Available A new stochastic model of economy is developed that takes into account that the choices of consumers are dependent random fields. Axioms of such a model are formulated. The existence of the random fields of consumer choice and of decision making by firms is proved. New notions of conditionally independent random fields and of random fields of evaluation of information by consumers are introduced. Using the above-mentioned random fields, the random fields of consumer choice and decision making by firms are constructed. The theory of economic equilibrium is developed.

  3. THE MODEL OF DISTINCTION OF ACCESS RIGHTS TO INFORMATION OBJECTS OF THE SYSTEM OF CONTROLLING OF BUSINESS PROCESSES OF AN AVIATION ENTERPRISE

    Directory of Open Access Journals (Sweden)

    Andrey V. Degtyarev

    2014-01-01

    Full Text Available On the basis of an analysis of the system of controlling of business processes of an aviation enterprise, an approach was formulated for setting up a hierarchical model of personal permissions to the information resources of an automated system of controlling of projects and contracts (ASCPC) on the instrumental and procedural levels. On the basis of the model, a structure of personalized keys was developed. The model reflects the possibilities of every category of users when working with ASCPC.

  4. Proprioceptive information processing in schizophrenia.

    Science.gov (United States)

    Arnfred, Sidse M H

    2012-03-01

    This doctoral thesis focuses on brain activity in response to proprioceptive stimulation in schizophrenia. The works encompass methodological developments substantiated by investigations of healthy volunteers and two clinical studies of schizophrenia spectrum patients. American psychiatrist Sandor Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event related potentials. Event related EEG can be analyzed as conventional time-series averages or as oscillatory averages transformed into the frequency domain. Gamma activity evoked by electricity or by another type of somatosensory stimulus has not been reported before in schizophrenia. Gamma activity is considered to be a manifestation of perceptual integration. A new load stimulus was constructed that stimulated the proprioceptive dimension of recognition of applied force. This load stimulus was tested both in simple and several types of more complex stimulus paradigms, with and without tasks, in total in 66 healthy volunteers. The evoked potential (EP) resulting from the load stimulus was named the proprioceptive EP. The later components of the proprioceptive EP (> 150 ms) were modulated similarly to previously reported electrical somatosensory EPs by repetition and cognitive task. The earlier activity was further investigated through decomposition of the time-frequency transformed data by a new non-negative matrix analysis, and previous research and visual inspection validated these results. Several time-frequency components emerged in the proprioceptive EP. The contra-lateral parietal gamma component (60-70 ms; 30-41 Hz) had not previously been described in the somatosensory modality without electrical stimulation. 
The parietal beta component (87-103 ms; 19-22 Hz) was increased when the proprioceptive stimulus appeared in a predictable sequence in

  5. Information Processing Structure of Quantum Gravity

    Science.gov (United States)

    Gyongyosi, Laszlo; Imre, Sandor

    2014-05-01

    The theory of quantum gravity is aimed to fuse general relativity with quantum theory into a more fundamental framework. Quantum gravity provides both the non-fixed causality of general relativity and the quantum uncertainty of quantum mechanics. In a quantum gravity scenario, the causal structure is indefinite and the processes are causally non-separable. We provide a model for the information processing structure of quantum gravity. We show that the quantum gravity environment is an information resource-pool from which valuable information can be extracted. We analyze the structure of the quantum gravity space and the entanglement of the space-time geometry. We study the information transfer capabilities of quantum gravity space and define the quantum gravity channel. We characterize the information transfer of the gravity space and the correlation measure functions of the gravity channel. We investigate the process of stimulated storage for quantum gravity memories, a phenomenon that exploits the information resource-pool property of quantum gravity. The results confirm that the benefits of the quantum gravity space can be exploited in quantum computations, particularly in the development of quantum computers. The results are supported by the grant COST Action MP1006.

  6. Geo-Information Logistical Modeling

    Directory of Open Access Journals (Sweden)

    Nikolaj I. Kovalenko

    2014-11-01

    Full Text Available This paper examines geo-information logistical modeling. The author illustrates the similarities between geo-informatics and logistics in the area of spatial objectives; illustrates that applying geo-data expands the potential of logistics; brings to light geo-information modeling as the basis of logistical modeling; describes the types of geo-information logistical modeling; describes situational geo-information modeling as a variety of geo-information logistical modeling.

  7. A STUDY ON IMPROVING INFORMATION PROCESSING ABILITIES BASED ON PBL

    Directory of Open Access Journals (Sweden)

    Du Gyu KIM,

    2014-04-01

    Full Text Available This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing abilities. This research proposes a method for teaching information processing abilities based on a problem-based learning model, which was tested with elementary students. The students developed an improved ability to create new knowledge and to present relationships among pieces of information through the process of problem solving. This study performed experimental research by comparing pre- and post-tests with twenty-three fifth grade elementary students over the course of eight months. The study produced a remarkable improvement in information selection, information reliability, information classification, information analysis, information comparison, and information internalization. This study presents an improved methodology for the teaching of information processing abilities.

  8. Complementary processing of haptic information by slowly and rapidly adapting neurons in the trigeminothalamic pathway. Electrophysiology, mathematical modeling and simulations of vibrissae-related neurons.

    Directory of Open Access Journals (Sweden)

    Abel eSanchez-Jimenez

    2013-06-01

    Full Text Available Tonic (slowly adapting) and phasic (rapidly adapting) primary afferents convey complementary aspects of haptic information to the central nervous system: object location and texture the former, shape the latter. Tonic and phasic neural responses are also recorded in all relay stations of the somatosensory pathway, yet their role in both information processing and information transmission to the cortex is unknown: we don’t know if tonic and phasic neurons process complementary aspects of haptic information and/or if these two types constitute two separate channels that convey complementary aspects of tactile information to the cortex. Here we propose to elucidate these two questions in the fast trigeminal pathway of the rat (PrV-VPM: principal trigeminal nucleus-ventroposteromedial thalamic nucleus). We analyze early and global behavior, latencies and stability of the responses of individual cells in PrV and medial lemniscus under 1-40 Hz stimulation of the whiskers in control and decorticated animals and we use stochastic spiking models and extensive simulations. Our results strongly suggest that in the first relay station of the somatosensory system (PrV): (1) tonic and phasic neurons process complementary aspects of whisker-related tactile information; (2) tonic and phasic responses are not originated from two different types of neurons; (3) the two responses are generated by the differential action of the somatosensory cortex on a unique type of PrV cell; (4) tonic and phasic neurons do not belong to two different channels for the transmission of tactile information to the thalamus; (5) trigeminothalamic transmission is exclusively performed by tonically firing neurons; and (6) all aspects of haptic information are coded into low-pass, band-pass and high-pass filtering profiles of tonically firing neurons. Our results are important for both basic research on neural circuits and information processing, and the development of sensory neuroprostheses.

  9. Complementary processing of haptic information by slowly and rapidly adapting neurons in the trigeminothalamic pathway. Electrophysiology, mathematical modeling and simulations of vibrissae-related neurons.

    Science.gov (United States)

    Sanchez-Jimenez, Abel; Torets, Carlos; Panetsos, Fivos

    2013-01-01

    Tonic (slowly adapting) and phasic (rapidly adapting) primary afferents convey complementary aspects of haptic information to the central nervous system: object location and texture the former, shape the latter. Tonic and phasic neural responses are also recorded in all relay stations of the somatosensory pathway, yet it is unknown their role in both, information processing and information transmission to the cortex: we don't know if tonic and phasic neurons process complementary aspects of haptic information and/or if these two types constitute two separate channels that convey complementary aspects of tactile information to the cortex. Here we propose to elucidate these two questions in the fast trigeminal pathway of the rat (PrV-VPM: principal trigeminal nucleus-ventroposteromedial thalamic nucleus). We analyze early and global behavior, latencies and stability of the responses of individual cells in PrV and medial lemniscus under 1-40 Hz stimulation of the whiskers in control and decorticated animals and we use stochastic spiking models and extensive simulations. Our results strongly suggest that in the first relay station of the somatosensory system (PrV): (1) tonic and phasic neurons process complementary aspects of whisker-related tactile information (2) tonic and phasic responses are not originated from two different types of neurons (3) the two responses are generated by the differential action of the somatosensory cortex on a unique type of PrV cell (4) tonic and phasic neurons do not belong to two different channels for the transmission of tactile information to the thalamus (5) trigeminothalamic transmission is exclusively performed by tonically firing neurons and (6) all aspects of haptic information are coded into low-pass, band-pass, and high-pass filtering profiles of tonically firing neurons. Our results are important for both, basic research on neural circuits and information processing, and development of sensory neuroprostheses.
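
Conclusion (6) says haptic information is carried in low-pass, band-pass, and high-pass response profiles of tonically firing neurons. A hypothetical sketch of classifying such a profile from gains measured across the 1-40 Hz stimulation band follows; the three-point summary and comparison rules are assumptions for illustration, not the authors' analysis.

```python
def filter_profile(gains):
    """Classify a frequency response. gains: {frequency_hz: response_gain}."""
    freqs = sorted(gains)
    low, high = gains[freqs[0]], gains[freqs[-1]]
    mid = max(gains[f] for f in freqs[1:-1])  # peak gain at interior frequencies
    if mid > low and mid > high:
        return "band-pass"
    return "low-pass" if low >= high else "high-pass"

print(filter_profile({1: 1.0, 10: 0.6, 40: 0.2}))   # low-pass
print(filter_profile({1: 0.3, 10: 1.0, 40: 0.25}))  # band-pass
print(filter_profile({1: 0.2, 10: 0.5, 40: 1.0}))   # high-pass
```

Real gain curves would come from the cells' responses under the 1-40 Hz whisker stimulation described in the record.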

  10. Information processing of earth resources data

    Science.gov (United States)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  11. Information Processing in Auto-regulated Systems

    Directory of Open Access Journals (Sweden)

    Karl Javorszky

    2003-06-01

    Full Text Available Abstract: We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection and once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out what is the case from among the alternatives which could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. This new approach allows a mathematical treatment of some poignant biomathematical problems. The concepts presented in this treatise may also have relevance and applications within the fields of information processing and the theory of language.
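
The comparison of multidimensional partitions with sequenced carriers can be made concrete in one simple respect: how many bits each carrier type can encode. The sketch below counts integer partitions with Euler's recurrence and compares log2 of that count against the n bits of a binary sequence of the same length; it illustrates the capacity gap only and is not the author's construction.

```python
import math

def partition_count(n):
    """Number of integer partitions of n (standard coin-change DP)."""
    p = [1] + [0] * n
    for part in range(1, n + 1):
        for total in range(part, n + 1):
            p[total] += p[total - part]
    return p[n]

def capacities(n):
    """Bits carried by choosing one partition of n vs one n-bit sequence."""
    return math.log2(partition_count(n)), float(n)

# p(10) = 42, so a partition of 10 carries log2(42) bits,
# while a 10-bit sequence carries 10 bits.
partition_bits, sequence_bits = capacities(10)
```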

  12. Information processing of genetically modified food messages under different motives: an adaptation of the multiple-motive heuristic-systematic model.

    Science.gov (United States)

    Kim, Jooyoung; Paek, Hye-Jin

    2009-12-01

    Recent risk management research has noted the importance of understanding how the lay public processes and reacts to risk-related information. Guided by the multiple-motive heuristic-systematic model, this study examines (1) how individuals process messages in the context of genetically modified foods to change their attitudes and (2) how the persuasion process varies across types of motives. In the three treatment conditions of accuracy, defense, and impression motives, the respondents changed their attitudes through either the heuristic or the systematic mode, depending on their motives. The accuracy-motive group appeared to use the systematic processing mode, while the impression-motive group seemed to employ the heuristic processing mode. The empirical findings highlight the importance of incorporating motives to improve our understanding of the process of attitude change in risk management and communication contexts.

  13. Handbook on neural information processing

    CERN Document Server

    Maggini, Marco; Jain, Lakhmi

    2013-01-01

    This handbook presents some of the most recent topics in neural information processing, covering both theoretical concepts and practical applications. The contributions include: deep architectures; recurrent, recursive, and graph neural networks; cellular neural networks; Bayesian networks; approximation capabilities of neural networks; semi-supervised learning; statistical relational learning; kernel methods for structured data; multiple classifier systems; self organisation and modal learning; applications to ...

  14. Quantum process discrimination with information from environment

    Science.gov (United States)

    Wang, Yuan-Mei; Li, Jun-Gang; Zou, Jian; Xu, Bao-Ming

    2016-12-01

    In quantum metrology we usually extract information from the reduced probe system but ignore the information lost inevitably into the environment. However, K. Mølmer [Phys. Rev. Lett. 114, 040401 (2015)] showed that the information lost into the environment has an important effect on improving the successful probability of quantum process discrimination. Here we reconsider the model of a driven atom coupled to an environment and distinguish which of two candidate Hamiltonians governs the dynamics of the whole system. We mainly discuss two measurement methods, one of which obtains only the information from the reduced atom state and the other obtains the information from both the atom and its environment. Interestingly, for the two methods the optimal initial states of the atom, used to improve the successful probability of the process discrimination, are different. By comparing the two methods we find that the partial information from the environment is very useful for the discriminations. Project supported by the National Natural Science Foundation of China (Grant Nos. 11274043, 11375025, and 11005008).
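
The "successful probability" of such a discrimination task is conventionally bounded by the Helstrom formula, p_succ = (1 + ||p0·rho0 - p1·rho1||_1)/2, for two candidate states. The sketch below evaluates that bound for toy qubit states; it illustrates state discrimination only, not the paper's driven-atom process discrimination or its environment measurements.

```python
import numpy as np

def helstrom(rho0, rho1, p0=0.5):
    """Optimal success probability of discriminating rho0 vs rho1 (Helstrom bound)."""
    gamma = p0 * rho0 - (1 - p0) * rho1
    trace_norm = sum(abs(np.linalg.eigvalsh(gamma)))  # sum of |eigenvalues|
    return 0.5 * (1 + trace_norm)

ket0 = np.array([[1.0, 0.0], [0.0, 0.0]])  # |0><0|
ket_plus = np.full((2, 2), 0.5)            # |+><+|

print(helstrom(ket0, ket0))       # identical states: 0.5 (a coin flip)
print(helstrom(ket0, ket_plus))   # non-orthogonal states: ~0.854
```

Measuring the environment as well, as in the paper, effectively enlarges the pair of states being compared and can only raise this bound.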

  15. Infochemistry Information Processing at the Nanoscale

    CERN Document Server

    Szacilowski, Konrad

    2012-01-01

    Infochemistry: Information Processing at the Nanoscale, defines a new field of science, and describes the processes, systems and devices at the interface between chemistry and information sciences. The book is devoted to the application of molecular species and nanostructures to advanced information processing. It includes the design and synthesis of suitable materials and nanostructures, their characterization, and finally applications of molecular species and nanostructures for information storage and processing purposes. Divided into twelve chapters; the first three chapters serve as an int

  16. Using trauma informed care as a nursing model of care in an acute inpatient mental health unit: A practice development process.

    Science.gov (United States)

    Isobel, Sophie; Edwards, Clair

    2017-02-01

    Without agreeing on an explicit approach to care, mental health nurses may resort to problem focused, task oriented practice. Defining a model of care is important but there is also a need to consider the philosophical basis of any model. The use of Trauma Informed Care as a guiding philosophy provides a robust framework from which to review nursing practice. This paper describes a nursing workforce practice development process to implement Trauma Informed Care as an inpatient model of mental health nursing care. Trauma Informed Care is an evidence-based approach to care delivery that is applicable to mental health inpatient units; while there are differing strategies for implementation, there is scope for mental health nurses to take on Trauma Informed Care as a guiding philosophy, a model of care or a practice development project within all of their roles and settings in order to ensure that it has considered, relevant and meaningful implementation. The principles of Trauma Informed Care may also offer guidance for managing workforce stress and distress associated with practice change.

  17. Diffusive capture processes for information search

    CERN Document Server

    Lee, S; Kim, Y; Lee, Sungmin; Yook, Soon-Hyung; Kim, Yup

    2007-01-01

    We show how effectively the diffusive capture processes (DCP) on complex networks can be applied to information search in the networks. Numerical simulations show that our method generates only 2% of the traffic of the most popular flooding-based query-packet-forwarding (FB) algorithm. We find that the average searching time, $\langle T \rangle$, of our model is more scalable than another well-known $n$-random-walker model and comparable to the FB algorithm, both on the real Gnutella network and on scale-free networks with $\gamma = 2.4$. We also discuss the possible relationship between $\langle T \rangle$ and $\langle k^2 \rangle$, the second moment of the degree distribution of the networks.
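
The traffic comparison between flooding and walker-based search can be illustrated on a toy graph. The sketch below counts query messages for TTL-limited flooding versus n random walkers; the five-node graph and the message accounting are illustrative assumptions, not the paper's DCP model.

```python
import random

# small undirected graph as an adjacency list
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}

def flood_traffic(graph, start, ttl):
    """Messages sent by TTL-limited flooding (each node forwards once)."""
    msgs, frontier, seen = 0, {start}, {start}
    for _ in range(ttl):
        nxt = set()
        for node in frontier:
            for nb in graph[node]:
                msgs += 1  # one query packet per edge traversal
                if nb not in seen:
                    seen.add(nb)
                    nxt.add(nb)
        frontier = nxt
    return msgs

def walker_traffic(graph, start, target, n_walkers, max_hops, rng):
    """Messages sent until some random walker reaches the target."""
    msgs = 0
    pos = [start] * n_walkers
    for _ in range(max_hops):
        for i in range(n_walkers):
            pos[i] = rng.choice(graph[pos[i]])
            msgs += 1
            if pos[i] == target:
                return msgs
    return msgs

rng = random.Random(1)
flood_msgs = flood_traffic(graph, 0, ttl=2)
walk_msgs = walker_traffic(graph, 0, target=4, n_walkers=2, max_hops=50, rng=rng)
```

Flooding reaches every node quickly but pays for every edge traversal; walker-based schemes trade search time for far less traffic, which is the effect the record quantifies.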

  18. Social Information Processing in Deaf Adolescents

    Science.gov (United States)

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2016-01-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…

  19. Influence Processes for Information Technology Acceptance

    DEFF Research Database (Denmark)

    Bhattacherjee, Anol; Sanford, Clive Carlton

    2006-01-01

    This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects are persistent over time. Drawing on the elaboration-likelihood model (ELM), we compared two alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were respectively operationalized using the argument quality and source credibility constructs, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We further examined how these influence processes were moderated by users' IT expertise and perceived job relevance and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern...

  20. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    Science.gov (United States)

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One solution for filling this gap is to give the process model itself the role of a hub, a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through hospital-wide documentation.
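
One capability mentioned above, generating data dictionaries from the process model, can be sketched directly: each IDEF0-style activity lists its inputs, controls, outputs, and mechanisms, and the dictionary maps every data item to where it is produced or consumed. The activity records and names below are invented examples, not the authors' schema.

```python
# Invented IDEF0-style activity records (hypothetical care-process example).
activities = [
    {"name": "Admit patient",
     "inputs": ["admission request"], "controls": ["admission policy"],
     "outputs": ["patient record"], "mechanisms": ["admissions clerk"]},
    {"name": "Order lab test",
     "inputs": ["patient record"], "controls": ["care protocol"],
     "outputs": ["lab order"], "mechanisms": ["physician"]},
]

def data_dictionary(activities):
    """Map each data item to the (activity, role) pairs that reference it."""
    entries = {}
    for act in activities:
        for role in ("inputs", "controls", "outputs", "mechanisms"):
            for item in act[role]:
                entries.setdefault(item, []).append((act["name"], role))
    return entries

dd = data_dictionary(activities)
# "patient record" is an output of "Admit patient" and an input of "Order lab test"
```

Cross-references like these are what let the process model act as the communication hub the paper describes.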

  1. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    "Modeling Multiphase Materials Processes: Gas-Liquid Systems" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  2. Least Information Modeling for Information Retrieval

    CERN Document Server

    Ke, Weimao

    2012-01-01

    We proposed a Least Information theory (LIT) to quantify meaning of information in probability distribution changes, from which a new information retrieval model was developed. We observed several important characteristics of the proposed theory and derived two quantities in the IR context for document representation. Given probability distributions in a collection as prior knowledge, LI Binary (LIB) quantifies least information due to the binary occurrence of a term in a document whereas LI Frequency (LIF) measures least information based on the probability of drawing a term from a bag of words. Three fusion methods were also developed to combine LIB and LIF quantities for term weighting and document ranking. Experiments on four benchmark TREC collections for ad hoc retrieval showed that LIT-based methods demonstrated very strong performances compared to classic TF*IDF and BM25, especially for verbose queries and hard search topics. The least information theory offers a new approach to measuring semantic qua...
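
As a point of reference for the comparison in this record, the classic TF*IDF baseline is easy to sketch, together with a surprisal-style binary weight standing in for the spirit of LIB (least information from a term's binary occurrence). The abstract does not give the LIB/LIF formulas, so `binary_surprisal` below is only an analogue of the idea, and the three toy documents are invented.

```python
import math
from collections import Counter

docs = {
    "d1": "information retrieval model",
    "d2": "least information theory",
    "d3": "retrieval of documents",
}

def tf_idf(term, doc_id):
    """Classic TF*IDF weight of a term in one document."""
    tf = Counter(docs[doc_id].split())[term]
    df = sum(term in d.split() for d in docs.values())
    return tf * math.log(len(docs) / df) if df else 0.0

def binary_surprisal(term):
    """-log2 of the probability that a document contains the term
    (an illustrative analogue of LIB, not the paper's definition)."""
    df = sum(term in d.split() for d in docs.values())
    return -math.log2(df / len(docs)) if df else 0.0

ranked = sorted(docs, key=lambda d: tf_idf("retrieval", d), reverse=True)
```

A LIT-based ranker would replace these weights with the LIB/LIF quantities derived from collection-level probability distributions.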

  3. Information behaviour: models and concepts

    Directory of Open Access Journals (Sweden)

    Polona Vilar

    2005-01-01

    Full Text Available The article presents an overview of the research area of information behaviour. Information behaviour is defined as the behaviour of individuals in relation to information sources and channels, which results as a consequence of their information need, and encompasses passive and active searching of information, and its use. Theoretical foundations are presented, as well as some fundamental conceptual models of information behaviour and related concepts: information seeking behaviour, which occurs in active, purposeful searching for information, regardless of the information source used; and information searching behaviour, which represents a micro-level of information seeking behaviour, and is expressed by those individuals who interact with information retrieval systems.

  4. Information Search Process of Lawyers: A Call for 'Just for Me' Information Services.

    Science.gov (United States)

    Kuhlthau, C. C.; Tama, S. L.

    2001-01-01

    This study sought to gain a better understanding of the variety of tasks that involve lawyers as a group of information workers, how they use information, and the role mediators play in their process of information seeking and use. Explains the influence of the model of the Information Search Process (Kuhlthau). (Author/LRW)

  5. Vision and visual information processing in cubozoans

    DEFF Research Database (Denmark)

    Bielecki, Jan

    relationship between acuity and light sensitivity. Animals have evolved a wide variety of solutions to this problem, such as folded membranes, to provide larger receptive surfaces, and lenses, to focus light onto the receptive membranes. On the neural capacity side, complex eyes demand a huge processing network...... to analyse the received information, illustrated by the fact that one third of the human brain is devoted to visual information processing. The cost of maintaining such a neural network deters most organisms from investing in the camera-type option, if possible, and they settle for a model that will more precisely...... fit their need. Visual neuroethology integrates optics, sensory equipment, neural networks and motor output to explain how animals can perform behaviour in response to a specific visual stimulus. In this doctoral thesis, I will elucidate the individual steps in a visual neuroethological pathway...

  6. Processing information system for highly specialized information in corporate networks

    Science.gov (United States)

    Petrosyan, M. O.; Kovalev, I. V.; Zelenkov, P. V.; Brezitskaya, VV; Prohorovich, G. A.

    2016-11-01

    A new structure for the formation and management of highly specialized information in corporate systems is offered. Its main distinguishing feature is that it supports the processing of multilingual information within a single user request.

  7. Product Development Process Modeling

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The use of Concurrent Engineering and other modern methods of product development and maintenance require that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes is becoming ever more difficult to achieve. The perspective of industrial process theory (the definition of process) and the perspective of process implementation (process transition, accumulation, and inter-operations between processes) are used to survey methods for building a single base-model (multi-view) process model.

  8. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  9. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These ...

  10. Five Computational Actions in Information Processing

    Directory of Open Access Journals (Sweden)

    Stefan Vladutescu

    2014-12-01

    Full Text Available This study is circumscribed to Information Science. The aim of the research is twofold: (a) to define the concept of an action of computational information processing, and (b) to design a taxonomy of such actions. Our thesis is that any information processing is computational processing. First, the investigation tries to demonstrate that the computational actions of information processing, or informational actions, are computational-investigative configurations for structuring information: clusters of highly aggregated operations which are carried out in a unitary manner, operate convergently and behave like a single computational device. From a methodological point of view, they fall within the category of analytical instruments for the informational processing of raw material, of data, and of vague, confused, unstructured informational elements. In their internal articulation, the actions are patterns for the integrated carrying out of operations of informational investigation. Secondly, we propose an inventory and description of five basic informational computational actions: exploring, grouping, anticipation, schematization and inferential structuring. R. S. Wyer and T. K. Srull (2014) speak about "four information processing". We would like to continue with further investigation of the relationship between operations, actions, strategies and mechanisms of informational processing.

  11. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    The paper first proposes an integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to handle information from patient requirements to usage, and from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined-process perspectives. Second, the paper translates this information into a business process model and a mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  12. Objective information about energy models

    Energy Technology Data Exchange (ETDEWEB)

    Hale, D.R. (Energy Information Administration, Washington, DC (United States))

    1993-01-01

    This article describes the Energy Information Administration's program to develop objective information about its modeling systems without hindering model development and applications, and within budget and human resource constraints. 16 refs., 1 fig., 2 tabs.

  13. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  14. Information Processing Structure of Quantum Gravity

    CERN Document Server

    Gyongyosi, Laszlo

    2014-01-01

    The theory of quantum gravity aims to fuse general relativity with quantum theory into a more fundamental framework. The space of quantum gravity provides both the non-fixed causality of general relativity and the quantum uncertainty of quantum mechanics. In a quantum gravity scenario, the causal structure is indefinite and the processes are causally non-separable. In this work, we provide a model for the information processing structure of quantum gravity. We show that the quantum gravity environment is an information resource-pool from which valuable information can be extracted. We analyze the structure of the quantum gravity space and the entanglement of the space-time geometry. We study the information transfer capabilities of quantum gravity space and define the quantum gravity channel. We reveal that the quantum gravity space acts as a background noise on the local environment states. We characterize the properties of the noise of the quantum gravity space and show that it allows the separate local...

  15. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible yet strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  16. Information Selection in Intelligence Processing

    Science.gov (United States)

    2011-12-01

    problem of overload.” As another example, Whaley (Whaley, 1974) argues that one of the causes for the Pearl Harbor and Barbarossa strategic surprises is...which becomes more and more important as the Internet evolves. The IR problem and the information selection problem share some similar...all the algorithms tend more towards exploration: the temperature parameter in Softmax is higher (0.12 instead of 0.08), the delta for the VDBE

  17. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar Saavedra, J.A.; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  18. Information Retrieval Models

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Göker, Ayse; Davies, John

    2009-01-01

    Many applications that handle information on the internet would be completely inadequate without the support of information retrieval technology. How would we find information on the world wide web if there were no web search engines? How would we manage our email without spam filtering? Much of the


  20. A proposed general model of information behaviour.

    Directory of Open Access Journals (Sweden)

    2003-01-01

    Full Text Available Presents a critical description of Wilson's (1996) global model of information behaviour and proposes major modifications on the basis of research into the information behaviour of managers, conducted in Poland. The theoretical analysis and research results suggest that Wilson's model has certain imperfections, both in its conceptual content and in its graphical presentation. The model, for example, cannot be used to describe managers' information behaviour, since managers basically are not the end users of information services external to the organization or of computerized information services, and they acquire information mainly through various intermediaries. Therefore, the model cannot be considered a general model, applicable to every category of information users. The proposed new model encompasses the main concepts of Wilson's model, such as: person-in-context, three categories of intervening variables (individual, social and environmental), activating mechanisms, the cyclic character of information behaviours, and the adoption of a multidisciplinary approach to explain them. However, the new model introduces several changes. They include: 1. identification of 'context' with the intervening variables; 2. immersion of the chain of information behaviour in the 'context', to indicate that the context variables influence behaviour at all stages of the process (identification of needs, looking for information, processing and using it); 3. stress on the fact that the activating mechanisms can also occur at all stages of the information acquisition process; 4. introduction of two basic strategies of looking for information: personally and/or using various intermediaries.

  1. Filling the gap between geophysics and geotechnics in landslide process understanding: a data fusion methodology to integrate multi-source information in hydro-mechanical modeling

    Science.gov (United States)

    Bernadie, S.; Gance, J.; Grandjean, G.; Malet, J.

    2013-12-01

    The population increase and the rising issue of climate change impact the long-term stability of mountain slopes. So far, it is not yet possible to assess in all cases the conditions for failure, reactivation or rapid surges of slopes. The main reason identified by Van Asch et al. (2007) is the excessive conceptualization of the slope in the models. Therefore, to improve our forecasting capability, considering local information such as the local slope geometry, soil material variability, hydrological processes and the presence of fissures is of prime importance. Geophysical imaging, combined with geotechnical tests, is a well-adapted tool to obtain such detailed information. The development of near-surface geophysics in the last three decades encourages the use of multiple geophysical methods for slope investigations. However, data fusion is little used in this domain, and a gap still exists between the data processed by geophysicists and the slope hydro-mechanical models developed by geotechnical engineers. Starting from this statement, we propose a methodological flowchart for the integration of multi-source geophysical and geotechnical data to construct a slope hydro-mechanical model of a selected profile at the Super-Sauze landslide. Based on data fusion concepts, the methodology aims at integrating the various data in order to create a geological and a geotechnical model of the slope profile. The input data consist of seismic and geoelectrical tomographies (which give access to spatially distributed information on the soil physical state) supplemented by punctual geotechnical tests (dynamic penetration tests). The tomograms and the geotechnical tests are combined into a unique interpreted model characterized by different geotechnical domains. We use the fuzzy logic clustering method in order to take into account the uncertainty coming from each input data set. Then an unstructured finite element mesh, adapted to the resolution of the different input data and
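    The fuzzy-logic clustering step described in this record can be illustrated with a toy fuzzy c-means implementation. The two-attribute samples below (P-wave velocity, resistivity) are invented for the sketch and are not the authors' tomographic data:

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=50, seed=0):
    """Toy fuzzy c-means: returns membership matrix U (n x c) and centers."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))         # random fuzzy memberships
    p = 2.0 / (m - 1.0)
    for _ in range(n_iter):
        w = U ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]   # membership-weighted centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=1, keepdims=True))
    return U, centers

# Invented samples: columns = [P-wave velocity (km/s), log10 resistivity]
X = np.array([[0.4, 1.2], [0.5, 1.3], [0.45, 1.1],    # soft, wet material
              [2.0, 2.6], [2.2, 2.4], [2.1, 2.5]])    # stiff, dry material
U, centers = fuzzy_cmeans(X)
domain = U.argmax(axis=1)  # crisp geotechnical domain per sample
```

    The membership matrix U carries the uncertainty of each sample's assignment, which is the property that motivates fuzzy rather than crisp clustering here.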

  2. Mathematics of Information Processing and the Internet

    Science.gov (United States)

    Hart, Eric W.

    2010-01-01

    The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…

  3. Is There Room for "Development" in Developmental Models of Information Processing Biases to Threat in Children and Adolescents?

    Science.gov (United States)

    Field, Andy P.; Lester, Kathryn J.

    2010-01-01

    Clinical and experimental theories assume that processing biases in attention and interpretation are a causal mechanism through which anxiety develops. Despite growing evidence that these processing biases are present in children and, therefore, develop long before adulthood, these theories ignore the potential role of child development. This…

  4. Visual Information Processing Based on Qualitative Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Hua; LIU Yongchang; LI Chao

    2007-01-01

    Visual information processing is not only an important research direction in fields such as psychology, neuroscience and artificial intelligence, but also the research basis for biological recognition theory and its technical realization. Existing approaches to visual information processing, e.g. visual information processing oriented to neural computation, visual information processing using shape extraction and wavelets under high noise, and ANN-based visual information processing, are comparatively complex. Using qualitative mapping, this paper describes the specific attributes involved in visual information processing, and the results are briefer and more straightforward, so the software implementation of vision recognition is probably easier to realize.

  5. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  6. Vision and visual information processing in cubozoans

    DEFF Research Database (Denmark)

    Bielecki, Jan

    to analyse the received information, illustrated by the fact that one third of the human brain is devoted to visual information processing. The cost of maintaining such a neural network deters most organisms from investing in the camera-type option, if possible, and they settle for a model that will more precisely......Eyes have been considered support for the divine design hypothesis over evolution because, surely, eyes cannot function with anything less than all the components that comprise a vertebrate camera-type eye. Yet, devoted Darwinists have estimated that complex visual systems can evolve from a single...... light sensitive cell within 400 000 generations and all intermediate stages can be found throughout the Metazoa. Eyes have evolved to accommodate increasingly more complex visual behaviours, from light sensitive tissues involved in circadian entrainment to the complex camera-type eyes that can guide...

  7. CURRENT DEVELOPMENTS IN COMPLEX INFORMATION PROCESSING,

    Science.gov (United States)

    DATA PROCESSING, COMPUTER PROGRAMMING, INFORMATION RETRIEVAL, DATA STORAGE SYSTEMS, MATHEMATICAL LOGIC, ARTIFICIAL INTELLIGENCE, PATTERN RECOGNITION, GAME THEORY, PROGRAMMING LANGUAGES, MATHEMATICAL ANALYSIS.

  8. Memory-Based Cognitive Modeling for Visual Information Processing

    Institute of Scientific and Technical Information of China (English)

    王延江; 齐玉娟

    2013-01-01

    Inspired by the way in which humans perceive the environment, a memory-based cognitive model for visual information processing is proposed to imitate some cognitive functions of the human brain. The proposed model includes five components: information granules, memory spaces, cognitive behaviors, rules for manipulating information among memory spaces, and decision-making processes. According to the three-stage memory model of the human brain, three memory spaces are defined to store the current, temporary and permanent visual information respectively, i.e. the ultra short-term memory space (USTMS), the short-term memory space (STMS) and the long-term memory space (LTMS). Past scenes can be remembered or forgotten by the proposed model, which allows it to adapt to variations of the scene. The proposed model is applied to two hot issues in computer vision: background modeling and object tracking. Experimental results show that the proposed model can deal with scenes with sudden background and object appearance changes as well as heavy object occlusions under complex backgrounds.
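    The three-stage memory idea (USTMS/STMS/LTMS with remembering and forgetting) can be sketched as a toy data structure. The capacity and consolidation thresholds below are invented for illustration and are not taken from the paper:

```python
from collections import Counter, deque

class MemoryModel:
    """Toy three-stage memory: ultra short-term, short-term, long-term.
    Patterns observed often enough while in short-term memory are
    consolidated into long-term memory; rare ones are eventually forgotten
    as the bounded short-term store overflows."""
    def __init__(self, stm_capacity=5, promote_after=3):
        self.ustms = None                       # current observation only
        self.stms = deque(maxlen=stm_capacity)  # temporary, bounded store
        self.ltms = set()                       # permanent store
        self.promote_after = promote_after

    def observe(self, pattern):
        self.ustms = pattern
        self.stms.append(pattern)
        if Counter(self.stms)[pattern] >= self.promote_after:
            self.ltms.add(pattern)              # consolidate frequent pattern

    def recognizes(self, pattern):
        return pattern in self.ltms or pattern in self.stms

m = MemoryModel()
for scene in ["sky", "sky", "sky", "car"]:
    m.observe(scene)
```

    After these observations, "sky" has been consolidated to long-term memory while "car" is only held temporarily, mirroring how the model adapts to scene changes.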

  9. Business Process Modeling: Blueprinting

    OpenAIRE

    Al-Fedaghi, Sabah

    2017-01-01

    This paper presents a flow-based methodology for capturing processes specified in business process modeling. The proposed methodology is demonstrated through re-modeling of an IBM Blueworks case study. While the Blueworks approach offers a well-proven tool in the field, this should not discourage workers from exploring other ways of thinking about effectively capturing processes. The diagrammatic representation presented here demonstrates a viable methodology in this context. It is hoped this...

  10. XML-based product information processing method for product design

    Science.gov (United States)

    Zhang, Zhen Yu

    2012-01-01

    The design knowledge of modern mechatronic products centers on information processing in knowledge-intensive engineering; thus product design innovation is essentially an innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and of information management features, a unified XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge and their relationships. XML-based representations are proposed for product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, demonstrating that this method is helpful for knowledge-based design systems and product innovation.
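    A sketch of how function elements, structure elements and function-structure mappings might be expressed in XML, built with Python's standard library; all element and attribute names here are hypothetical illustrations, not the paper's schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical element and attribute names for illustration only.
product = ET.Element("product", name="parallel_friction_roller")
functions = ET.SubElement(product, "functions")
f1 = ET.SubElement(functions, "function", id="F1")
f1.text = "transmit torque"
structures = ET.SubElement(product, "structures")
s1 = ET.SubElement(structures, "structure", id="S1")
s1.text = "roller shaft"
mappings = ET.SubElement(product, "mappings")
ET.SubElement(mappings, "map", function="F1", structure="S1")  # F1 realized by S1

xml_text = ET.tostring(product, encoding="unicode")
```

    Keeping functions, structures and their mappings as separate sub-trees is what lets a knowledge-based design system query either view independently.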

  11. PRISM: a planned risk information seeking model.

    Science.gov (United States)

    Kahlor, LeeAnn

    2010-06-01

    Recent attention on health-related information seeking has focused primarily on information seeking within specific health and health risk contexts. This study attempts to shift some of that focus to individual-level variables that may impact health risk information seeking across contexts. To locate these variables, the researcher posits an integrated model, the Planned Risk Information Seeking Model (PRISM). The model, which treats risk information seeking as a deliberate (planned) behavior, maps variables found in the Theory of Planned Behavior (TPB; Ajzen, 1991) and the Risk Information Seeking and Processing Model (RISP; Griffin, Dunwoody, & Neuwirth, 1999), and posits linkages among those variables. This effort is further informed by Kahlor's (2007) Augmented RISP, the Theory of Motivated Information Management (Afifi & Weiner, 2004), the Comprehensive Model of Information Seeking (Johnson & Meischke, 1993), the Health Information Acquisition Model (Freimuth, Stein, & Kean, 1989), and the Extended Parallel Processing Model (Witte, 1998). The resulting integrated model accounted for 59% of the variance in health risk information-seeking intent and performed better than the TPB or the RISP alone.

  12. Erythropoietin modulates neural and cognitive processing of emotional information in biomarker models of antidepressant drug action in depressed patients

    DEFF Research Database (Denmark)

    Miskowiak, Kamilla W; Favaron, Elisa; Hafizi, Sepehr

    2010-01-01

    Erythropoietin (Epo) has neuroprotective and neurotrophic effects, and may be a novel therapeutic agent in the treatment of psychiatric disorders. We have demonstrated antidepressant-like effects of Epo on the neural and cognitive processing of facial expressions in healthy volunteers. The current...... study investigates the effects of Epo on the neural and cognitive response to emotional facial expressions in depressed patients....

  14. INFORMATION MODELLING OF PROCESS OF ADOPTION OF ADMINISTRATIVE DECISIONS AT THE ORGANIZATION OF PROFESSIONAL DEVELOPMENT OF THE PERSONNEL

    Directory of Open Access Journals (Sweden)

    Yaroslav E. Prokushev

    2015-01-01

    Full Text Available The article is devoted to the problem of organizing the professional development of personnel. The article considers two interconnected tasks. The first task is the estimation of the degree of need for professional development of a specific worker. The second task is the choice of the programme of professional development. Functional information models of the procedures for making administrative decisions within these tasks are developed.

  15. Comparison of the information-seeking behavior of postgraduate students at Isfahan University of Medical Sciences and the University of Isfahan in writing dissertations, based on the Kuhlthau model of the information search process.

    Science.gov (United States)

    Abedi, Mahnaz; Ashrafi-Rizi, Hasan; Zare-Farashbandi, Firoozeh; Nouri, Rasoul; Hassanzadeh, Akbar

    2014-01-01

    Information-seeking behaviors have been one of the main focuses of researchers seeking to identify and solve the problems users face in information retrieval. The aim of this research is to compare the information-seeking behavior of postgraduate students at Isfahan University of Medical Sciences and the University of Isfahan when writing dissertations, based on the Kuhlthau model of the information search process, in 2012. The research method is a survey, and the data collection tool is the Narmenji questionnaire. The statistical population comprised all postgraduate students at Isfahan University of Medical Sciences and the University of Isfahan. The sample size was 196 and the sampling was stratified random. The statistical analyses were descriptive (mean and frequency) and inferential (independent t-test and Pearson's correlation), and the software used was SPSS 20. The findings showed that students at Isfahan University of Medical Sciences followed 20% of the ordered steps of this model, and students at the University of Isfahan did not follow the model. In the feelings aspect of the first (Initiation) and sixth (Presentation) stages, and in actions (across all stages), significant differences were found between students of the two universities. There was a significant relationship between gender and both the fourth stage (Formulation) and the total feelings score of the Kuhlthau model. There was also a significant inverse relationship between feelings at the third stage (Exploration) and the age of the students. The results showed that, in writing dissertations, there were major differences between students of the two universities in following the Kuhlthau model, with significant differences in some of the feelings and actions stages of their information-seeking behavior, a significant relationship between gender and feelings at the fourth stage (Formulation), and a significant relationship between age and feelings at the third stage (Exploration).

  16. Conjunction of wavelet transform and SOM-mutual information data pre-processing approach for AI-based Multi-Station nitrate modeling of watersheds

    Science.gov (United States)

    Nourani, Vahid; Andalib, Gholamreza; Dąbrowska, Dominika

    2017-05-01

    Accurate nitrate load predictions can improve decision management of watershed water quality, which affects the environment and drinking water. In this paper, two scenarios were considered for Multi-Station (MS) nitrate load modeling of the Little River watershed. In the first scenario, Markovian characteristics of the streamflow-nitrate time series were proposed for the MS modeling. For this purpose, the feature extraction criterion of Mutual Information (MI) was employed for input selection of artificial intelligence models (Feed-Forward Neural Network, FFNN, and Least Square Support Vector Machine). In the second scenario, to consider the seasonality-based characteristics of the time series, wavelet transform was used to extract multi-scale features of the streamflow-nitrate time series of the watershed's sub-basins to model MS nitrate loads. The Self-Organizing Map (SOM) clustering technique, which finds homogeneous sub-series clusters, was also linked to MI for proper choice of the cluster agents to be imposed on the models for predicting the nitrate loads of the watershed's sub-basins. The proposed MS method not only considers the prediction of the outlet nitrate but also covers predictions of interior sub-basin nitrate load values. The results indicated that the proposed FFNN model coupled with SOM-MI improved the performance of MS nitrate predictions compared to the Markovian-based models by up to 39%. Overall, accurate selection of dominant inputs which consider the seasonality-based characteristics of the streamflow-nitrate process could enhance the efficiency of nitrate load predictions.
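    The MI-based input selection step can be sketched with a simple histogram estimator of mutual information on synthetic data (this is a generic illustration, not the study's streamflow-nitrate series; variable names are invented):

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
flow = rng.normal(size=500)                        # candidate input: streamflow
noise = rng.normal(size=500)                       # irrelevant candidate input
nitrate = 0.8 * flow + 0.2 * rng.normal(size=500)  # synthetic target: nitrate load

# Keep the candidate input with the highest MI against the target.
scores = {"flow": mutual_information(flow, nitrate),
          "noise": mutual_information(noise, nitrate)}
best = max(scores, key=scores.get)
```

    Unlike linear correlation, MI also captures nonlinear dependence, which is why it is a common criterion for selecting inputs to neural network and SVM models.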

  17. Information Processing Approaches to Cognitive Development

    Science.gov (United States)

    1989-08-04

    This chapter reviews the history and current status of information-processing approaches to cognitive development. Because the approach is so...a detailed analysis of self-modifying production systems and their potential for formulating theories of cognitive development. Keywords: Information processing; Cognitive development; Self-modification; Production system.

  18. Moral judgment as information processing: an integrative review

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  20. Chinese Information Processing and Its Prospects

    Institute of Scientific and Technical Information of China (English)

    Sheng Li; Tie-Jun Zhao

    2006-01-01

    The paper presents the main progress and achievements in Chinese information processing. It focuses on six aspects, i.e., Chinese syntactic analysis, Chinese semantic analysis, machine translation, information retrieval, information extraction, and speech recognition and synthesis. The important techniques and likely key problems of each branch in the near future are discussed as well.

  1. Modeling spatiotemporal information generation

    NARCIS (Netherlands)

    Scheider, Simon; Gräler, Benedikt; Stasch, Christoph; Pebesma, Edzer

    2016-01-01

    Maintaining knowledge about the provenance of datasets, that is, about how they were obtained, is crucial for their further use. Contrary to what the overused metaphors of ‘data mining’ and ‘big data’ are implying, it is hardly possible to use data in a meaningful way if information about sources an

  2. Information risk and security modeling

    Science.gov (United States)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, the Centre for Internet Security guidelines, NSA configuration guidelines, and metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2, will be explained. Business-process-level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics, the research presentation will explore the appropriate usage of these standards. The paper will discuss standards-based approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all of the mentioned standards. The proposed approach, a 3D visual presentation and development of the Information Security Model, will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and of a defined risk and security space for modeling and measuring.

  3. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) to develop systematically a structured list of business model process configurations and to group (deductively) these selected configurations into a typological categorization list; 2) to facilitate companies in the process of BM innovation......, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in business model studies (e.g. definitions, configurations, classifications), we adopted the analytical induction...... method of data analysis. Findings – A comprehensive literature review and analysis resulted in a list of business model process configurations systematically organized under five classification groups, namely: revenue model; value proposition; value configuration; target customers; and strategic...

  4. Multitasking Information Seeking and Searching Processes.

    Science.gov (United States)

    Spink, Amanda; Ozmutlu, H. Cenk; Ozmutlu, Seda

    2002-01-01

    Presents findings from four studies of the prevalence of multitasking information seeking and searching by Web (via the Excite search engine), information retrieval system (mediated online database searching), and academic library users. Highlights include human information coordinating behavior (HICB); and implications for models of information…

  6. Structured Information Management Using New Techniques for Processing Text.

    Science.gov (United States)

    Gibb, Forbes; Smart, Godfrey

    1990-01-01

    Describes the development of a software system, SIMPR (Structured Information Management: Processing and Retrieval), that will process documents by indexing them and classifying their subjects. Topics discussed include information storage and retrieval, file inversion techniques, modelling the user, natural language searching, automatic indexing,…

  7. Motivated information processing in organizational teams: Progress, puzzles, and prospects

    NARCIS (Netherlands)

    Nijstad, B.A.; de Dreu, C.K.W.

    2012-01-01

    Much of the research into group and team functioning looks at groups that perform cognitive tasks, such as decision making, problem solving, and innovation. The Motivated Information Processing in Groups Model (MIP-G; De Dreu, Nijstad, & Van Knippenberg, 2008) conjectures that information processing

  9. Attachment in Middle Childhood: Associations with Information Processing

    Science.gov (United States)

    Zimmermann, Peter; Iwanski, Alexandra

    2015-01-01

    Attachment theory suggests that internal working models of self and significant others influence adjustment during development by controlling information processing and self-regulation. We provide a conceptual overview on possible mechanisms linking attachment and information processing and review the current literature in middle childhood.…

  10. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  11. BIM. Building Information Model. Special issue; BIM. Building Information Model. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Van Gelder, A.L.A. [Arta and Consultancy, Lage Zwaluwe (Netherlands); Van den Eijnden, P.A.A. [Stichting Marktwerking Installatietechniek, Zoetermeer (Netherlands); Veerman, J.; Mackaij, J.; Borst, E. [Royal Haskoning DHV, Nijmegen (Netherlands); Kruijsse, P.M.D. [Wolter en Dros, Amersfoort (Netherlands); Buma, W. [Merlijn Media, Waddinxveen (Netherlands); Bomhof, F.; Willems, P.H.; Boehms, M. [TNO, Delft (Netherlands); Hofman, M.; Verkerk, M. [ISSO, Rotterdam (Netherlands); Bodeving, M. [VIAC Installatie Adviseurs, Houten (Netherlands); Van Ravenswaaij, J.; Van Hoven, H. [BAM Techniek, Bunnik (Netherlands); Boeije, I.; Schalk, E. [Stabiplan, Bodegraven (Netherlands)

    2012-11-15

    A series of 14 articles illustrates the various aspects of the Building Information Model (BIM). The essence of BIM is to capture information about the building process and the building product.

  12. Mapping individual logical processes in information searching

    Science.gov (United States)

    Smetana, F. O.

    1974-01-01

    An interactive dialog with a computerized information collection was recorded and plotted in the form of a flow chart. The process permits one to identify the logical processes employed in considerable detail and is therefore suggested as a tool for measuring individual thought processes in a variety of situations. A sample of an actual test case is given.

  13. Aptitude from an Information Processing Perspective.

    Science.gov (United States)

    McLaughlin, Barry

    An information-processing approach to language learning is examined; language aptitude is factored into the approach, and the role of working memory is discussed. Learning involves two processes that make heavy use of working memory: automatization and restructuring. At first, learners must make a conscious effort to remember and…

  14. Theory of Neural Information Processing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Galla, Tobias [Abdus Salam International Centre for Theoretical Physics and INFM/CNR SISSA-Unit, Strada Costiera 11, I-34014 Trieste (Italy)

    2006-04-07

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard
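As a taste of the book's starting point, a Rosenblatt perceptron (one of the basic network models mentioned) can be sketched in a few lines; the task, learning rate, and epoch count below are illustrative choices, not material from the book:

```python
def train_perceptron(data, epochs=10, lr=1.0):
    """Rosenblatt perceptron: learn weights w and bias b for a binary
    threshold unit, y = 1 if w.x + b > 0 else 0."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# A linearly separable task: logical AND of two binary inputs.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
```

By the perceptron convergence theorem, the rule is guaranteed to find a separating hyperplane for any linearly separable task such as this one.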

  15. Ion trapping for quantum information processing

    Institute of Scientific and Technical Information of China (English)

    WAN Jin-yin; WANG Yu-zhu; LIU Liang

    2007-01-01

    In this paper we review recent progress on ion trapping for quantum information processing and quantum computation. We first discuss the basic principles of quantum information theory and then focus on ion trapping for quantum information processing. Many variations, especially ion-chip techniques, have been investigated since the original ion trap quantum computation scheme was proposed. Full two-dimensional control of multiple ions on an ion chip is promising for the realization of scalable ion trap quantum computation and the implementation of quantum networks.

  16. A Two-Layered Diffusion Model Traces the Dynamics of Information Processing in the Valuation-and-Choice Circuit of Decision Making

    Directory of Open Access Journals (Sweden)

    Pietro Piu

    2014-01-01

    A circuit of evaluation and selection of the alternatives is considered a reliable model in neurobiology, and the prominent contributions of the literature on this topic are reported. In this study, valuation and choice in a decisional process during a Two-Alternative Forced-Choice (TAFC) task are represented as a two-layered network of computational cells, where information accrual and processing progress with nonlinear diffusion dynamics. The evolution of the response-to-stimulus map is thus modeled by two linked diffusive modules (2LDM) representing the neuronal populations involved in the valuation-and-decision circuit of decision making. Diffusion models are naturally appropriate for describing the accumulation of evidence over time. This allows the computation of response times (RTs) in valuation and choice, under the hypothesis of an ex-Wald distribution. A nonlinear transfer function integrates the activities of the two layers. The input-output map based on the infomax principle makes the 2LDM consistent with the reinforcement learning approach. Results from simulated likelihood time series indicate that the 2LDM may account for the activity-dependent modulatory component of effective connectivity between the neuronal populations. Rhythmic fluctuations of the estimated gain functions in the delta-beta bands also support the compatibility of the 2LDM with the neurobiology of decision making.
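The evidence-accumulation idea behind such diffusion models can be sketched with a single-accumulator Wiener process; the drift, threshold, and step size below are illustrative assumptions, and this is a generic drift-diffusion sketch, not the two-layered 2LDM itself:

```python
import random

def ddm_trial(drift, threshold=1.0, dt=0.001, rng=random):
    """One diffusion trial: accumulate noisy evidence x until |x| reaches
    the threshold; returns (choice, response_time)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return (1 if x > 0 else 0), t

# Simulate a block of trials with a positive drift (evidence favors choice 1).
rng = random.Random(42)
trials = [ddm_trial(drift=0.5, rng=rng) for _ in range(300)]
p_upper = sum(c for c, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
```

With positive drift the upper boundary is reached more often than the lower one, and the first-passage times play the role of the RTs discussed in the abstract.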

  17. Revealed Quantum Information in Weak Interaction Processes

    CERN Document Server

    Hiesmayr, B C

    2014-01-01

    We analyze the achievable limits of the quantum information processing of the weak interaction revealed by hyperons with spin. We find that the weak decay process corresponds to an interferometric device with a fixed visibility and fixed phase difference for each hyperon. Nature chooses rather low visibilities, expressing a preference for parity-conserving or parity-violating processes (except for the decay $\Sigma^+ \longrightarrow p \pi^0$). The decay process can be considered an open quantum channel that carries the information of the hyperon spin to the angular distribution of the momenta of the daughter particles. We find a simple geometric information-theoretic interpretation of this process: two quantization axes are chosen spontaneously with probabilities $\frac{1\pm\alpha}{2}$, where $\alpha$ is proportional to the visibility times the real part of the phase shift. Stated differently, the weak interaction process corresponds to spin measurements with an imperfect Stern-Gerlach apparatus. Equipped with this...
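The stated axis-selection probabilities $\frac{1\pm\alpha}{2}$ are easy to compute and simulate; the $\alpha$ value below is an illustrative placeholder, not a measured decay-asymmetry parameter:

```python
import random

def axis_probabilities(alpha):
    """Axis-selection probabilities p± = (1 ± alpha)/2 from the abstract."""
    return (1 + alpha) / 2.0, (1 - alpha) / 2.0

def sample_axis(alpha, rng):
    """Draw which quantization axis 'nature chooses' on a single decay."""
    p_plus, _ = axis_probabilities(alpha)
    return '+' if rng.random() < p_plus else '-'

rng = random.Random(7)
alpha = -0.98  # illustrative near-maximal asymmetry
draws = [sample_axis(alpha, rng) for _ in range(1000)]
frac_minus = draws.count('-') / len(draws)
```

For $\alpha$ near $-1$, almost all decays select the minus axis, which is the "imperfect Stern-Gerlach" picture in the abstract.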

  18. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; and comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.

  19. Information Retrieval Model Based on Semantic Processing Technology

    Institute of Scientific and Technical Information of China (English)

    王瑞琴

    2012-01-01

    We are in an information age characterized by information explosion, and finding precise results in this ocean of information has become a key issue. Semantic search is a promising way to address this problem. This paper addresses several key problems in the Information Retrieval (IR) domain and proposes a novel Semantic Processing Technology based Information Retrieval (SPTIR) model. SPTIR comprises the following key technologies: Word Sense Disambiguation (WSD) based semantic query expansion, query optimization based on word semantic relatedness, and re-ranking of search results based on document semantic relevance. Finally, large test data sets and a number of performance indicators are used to evaluate the retrieval performance of the proposed model; the experimental results validate the competitive advantage of SPTIR and the positive contribution of each semantic processing technique to retrieval performance.
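The re-ranking stage can be approximated with a generic cosine-similarity sketch over bag-of-words vectors; this is an illustrative stand-in, not SPTIR's actual semantic relatedness measure, and the query and documents are made up:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rerank(query_terms, docs):
    """Re-rank documents by similarity to the (possibly expanded) query."""
    q = Counter(query_terms)
    scored = [(cosine(q, Counter(d.split())), d) for d in docs]
    return [d for _, d in sorted(scored, reverse=True)]

docs = ["semantic retrieval model", "cooking recipes", "retrieval of images"]
ranked = rerank(["semantic", "retrieval"], docs)
```

In SPTIR, query expansion via WSD would enlarge `query_terms` with disambiguated related words before this scoring step.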

  20. Occurrence reporting and processing of operations information

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-21

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  1. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
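The role of a time-dependent density model in driving self-expansion can be illustrated with a hypothetical exponential relaxation; the functional form and all parameter values below are assumptions for the sketch, since the report's empirical model is not reproduced here:

```python
import math

def foam_density(t, rho0=1200.0, rho_inf=80.0, tau=30.0):
    """Hypothetical foam density (kg/m^3): exponential relaxation from an
    initial resin density rho0 toward a cured-foam density rho_inf with
    time constant tau (s)."""
    return rho_inf + (rho0 - rho_inf) * math.exp(-t / tau)

def expansion_ratio(t, **kw):
    """For a fixed mass of material, volume scales inversely with density,
    which is what moves the free surface in the continuum model."""
    return foam_density(0.0, **kw) / foam_density(t, **kw)
```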

  2. A process framework for information security management

    Directory of Open Access Journals (Sweden)

    Knut Haufe

    2016-01-01

    Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach for establishing, implementing, operating, monitoring, reviewing, maintaining and improving an organization's information security. Key elements of the operation of an ISMS are ISMS processes. However, and in spite of its importance, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analyses of information security investments, regarding both single protective measures and ISMS processes, are not the focus of current research, which concentrates mostly on economics. This article aims to fill this research gap by proposing such an ISMS process framework as its main contribution, based on a set of agreed-upon ISMS processes in existing standards such as the ISO 27000 series, COBIT and ITIL. Within the framework, identified processes are described and their interaction and interfaces are specified. This framework helps to focus on the operation of the ISMS, instead of focusing on measures and controls. By this, as a main finding, the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS are strengthened.

  4. ENERGETIC CHARGE OF AN INFORMATION PROCESS

    Directory of Open Access Journals (Sweden)

    Popova T.M.

    2009-12-01

    The main laws of technical thermodynamics are universal and can be applied to processes other than thermodynamic ones. The article presents the results of a comparison of the peculiarities of irreversible informational and thermodynamic processes and introduces a new term, "infopy". A more precise definition of infopy as an energetic charge is given.

  5. Teaching Information Systems Development via Process Variants

    Science.gov (United States)

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  6. Peculiarities of support for management information process

    OpenAIRE

    Chornous, G.

    2010-01-01

    The article deals with determining perspective directions and models of decision-making support based on Business Intelligence (BI) technology, and with the corresponding improvement of modern information systems and technologies for effective decision support.

  7. Systematic information processing style and perseverative worry.

    Science.gov (United States)

    Dash, Suzanne R; Meeten, Frances; Davey, Graham C L

    2013-12-01

    This review examines the theoretical rationale for conceiving of systematic information processing as a proximal mechanism for perseverative worry. Systematic processing is characterised by detailed, analytical thought about issue-relevant information, and in this way, is similar to the persistent, detailed processing of information that typifies perseverative worry. We review the key features and determinants of systematic processing, and examine the application of systematic processing to perseverative worry. We argue that systematic processing is a mechanism involved in perseverative worry because (1) systematic processing is more likely to be deployed when individuals feel that they have not reached a satisfactory level of confidence in their judgement and this is similar to the worrier's striving to feel adequately prepared, to have considered every possible negative outcome/detect all potential danger, and to be sure that they will successfully cope with perceived future problems; (2) systematic processing and worry are influenced by similar psychological cognitive states and appraisals; and (3) the functional neuroanatomy underlying systematic processing is located in the same brain regions that are activated during worrying. This proposed mechanism is derived from core psychological processes and offers a number of clinical implications, including the identification of psychological states and appraisals that may benefit from therapeutic interventions for worry-based problems.

  8. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.
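The unifying quantity behind the book's data compression and coding topics, Shannon entropy, can be computed directly from symbol frequencies; this sketch uses the empirical distribution of a string:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Entropy (bits/symbol) of the empirical symbol distribution: a lower
    bound on the average code length any lossless compressor can achieve."""
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in Counter(symbols).values())
```

A constant string has zero entropy (perfectly compressible), while a string of k equally frequent symbols has entropy log2(k).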

  9. Modeling the Dynamics of an Information System

    Directory of Open Access Journals (Sweden)

    Jacek Unold

    2003-11-01

    The article concentrates on the nature of the social subsystem of an information system. It analyzes the nature of the information processes of collectivity within an IS and introduces a model of IS dynamics. The model is based on the assumption that the social subsystem of an information system behaves as a nonlinear dynamic system. The model of IS dynamics is verified on stock market indexes. This follows from the basic assumption of the technical analysis of markets, namely that the index chart reflects the interplay of demand and supply, which in turn represents crowd sentiment in the market.

  10. Multiscale Analysis of Information Dynamics for Linear Multivariate Processes

    CERN Document Server

    Faes, Luca; Nollo, Giandomenico; Stramaglia, Sebastiano

    2016-01-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale infor...
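For the simplest case, the information storage of a Gaussian AR(1) process has a closed form, and plain decimation (downsampling without the low-pass filtering that introduces the MA component discussed in the abstract) keeps the process AR(1); this sketch shows how storage decays across scales under those simplifying assumptions:

```python
import math

def ar1_information_storage(a):
    """Analytical information storage (nats) of a Gaussian AR(1) process
    x_t = a*x_{t-1} + e_t:  S = 0.5*ln(var(x)/var(e)) = -0.5*ln(1 - a^2)."""
    return -0.5 * math.log(1 - a * a)

def decimated_storage(a, tau):
    """Decimating an AR(1) by factor tau yields an AR(1) with coefficient
    a**tau, so storage shrinks as the time scale coarsens."""
    return ar1_information_storage(a ** tau)
```

The full framework in the paper handles multivariate VARMA processes via state-space models, for which these quantities no longer have such a one-line form.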

  11. Performance of Information Criteria for Spatial Models.

    Science.gov (United States)

    Lee, Hyeyoung; Ghosh, Sujit K

    2009-01-01

    Model choice is one of the most crucial aspects of any statistical data analysis. It is well known that most models are just an approximation to the true data-generating process, but among such model approximations it is our goal to select the "best" one. Researchers typically consider a finite number of plausible models in statistical applications, and the related statistical inference depends on the chosen model. Hence model comparison is required to identify the "best" model among several such candidate models. This article considers the problem of model selection for spatial data. The issue of model selection for spatial models has been addressed in the literature by the use of traditional information criteria based methods, even though such criteria have been developed under the assumption of independent observations. We evaluate the performance of some of the popular model selection criteria via Monte Carlo simulation experiments using small to moderate samples. In particular, we compare the performance of some of the most popular information criteria, such as the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and corrected AIC (AICc), in selecting the true model. The ability of these criteria to select the correct model is evaluated under several scenarios. This comparison is made using various spatial covariance models ranging from stationary isotropic to nonstationary models.
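The three criteria the study compares have simple closed forms once a model's maximized log-likelihood is known; a minimal sketch (the log-likelihood values and parameter counts below are invented for illustration):

```python
import math

def aic(log_lik, k):
    """Akaike Information Criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Bayesian Information Criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_lik

def aicc(log_lik, k, n):
    """Corrected AIC: AIC plus a small-sample penalty term."""
    return aic(log_lik, k) + 2 * k * (k + 1) / (n - k - 1)

# Two hypothetical candidate models fit to n = 30 observations:
# (maximized log-likelihood, number of parameters).
n = 30
candidates = {"model_A": (-45.2, 3), "model_B": (-44.8, 6)}
scores = {name: (aic(ll, k), bic(ll, k, n), aicc(ll, k, n))
          for name, (ll, k) in candidates.items()}
best_by_bic = min(scores, key=lambda name: scores[name][1])
```

Lower scores are better for all three criteria; BIC's stronger penalty favors the smaller model here.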

  12. PHYSICAL RESOURCES OF INFORMATION PROCESSES AND TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Mikhail O. Kolbanev

    2014-11-01

    Full Text Available Subject of study. The paper describes basic information technologies for automating information processes of data storage, distribution and processing in terms of required physical resources. It is shown that studying these processes only through such traditional concerns of modern computer science as the ability to transfer knowledge, degree of automation, information security, coding, reliability, and others is not enough. The reasons are, on the one hand, the increase in the volume and intensity of information exchange in the subject areas of human activity and, on the other hand, the approach to the efficiency limit of information systems based on semiconductor technologies. The creation of technologies which not only support information interaction but also consume a rational amount of physical resources has become a pressing problem of modern engineering development. Thus, basic information technologies for the storage, distribution and processing of information to support interaction between people are the object of study, and the physical temporal, spatial and energy resources required to implement these technologies are the subject of study. Approaches. An attempt is made to extend the traditional cybernetics methodology, which replaces consideration of the material component of information by a search over the states of information objects, by explicitly taking into account the amount of physical resources required to change the states of information media. Purpose of study. The paper develops a common approach to the comparison and subsequent selection of basic information technologies for the storage, distribution and processing of data, taking into account not only the requirements for the quality of information exchange in a particular subject area and the degree of technology application, but also the amounts of physical resources consumed. Main findings. Classification of resources

  13. Continuous-variable quantum information processing

    DEFF Research Database (Denmark)

    Andersen, Ulrik Lund; Leuchs, G.; Silberhorn, C.

    2010-01-01

    Observables of quantum systems can possess either a discrete or a continuous spectrum. For example, upon measurements of the photon number of a light state, discrete outcomes will result, whereas measurements of the light's quadrature amplitudes result in continuous outcomes. If one uses the continuous degree of freedom of a quantum system for encoding, processing or detecting information, one enters the field of continuous-variable (CV) quantum information processing. In this paper we review the basic principles of CV quantum information processing with main focus on recent developments in the field. We will be addressing the three main stages of a quantum information system: the preparation stage, where quantum information is encoded into CVs of coherent states and single-photon states; the processing stage, where CV information is manipulated to carry out a specified protocol; and a detection...
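The discrete/continuous distinction in the opening sentences can be mimicked classically by sampling the two measurement statistics of a coherent state; a sketch assuming a real amplitude and the hbar = 1/2 quadrature convention (so the x-quadrature is Gaussian with mean alpha and variance 1/4):

```python
import math
import random

random.seed(1)
alpha = 2.0  # hypothetical real coherent-state amplitude

def sample_photon_number():
    """Discrete outcome: photon number n ~ Poisson(|alpha|^2), via inverse CDF."""
    n, p = 0, math.exp(-alpha ** 2)
    cum, target = p, random.random()
    while cum < target:
        n += 1
        p *= alpha ** 2 / n
        cum += p
    return n

def sample_quadrature():
    """Continuous outcome: x-quadrature ~ Gaussian with mean alpha,
    standard deviation 1/2 (hbar = 1/2 convention, an assumption here)."""
    return random.gauss(alpha, 0.5)

photon_counts = [sample_photon_number() for _ in range(2000)]
quadratures = [sample_quadrature() for _ in range(2000)]
```

Photon counts land on non-negative integers around |alpha|^2 = 4, while quadrature outcomes fill a continuum around alpha.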

  14. Machine Process Capability Information Through Six Sigma

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, M.F.

    1998-03-13

    A project investigating details concerning machine process capability information and its accessibility has been conducted. The thesis of the project proposed designing a part (denoted as a machine capability workpiece) based on the major machining features of a given machine. Parts are machined and measured to gather representative production, short-term variation. The information is utilized to predict the expected defect rate, expressed in terms of a composite sigma level process capability index, for a production part. Presently, decisions concerning process planning, particularly what machine will statistically produce the minimum amount of defects based on machined features and associated tolerances, are rarely made. Six sigma tools and methodology were employed to conduct this investigation at AlliedSignal FM and T. Tools such as the thought process map, factor relationship diagrams, and components of variance were used. This study is progressing toward completion. This research study was an example of how machine process capability information may be gathered for milling planar faces (horizontal) and slot features. The planning method used to determine where and how to gather variation for the part to be designed is known as factor relationship diagramming. Components-of-variation is then applied to the gathered data to arrive at the contributing level of variation illustrated within the factor relationship diagram. The idea of using this capability information beyond process planning to the other business enterprise operations is proposed.
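The composite capability idea described above can be sketched with the standard indices computed from sample measurements; the data and tolerances below are hypothetical, and the conventional 1.5-sigma shift is deliberately ignored:

```python
import statistics

def capability_indices(samples, lsl, usl):
    """Cp compares tolerance width to process spread; Cpk also penalizes
    an off-center process mean (lsl/usl: lower/upper specification limits)."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)   # short-term sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical slot-width measurements (mm) against a 10.0 +/- 0.3 tolerance.
widths = [10.02, 9.98, 10.05, 9.95, 10.01, 9.99, 10.03, 9.97]
cp, cpk = capability_indices(widths, lsl=9.7, usl=10.3)
sigma_level = 3 * cpk   # composite sigma level, ignoring the 1.5-sigma shift
```

A perfectly centered process has Cp = Cpk; planners would pick the machine whose measured features yield the highest predicted sigma level.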

  15. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    Science.gov (United States)

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  17. Unveiling the mystery of visual information processing in human brain.

    Science.gov (United States)

    Diamant, Emanuel

    2008-08-15

    It is generally accepted that human vision is an extremely powerful information processing system that facilitates our interaction with the surrounding world. However, despite extended and extensive research efforts, which encompass many exploration fields, the underlying fundamentals and operational principles of visual information processing in the human brain remain unknown. We are still unable to figure out where and how along the path from the eyes to the cortex the sensory input perceived by the retina is converted into a meaningful object representation, which can be consciously manipulated by the brain. Studying the vast literature considering the various aspects of brain information processing, I was surprised to learn that the respected scholarly discussion is totally indifferent to the basic keynote question: "What is information?" in general, or "What is visual information?" in particular. In the old days, it was assumed that any scientific research approach must first define its basic points of departure. Why this was overlooked in brain information processing research remains a conundrum. In this paper, I try to find a remedy for this bizarre situation. I propose an uncommon definition of "information", which can be derived from Kolmogorov's Complexity Theory and Chaitin's notion of Algorithmic Information. Embracing this new definition leads to an inevitable revision of traditional dogmas that shape the state of the art of brain information processing research. I hope this revision will better serve the challenging goal of modeling human visual information processing.
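The Kolmogorov/Chaitin notion of information invoked here is uncomputable, but a lossless compressor gives a crude computable upper bound; a sketch in which zlib merely stands in for a "shortest description":

```python
import os
import zlib

def complexity(data: bytes) -> int:
    """Computable upper-bound proxy for Kolmogorov complexity:
    the length of the losslessly compressed description."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500     # 1000 bytes with a very short generating program
noisy = os.urandom(1000)  # 1000 bytes with (almost surely) no short description
c_regular, c_noisy = complexity(regular), complexity(noisy)
```

A regular pattern carries far less algorithmic information than noise of the same length, which is the intuition behind complexity-based definitions of information.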

  18. A N-D VIRTUAL NOTEBOOK ABOUT THE BASILICA OF S. AMBROGIO IN MILAN: INFORMATION MODELING FOR THE COMMUNICATION OF HISTORICAL PHASES SUBTRACTION PROCESS

    Directory of Open Access Journals (Sweden)

    C. Stanga

    2017-08-01

    Full Text Available This essay describes the combination of 3D solutions and software techniques with traditional studies and research in order to achieve an integrated digital documentation linking performed surveys, collected data, and historical research. The approach of this study is based on the comparison of survey data with historical research, and on interpretations deduced from a data cross-check between the two sources. The case study is the Basilica of S. Ambrogio in Milan, one of the greatest monuments in the city, a pillar of Christianity and of the history of architecture. It is characterized by a complex stratification of phases of restoration and transformation. Rediscovering the great richness of the traditional architectural notebook, which collected surveys and data, this research aims to realize a virtual notebook, based on a 3D model, that supports the dissemination of the collected information. It can potentially be understood and accessed by anyone through the development of a mobile app. The 3D model was used to explore the different historical phases, starting from the recent layers and moving to the oldest ones, through a virtual subtraction process, following the methods of the Archaeology of Architecture. Its components can be imported into parametric software and recognized in both their morphological and typological aspects. It is based on the concept of LoD and ReverseLoD in order to fit the accuracy required by each step of the research.

  19. Introduction to quantum physics and information processing

    CERN Document Server

    Vathsan, Radhika

    2016-01-01

    An Elementary Guide to the State of the Art in the Quantum Information Field. Introduction to Quantum Physics and Information Processing guides beginners in understanding the current state of research in the novel, interdisciplinary area of quantum information. Suitable for undergraduate and beginning graduate students in physics, mathematics, or engineering, the book goes deep into issues of quantum theory without raising the technical level too much. The text begins with the basics of quantum mechanics required to understand how two-level systems are used as qubits. It goes on to show how quant

  20. Scalable Networked Information Processing Environment (SNIPE)

    Energy Technology Data Exchange (ETDEWEB)

    Fagg, G.E.; Moore, K. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]|[Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Geist, A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  1. Symposium on Information Processing in Organizations.

    Science.gov (United States)

    1982-04-01

    International, July, 1981. "The Maximization Process under Uncertainty," Richard M. Cyert and Morris H. DeGroot. The theory of the firm has been... Richard Cyert and Morris DeGroot, C-MU: "The maximization process under uncertainty"; Peter Keen, MIT: "Information systems in organizations"; Patrick Larkey and... uncertainty have been particularly fruitful. Some of these areas are oligopoly (Friedman, 1970; Shubik, 1959); statistical decision theory (DeGroot, 1970...

  2. Terminal chaos for information processing in neurodynamics.

    Science.gov (United States)

    Zak, M

    1991-01-01

    A new nonlinear phenomenon, terminal chaos, caused by failure of the Lipschitz condition at equilibrium points of dynamical systems, is introduced. It is shown that terminal chaos has a well-organized probabilistic structure which can be predicted and controlled. This gives an opportunity to exploit this phenomenon for information processing. It appears that chaotic states of neuron activity are associated with higher levels of cognitive processes such as generalization and abstraction.
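Terminal dynamics rests on right-hand sides that violate the Lipschitz condition at an equilibrium. The textbook instance dx/dt = -x^(1/3) reaches x = 0 in finite time t* = (3/2) x0^(2/3), unlike a Lipschitz system such as dx/dt = -x, which only decays asymptotically. A numerical sketch (not taken from the paper):

```python
def settle(x0, dt=1e-4, t_max=3.0):
    """Euler-integrate dx/dt = -x**(1/3); the RHS is non-Lipschitz at x = 0,
    so the trajectory hits the equilibrium in finite time."""
    x, t = x0, 0.0
    while t < t_max and x > 0.0:
        x = max(x - dt * x ** (1.0 / 3.0), 0.0)  # clamp once the origin is hit
        t += dt
    return x, t

x_final, t_hit = settle(1.0)  # analytic hitting time: (3/2) * 1**(2/3) = 1.5
```

Exact arrival at the equilibrium, rather than asymptotic approach, is what makes such "terminal" points usable as well-defined states for information processing.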

  3. The Trichotomy of Processes: a philosophical basis for information systems

    Directory of Open Access Journals (Sweden)

    George Widmeyer

    2003-11-01

    Full Text Available The principle of trichotomy from the American philosopher Charles S. Peirce can be used to categorize processes into the triad of transactional, relational, and informational. The usefulness of these categories is explicated by a comparison with structuration theory and control theory, and elaborated with a consideration of democracy in a knowledge economy. These three example applications of the process triad show the generality of the conceptual categories and provide a natural way of bringing ideas from social and ethical theories into information systems design. Modeling the world and understanding business applications through the use of the Trichotomy of Processes should facilitate the development of more valuable information systems.

  4. INFORMATION SUPPORT TRANSPORTATION PROCESS MULTIMODAL SYSTEM

    Directory of Open Access Journals (Sweden)

    N. A. Filippova

    2015-02-01

    Full Text Available Background: The aim is to improve information support for the organization and functioning of multimodal delivery systems across transport modes in the northern regions of the Russian Federation, on the basis of theoretical, methodological, and practical provisions, criteria, methods, and mathematical models. Method or methodology of the work: a methodology for siting a transport logistics center (TLC) in the northern region, linking all transport modes involved in the delivery of energy resources; a model was developed to optimize the parameters of the transport network used for traffic, selecting the most effective schemes for delivering goods in a multimodal system while taking into account the specifics of the northern region and its funding. Results: The studies are applied in nature and may be used by federal and regional authorities and administrations in developing integrated energy-supply programs for northern regions. The theoretical and methodological approaches proposed in the research are one way to increase the efficiency of goods delivery in the event of poorly predictable situations on the route, at the TLC, and at transshipment points. The developed techniques can be used to improve control of cargo delivery in the northern region. Conclusion: Based on the analysis of the status of the issue, it is quite obvious that optimizing the transport needs of the region requires the development and implementation of methods to improve the efficiency and quality of freight traffic by improving organizational structures and traffic-control technology across the entire transport space of the region.

  5. A Model for an Electronic Information Marketplace

    Directory of Open Access Journals (Sweden)

    Wei Ge

    2005-11-01

    Full Text Available As the information content on the Internet increases, the task of locating desired information and assessing its quality becomes increasingly difficult. This development causes users to be more willing to pay for information that is focused on specific issues, verifiable, and available upon request. Thus, the nature of the Internet opens up the opportunity for information trading. In this context, the Internet can not only be used to close the transaction, but also to deliver the product - desired information - to the user. Early attempts to implement such business models have fallen short of expectations. In this paper, we discuss the limitations of such practices and present a modified business model for information trading, which uses a reverse auction approach together with a multiple-buyer price discovery process.
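The reverse-auction step of such a model can be sketched in a few lines: a buyer posts an information request and sellers compete downward on price. The field names are invented, and the paper's multiple-buyer price discovery is omitted for brevity:

```python
def reverse_auction(request, bids):
    """One buyer posts an information request; sellers compete downward on
    price, and the cheapest qualifying bid wins (the reverse of a classic
    auction, where buyers bid prices up)."""
    qualifying = [b for b in bids
                  if b["topic"] == request["topic"]
                  and b["price"] <= request["max_price"]]
    return min(qualifying, key=lambda b: b["price"]) if qualifying else None

request = {"topic": "market-report", "max_price": 50.0}
bids = [
    {"seller": "A", "topic": "market-report", "price": 42.0},
    {"seller": "B", "topic": "market-report", "price": 35.0},
    {"seller": "C", "topic": "weather-data", "price": 5.0},
]
winner = reverse_auction(request, bids)   # seller B wins at 35.0
```

When no bid matches the requested topic within the buyer's price ceiling, no trade occurs.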

  6. Information Processing and Dynamics in Minimally Cognitive Agents

    Science.gov (United States)

    Beer, Randall D.; Williams, Paul L.

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we…

  7. Processing Of Visual Information In Primate Brains

    Science.gov (United States)

    Anderson, Charles H.; Van Essen, David C.

    1991-01-01

    Report reviews and analyzes information-processing strategies and pathways in primate retina and visual cortex. Of interest both in biological fields and in such related computational fields as artificial neural networks. Focuses on data from macaque, which has superb visual system similar to that of humans. Authors stress concept of "good engineering" in understanding visual system.

  8. Introduction: Natural Language Processing and Information Retrieval.

    Science.gov (United States)

    Smeaton, Alan F.

    1990-01-01

    Discussion of research into information and text retrieval problems highlights the work with automatic natural language processing (NLP) that is reported in this issue. Topics discussed include the occurrences of nominal compounds; anaphoric references; discontinuous language constructs; automatic back-of-the-book indexing; and full-text analysis.…

  9. Springfield Processing Plant (SPP) Facility Information

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice; Torres, Teresa M.

    2012-10-01

    The Springfield Processing Plant is a hypothetical facility. It has been constructed for use in training workshops. Information is provided about the facility and its surroundings, particularly security-related aspects such as target identification, threat data, entry control, and response force data.

  10. Spatial information processing in humans and monkeys

    NARCIS (Netherlands)

    Oleksiak, A.

    2010-01-01

    In this thesis a series of experiments are described on human volunteers and rhesus monkeys (Macaca mulatta) in the context of spatial information processing. In the first single-unit recording experiments in monkeys a spatial summation algorithm was investigated. The responses of single neurons to

  11. Incorporating Side Information in Probabilistic Matrix Factorization with Gaussian Processes

    CERN Document Server

    Adams, Ryan Prescott; Murray, Iain

    2010-01-01

    Probabilistic matrix factorization (PMF) is a powerful method for modeling data associated with pairwise relationships, finding use in collaborative filtering, computational biology, and document analysis, among other areas. In many domains, there is additional information that can assist in prediction. For example, when modeling movie ratings, we might know when the rating occurred, where the user lives, or what actors appear in the movie. It is difficult, however, to incorporate this side information into the PMF model. We propose a framework for incorporating side information by coupling together multiple PMF problems via Gaussian process priors. We replace scalar latent features with functions that vary over the space of side information. The GP priors on these functions require them to vary smoothly and share information. We successfully use this new method to predict the scores of professional basketball games, where side information about the venue and date of the game are relevant for the outcome.
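Plain PMF, which the paper extends with GP priors over side information, fits low-rank user and item features by MAP gradient descent; a dependency-free sketch with toy data and arbitrary hyperparameters:

```python
import random

random.seed(0)

def train_pmf(ratings, n_users, n_items, k=2, lr=0.05, reg=0.1, epochs=200):
    """MAP estimate of plain PMF by stochastic gradient descent; the paper
    replaces these scalar latent features with GP-coupled functions."""
    U = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]
    V = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - sum(U[u][f] * V[i][f] for f in range(k))
            for f in range(k):
                # Simultaneous update from the old feature values.
                U[u][f], V[i][f] = (U[u][f] + lr * (err * V[i][f] - reg * U[u][f]),
                                    V[i][f] + lr * (err * U[u][f] - reg * V[i][f]))
    return U, V

# Toy observations: (user, item, rating).
ratings = [(0, 0, 5.0), (0, 1, 1.0), (1, 0, 4.0), (1, 1, 2.0)]
U, V = train_pmf(ratings, n_users=2, n_items=2)
pred_00 = sum(U[0][f] * V[0][f] for f in range(2))   # observed as 5.0
pred_01 = sum(U[0][f] * V[1][f] for f in range(2))   # observed as 1.0
```

The regularization term corresponds to the Gaussian priors on the latent features; the GP extension couples those priors across the side-information space.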

  12. A Coprocessor for Accelerating Visual Information Processing

    CERN Document Server

    Stechele, W; Herrmann, S; Simon, J Lidon

    2011-01-01

    Visual information processing will play an increasingly important role in future electronics systems. In many applications, e.g. video surveillance cameras, data throughput of microprocessors is not sufficient and power consumption is too high. Instruction profiling on a typical test algorithm has shown that pixel address calculations are the dominant operations to be optimized. Therefore AddressLib, a structured scheme for pixel addressing was developed, that can be accelerated by AddressEngine, a coprocessor for visual information processing. In this paper, the architectural design of AddressEngine is described, which in the first step supports a subset of the AddressLib. Dataflow and memory organization are optimized during architectural design. AddressEngine was implemented in a FPGA and was tested with MPEG-7 Global Motion Estimation algorithm. Results on processing speed and circuit complexity are given and compared to a pure software implementation. The next step will be the support for the full Addres...

  13. Quantum information processing and nuclear magnetic resonance

    CERN Document Server

    Cummins, H K

    2001-01-01

    Quantum computers are information processing devices which operate by and exploit the laws of quantum mechanics, potentially allowing them to solve problems which are intractable using classical computers. This dissertation considers the practical issues involved in one of the more successful implementations to date, nuclear magnetic resonance (NMR). Techniques for dealing with systematic errors are presented, and a quantum protocol is implemented as spectrometer pulse sequence programs. Chapter 1 is a brief introduction to quantum computation. The physical basis of its efficiency and issues involved in its implementation are discussed. NMR quantum information processing is reviewed in more detail in Chapter 2. Chapter 3 considers some of the errors that may be introduced in the process of implementing an algorithm, and high-level ways of reducing the impact of these errors by using composite rotations. Novel general expressions for stabilising composite rotations are presented in Chapter 4 and a new class o...

  14. How Students Learn: Information Processing, Intellectual Development and Confrontation

    Science.gov (United States)

    Entwistle, Noel

    1975-01-01

    A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning, which are not covered by the model, are discussed in relationship to it: student's intellectual development and effects of individual…

  15. A Process Model for Establishing Business Process Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Nguyen Hoang Thuan

    2017-06-01

    Full Text Available Crowdsourcing can be an organisational strategy to distribute work to Internet users and harness innovation, information, capacities, and variety of business endeavours. As crowdsourcing is different from other business strategies, organisations are often unsure as to how to best structure different crowdsourcing activities and integrate them with other organisational business processes. To manage this problem, we design a process model guiding how to establish business process crowdsourcing. The model consists of seven components covering the main activities of crowdsourcing processes, which are drawn from a knowledge base incorporating diverse knowledge sources in the domain. The built model is evaluated using case studies, suggesting the adequateness and utility of the model.

  16. Information-Theoretic Perspectives on Geophysical Models

    Science.gov (United States)

    Nearing, Grey

    2016-04-01

    To test any hypothesis about any dynamic system, it is necessary to build a model that places that hypothesis into the context of everything else that we know about the system: initial and boundary conditions and interactions between various governing processes (Hempel and Oppenheim, 1948, Cartwright, 1983). No hypothesis can be tested in isolation, and no hypothesis can be tested without a model (for a geoscience-related discussion see Clark et al., 2011). Science is (currently) fundamentally reductionist in the sense that we seek some small set of governing principles that can explain all phenomena in the universe, and such laws are ontological in the sense that they describe the object under investigation (Davies, 1990 gives several competing perspectives on this claim). However, since we cannot build perfect models of complex systems, any model that does not also contain an epistemological component (i.e., a statement, like a probability distribution, that refers directly to the quality of the information from the model) is falsified immediately (in the sense of Popper, 2002) given only a small number of observations. Models necessarily contain both ontological and epistemological components, and what this means is that the purpose of any robust scientific method is to measure the amount and quality of information provided by models. I believe that any viable philosophy of science must be reducible to this statement. The first step toward a unified theory of scientific models (and therefore a complete philosophy of science) is a quantitative language that applies to both ontological and epistemological questions. Information theory is one such language: Cox's (1946) theorem (see Van Horn, 2003) tells us that probability theory is the (only) calculus that is consistent with Classical Logic (Jaynes, 2003; chapter 1), and information theory is simply the integration of convex transforms of probability ratios (integration reduces density functions to scalar
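The "integration of convex transforms of probability ratios" mentioned above is exactly what the Kullback-Leibler divergence computes; a discrete sketch quantifying the information lost when a model distribution q stands in for data distribution p (numbers invented):

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i), in nats: the expected
    log probability ratio of data distribution p against model q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

observed = [0.5, 0.3, 0.2]           # hypothetical empirical distribution
exact_model = [0.5, 0.3, 0.2]        # a model that matches the data
vague_model = [1 / 3, 1 / 3, 1 / 3]  # an uninformative (uniform) model

loss_exact = kl_divergence(observed, exact_model)   # no information lost
loss_vague = kl_divergence(observed, vague_model)   # positive information loss
```

Comparing such divergences across candidate models is one concrete way to "measure the amount and quality of information provided by models".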

  17. Extending Conceptual Schemas with Business Process Information

    Directory of Open Access Journals (Sweden)

    Marco Brambilla

    2010-01-01

    Full Text Available The specification of business processes is becoming a more and more critical aspect for organizations. Such processes are specified as workflow models expressing the logical precedence among the different business activities (i.e., the units of work). Typically, workflow models are managed through specific subsystems, called workflow management systems, to ensure a consistent behavior of the applications with respect to the organization business process. However, for small organizations and/or simple business processes, the complexity and capabilities of these dedicated workflow engines may be overwhelming. In this paper, we therefore advocate for a different and lightweight approach, consisting in the integration of the business process specification within the system conceptual schema. We show how a workflow-extended conceptual schema can be automatically obtained, which serves both to enforce the organization business process and to manage all its relevant domain data in a unified way. This extended model can be directly processed with current CASE tools, for instance, to generate an implementation of the system (including its business process) in any technological platform.
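The idea of embedding the business process in the conceptual schema can be reduced to its core: store the precedence relation alongside the domain data and let ordinary application code check it. The activity names and structure below are invented for illustration:

```python
# Hypothetical workflow: each activity lists the activities that must
# logically precede it, stored with the rest of the domain model.
PRECEDES = {
    "submit_order": [],
    "approve_order": ["submit_order"],
    "ship_order": ["approve_order"],
}

def can_start(activity, completed):
    """An activity may start only after all its predecessors completed."""
    return all(dep in completed for dep in PRECEDES[activity])

done = {"submit_order"}
ok_approve = can_start("approve_order", done)   # predecessor satisfied
ok_ship = can_start("ship_order", done)         # approve_order still missing
```

No dedicated workflow engine is needed: the precedence check is a plain query against the extended schema.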

  18. Learning from bacteria about natural information processing.

    Science.gov (United States)

    Ben-Jacob, Eshel

    2009-10-01

    Under natural growth conditions, bacteria live in complex hierarchical communities. To conduct complex cooperative behaviors, bacteria utilize sophisticated communication to the extent that their chemical language includes semantic and even pragmatic aspects. I describe how complex colony forms (patterns) emerge through the communication-based interplay between individual bacteria and the colony. Individual cells assume newly co-generated traits and abilities that are not prestored in the genetic information of the cells; that is, not all the information required for efficient responses to all environmental conditions is stored. To solve newly encountered problems, they assess the problem via collective sensing, recall stored information of past experience, and then execute distributed information processing among the 10^9-10^12 bacteria in the colony, transforming the colony into a "super-brain." I show illuminating examples of swarming intelligence of live bacteria in which they solve optimization problems that are beyond what human beings can solve. This will lead to a discussion about the special nature of bacterial computational principles compared to Turing algorithm computational principles, in particular about the role of distributed information processing.

  19. Disjunctive Information Flow for Communicating Processes

    DEFF Research Database (Denmark)

    Li, Ximeng; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    The security validation of practical computer systems calls for the ability to specify and verify information flow policies that are dependent on data content. Such policies play an important role in concurrent, communicating systems: consider a scenario where messages are sent to different processes according to their tagging. We devise a security type system that enforces content-dependent information flow policies in the presence of communication and concurrency. The type system soundly guarantees a compositional noninterference property. All theoretical results have been formally proved
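A content-dependent policy of the kind this work types can be mimicked dynamically: each message's tag determines which processes may receive it, so no "secret"-tagged payload ever reaches a "public" process. The labels and API are invented, and the paper's contribution is a static type system, not this runtime check:

```python
LEVELS = {"public": 0, "secret": 1}   # invented two-point security lattice

def may_flow(msg_tag, receiver_clearance):
    """Information may only flow upward in the lattice (noninterference)."""
    return LEVELS[msg_tag] <= LEVELS[receiver_clearance]

def route(messages, processes):
    """Deliver each tagged message to every process cleared to receive it."""
    delivered = {name: [] for name in processes}
    for tag, payload in messages:
        for name, clearance in processes.items():
            if may_flow(tag, clearance):
                delivered[name].append(payload)
    return delivered

msgs = [("public", "weather"), ("secret", "launch-codes")]
procs = {"logger": "public", "commander": "secret"}
out = route(msgs, procs)   # the logger never sees the secret payload
```

A type system performs the same check statically, so the guarantee holds for every run rather than being tested per message.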

  20. Modular process modeling for OPC

    Science.gov (United States)

    Keck, M. C.; Bodendorf, C.; Schmidtling, T.; Schlief, R.; Wildfeuer, R.; Zumpe, S.; Niehoff, M.

    2007-03-01

    Modular OPC modeling, describing mask, optics, resist and etch processes separately, is an approach to keep efforts for OPC manageable. By exchanging single modules of a modular OPC model, a fast response to process changes during process development is possible. At the same time efforts can be reduced, since only single modular process steps have to be re-characterized as input for OPC modeling as the process is adjusted and optimized. Commercially available OPC tools for full chip processing typically make use of semi-empirical models. The goal of our work is to investigate to what extent these OPC tools can be applied for modeling of single process steps as separate modules. For an advanced gate level process we analyze the modeling accuracy over different process conditions (focus and dose) when combining models for the single process steps - optics, resist and etch - into a model describing the total process.

  1. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    The study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information, and relational integration) and three process of process modelling phases (comprehension...

  2. Study on Information Processing with a Knowledge Base of a Tibetan Grammar Model

    Institute of Scientific and Technical Information of China (English)

    多拉; 才让三智

    2011-01-01

    A linguistic model is a description of natural language, and constructing such models is one of the core tasks of computational linguistics and natural language understanding; a good linguistic model improves the accuracy of natural language processing. Tibetan is a morphologically rich language with both inflectional and agglutinative features and an extensive system of case markers. A thorough study of its case grammar system, its standardisation, and the establishment of a sound linguistic-model knowledge base would provide foundational support for further work on machine syntactic analysis, text understanding, Chinese-Tibetan machine translation, automatic word segmentation, automatic text proofreading, syntactic treebank construction, and information retrieval.

  3. An analytical approach to customer requirement information processing

    Science.gov (United States)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  4. Information analysis for modeling and representation of meaning

    OpenAIRE

    Uda, Norihiko

    1994-01-01

    In this dissertation, information analysis and an information model called the Semantic Structure Model based on information analysis are explained for semantic processing. Methods for self-organization of information are also described. In addition, Information-Base Systems for thinking support of research and development in nonlinear optical materials are explained. As a result of information analysis, general properties of information and structural properties of concepts become clear. Ge...

  5. Recognizing, Thinking and Learning as Information Processes

    Science.gov (United States)

    1987-08-30

    the stimuli that impinge upon the sense organs (McCulloch, Lettvin, Maturana and Pitts, 1959; Hubel and Wiesel, 1959). By some process (as we shall... Criticism. Moscow: Foreign Languages Publishing House. Lettvin, J. Y., H. R. Maturana, W. S. McCulloch, and W. H. Pitts. 1959. What the frog's eye tells the frog's brain. Proc. IRE, 47: 1940-1951. Lomov, B. F. 1984. Methodological and Theoretical Problems in

  6. Intelligent Information Processing in Imaging Fuzes

    Institute of Scientific and Technical Information of China (English)

    王克勇; 郑链; 宋承天

    2003-01-01

    In order to study the problem of intelligent information processing in new types of imaging fuze, the method of extracting the invariance features of target images is adopted, and radial basis function neural network is used to recognize targets. Owing to its ability of parallel processing, its robustness and generalization, the method can realize the recognition of the conditions of missile-target encounters, and meet the requirements of real-time recognition in the imaging fuze. It is shown that based on artificial neural network target recognition and burst point control are feasible.
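    A minimal sketch of the classification approach named above: a Gaussian radial basis function layer with a least-squares linear output layer, applied to synthetic two-class data (the data, the center selection, and the gamma value are assumptions for illustration, not the fuze's actual invariance features):

```python
import numpy as np

def rbf_features(X, centers, gamma=1.0):
    """Gaussian RBF activations: one column per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

# Synthetic two-class data, standing in for extracted image features.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=-1.0, scale=0.3, size=(50, 2))
X1 = rng.normal(loc=+1.0, scale=0.3, size=(50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

# Centers chosen as a random subset of the data (a common simple choice).
centers = X[rng.choice(len(X), size=10, replace=False)]
Phi = rbf_features(X, centers, gamma=2.0)

# Linear output layer fitted by least squares against one-hot targets.
T = np.eye(2)[y]
W, *_ = np.linalg.lstsq(Phi, T, rcond=None)
pred = (Phi @ W).argmax(axis=1)
accuracy = (pred == y).mean()
```

The least-squares output layer is what makes RBF networks cheap to train; a real fuze application would replace the synthetic data with the extracted invariance features.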

  7. Digital image processing for information extraction.

    Science.gov (United States)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.

  8. Fractional Transforms in Optical Information Processing

    Directory of Open Access Journals (Sweden)

    Maria Luisa Calvo

    2005-06-01

    We review the progress achieved in optical information processing during the last decade by applying fractional linear integral transforms. The fractional Fourier transform and its applications for phase retrieval, beam characterization, space-variant pattern recognition, adaptive filter design, encryption, watermarking, and so forth are discussed in detail. A general algorithm for the fractionalization of linear cyclic integral transforms is introduced and it is shown that they can be fractionalized in an infinite number of ways. Basic properties of fractional cyclic transforms are considered. The implementation of some fractional transforms in optics, such as fractional Hankel, sine, cosine, Hartley, and Hilbert transforms, is discussed. New horizons of the application of fractional transforms for optical information processing are underlined.
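    The fractional Fourier transform mentioned above can be illustrated with one common discretisation: taking a fractional power of the unitary DFT matrix via its eigendecomposition. This is a sketch of one of several possible definitions (the review notes there are infinitely many fractionalizations), not the review's specific algorithm:

```python
import numpy as np

def frft_matrix(n, a):
    """Discrete fractional Fourier transform of order a, defined as
    a fractional power of the unitary DFT matrix. Since the DFT is
    normal, F = V diag(lam) V^{-1}, so F^a = V diag(lam^a) V^{-1}.
    (Branch choice for lam^a is numpy's principal branch.)"""
    k = np.arange(n)
    F = np.exp(-2j * np.pi * np.outer(k, k) / n) / np.sqrt(n)  # unitary DFT
    lam, V = np.linalg.eig(F)
    return V @ np.diag(lam ** a) @ np.linalg.inv(V)

x = np.random.default_rng(1).normal(size=8)
F1 = frft_matrix(8, 1.0)      # order 1 recovers the ordinary (unitary) DFT
F_half = frft_matrix(8, 0.5)  # "half a Fourier transform"
```

Two fractional orders compose additively (F^0.5 applied twice gives F^1), which is the defining property of a fractionalization.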

  9. Understanding requirements via natural language information modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sharp, J.K.; Becker, S.D.

    1993-07-01

    Information system requirements that are expressed as simple English sentences provide a clear understanding of what is needed between system specifiers, administrators, users, and developers of information systems. The approach used to develop the requirements is the Natural-language Information Analysis Methodology (NIAM). NIAM allows the processes, events, and business rules to be modeled using natural language. The natural language presentation enables the people who deal with the business issues that are to be supported by the information system to describe exactly the system requirements that designers and developers will implement. Computer prattle is completely eliminated from the requirements discussion. An example is presented that is based upon a section of a DOE Order involving nuclear materials management. Where possible, the section is analyzed to specify the process(es) to be done, the event(s) that start the process, and the business rules that are to be followed during the process. Examples, including constraints, are developed. The presentation steps through the modeling process and shows where the section of the DOE Order needs clarification, extensions or interpretations that could provide a more complete and accurate specification.

  10. Information modeling system for blast furnace control

    Science.gov (United States)

    Spirin, N. A.; Gileva, L. Y.; Lavrov, V. V.

    2016-09-01

    Modern iron and steel works are, as a rule, equipped with powerful distributed control systems (DCS) and databases. Implementation of a DCS solves the problems of storage, control, protection, entry, editing and retrieval of information, as well as generation of the required reporting data. The most advanced and promising approach is to use decision-support information technologies based on a complex of mathematical models. A model decision-support system for control of blast furnace smelting has been designed and is in operation. The basis of the model system is a complex of mathematical models created using the principle of natural mathematical modeling. This principle provides for the construction of mathematical models on two levels. The first-level model is a basic state model, which makes it possible to assess the vector of system parameters using field data and blast furnace operation results. It is also used to calculate the adjustment (adaptation) coefficients of the predictive block of the system. The second-level model is a predictive model designed to assess the design parameters of the blast furnace process when melting conditions change relative to the current state. The tasks for which the software was developed are described. Characteristics of the main subsystems of the blast furnace process as an object of modeling and control - the thermal state of the furnace and the blast, gas-dynamic and slag conditions of blast furnace smelting - are presented.

  11. Building Information Modeling Comprehensive Overview

    Directory of Open Access Journals (Sweden)

    Sergey Kalinichuk

    2015-07-01

    The article provides a comprehensive review of the recently accelerated development of information technology within the project market, covering the industrial, engineering, procurement and construction sectors. The author's aim is to survey the last decades of growth of information and communication technology in the construction industry, in particular Building Information Modeling, and to show that the problem of choosing an effective project realization method has not only retained its urgency but has become one of the major conditions for intensive technology development. All of this has created a strong impulse toward shortening project duration and has led to the development of various schedule compression techniques, which have become a focus of modern construction.

  12. Hierarchical process memory: memory as an integral component of information processing

    Science.gov (United States)

    Hasson, Uri; Chen, Janice; Honey, Christopher J.

    2015-01-01

    Models of working memory commonly focus on how information is encoded into and retrieved from storage at specific moments. However, in the majority of real-life processes, past information is used continuously to process incoming information across multiple timescales. Considering single unit, electrocorticography, and functional imaging data, we argue that (i) virtually all cortical circuits can accumulate information over time, and (ii) the timescales of accumulation vary hierarchically, from early sensory areas with short processing timescales (tens to hundreds of milliseconds) to higher-order areas with long processing timescales (many seconds to minutes). In this hierarchical systems perspective, memory is not restricted to a few localized stores, but is intrinsic to information processing that unfolds throughout the brain on multiple timescales. "The present contains nothing more than the past, and what is found in the effect was already in the cause." (Henri L. Bergson) PMID:25980649
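    The hierarchy of accumulation timescales described above can be caricatured with leaky integrators whose time constants differ (the time constants and input pulse below are arbitrary choices for illustration, not fitted to any data):

```python
import numpy as np

def leaky_accumulate(signal, tau):
    """Leaky integrator y[t] = (1 - 1/tau) * y[t-1] + (1/tau) * x[t]:
    a toy stand-in for a cortical area with processing timescale tau."""
    y = np.zeros_like(signal, dtype=float)
    alpha = 1.0 / tau
    for t in range(1, len(signal)):
        y[t] = (1 - alpha) * y[t - 1] + alpha * signal[t]
    return y

# A brief input pulse, then silence.
x = np.zeros(200)
x[10:20] = 1.0
early = leaky_accumulate(x, tau=2)    # "early sensory" area: fast decay
higher = leaky_accumulate(x, tau=50)  # "higher-order" area: long memory
```

Long after the pulse, the slow integrator still carries a trace of it while the fast one has forgotten it entirely, which is the sense in which memory is intrinsic to processing at every level rather than confined to a dedicated store.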

  13. Information processing in convex operational theories

    Energy Technology Data Exchange (ETDEWEB)

    Barnum, Howard Nelch [Los Alamos National Laboratory; Wilce, Alexander G [SUSQUEHANNA UNIV

    2008-01-01

    In order to understand the source and extent of the greater-than-classical information processing power of quantum systems, one wants to characterize both classical and quantum mechanics as points in a broader space of possible theories. One approach to doing this, pioneered by Abramsky and Coecke, is to abstract the essential categorical features of classical and quantum mechanics that support various information-theoretic constraints and possibilities, e.g., the impossibility of cloning in the latter, and the possibility of teleportation in both. Another approach, pursued by the authors and various collaborators, is to begin with a very conservative, and in a sense very concrete, generalization of classical probability theory--which is still sufficient to encompass quantum theory--and to ask which 'quantum' informational phenomena can be reproduced in this much looser setting. In this paper, we review the progress to date in this second programme, and offer some suggestions as to how to link it with the categorical semantics for quantum processes developed by Abramsky and Coecke.

  14. Beyond Information Seeking: Towards a General Model of Information Behaviour

    Science.gov (United States)

    Godbold, Natalya

    2006-01-01

    Introduction: The aim of the paper is to propose new models of information behaviour that extend the concept beyond simply information seeking to consider other modes of behaviour. The models chiefly explored are those of Wilson and Dervin. Argument: A shortcoming of some models of information behaviour is that they present a sequence of stages…

  15. On Activity modelling in process modeling

    Directory of Open Access Journals (Sweden)

    Dorel Aiordachioaie

    2001-12-01

    Full Text Available The paper is looking to the dynamic feature of the meta-models of the process modelling process, the time. Some principles are considered and discussed as main dimensions of any modelling activity: the compatibility of the substances, the equipresence of phenomena and the solvability of the model. The activity models are considered and represented at meta-level.

  16. Information processing speed in ecstasy (MDMA) users.

    Science.gov (United States)

    Wareing, Michelle; Fisk, John E; Montgomery, Catharine; Murphy, Philip N; Chandler, Martin D

    2007-03-01

    Previous research draws parallels between ecstasy-related and age-related deficits in cognitive functioning. Age-related impairments in working memory have been attributed to a slow-down in information processing speed. The present study compared 29 current ecstasy users, 10 previous users and 46 non-users on two tests measuring information processing speed and a computation span task measuring working memory. Results showed that ecstasy users performed worse than non-ecstasy users in the letter comparison task, although the overall difference was not significant (p = 0.089). Results from the pattern recognition task showed that current ecstasy users produced significantly more errors than the other two groups, and that ecstasy users produced significantly more errors than non-ecstasy users, with both ecstasy-using groups performing significantly worse than non-users on the computation span measure. The mechanism responsible for impairments in the computation span measure is therefore not the same as that in elderly adults, where processing speed generally removes most of the age-related variance. Also of relevance is the fact that the ecstasy users reported here had used a range of other drugs, making it difficult to unambiguously attribute the results obtained to ecstasy use.

  17. A descriptive model of information problem solving while using internet

    NARCIS (Netherlands)

    Brand-Gruwel, Saskia; Wopereis, Iwan; Walraven, Amber

    2009-01-01

    This paper presents the IPS-I-model: a model that describes the process of information problem solving (IPS) in which the Internet (I) is used to search for information. The IPS-I-model is based on three studies, in which students in secondary and (post) higher education were asked to solve information problems.

  18. Quantum information processing through nuclear magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

    Bulnes, J.D.; Sarthour, R.S.; Oliveira, I.S. [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Bonk, F.A.; Azevedo, E.R. de; Bonagamba, T.J. [Sao Paulo Univ., Sao Carlos, SP (Brazil). Inst. de Fisica; Freitas, J.C.C. [Espirito Santo Univ., Vitoria, ES (Brazil). Dept. de Fisica

    2005-09-15

    We discuss the applications of Nuclear Magnetic Resonance (NMR) to quantum information processing, focusing on the use of quadrupole nuclei for quantum computing. Various examples of experimental implementation of logic gates are given and compared to calculated NMR spectra and their respective density matrices. The technique of Quantum State Tomography for quadrupole nuclei is briefly described, and examples of measured density matrices in a two-qubit I = 3/2 spin system are shown. Experimental results of density matrices representing pseudo-Bell states are given, and an analysis of the entropy of these states is made. Considering an NMR experiment as a depolarization quantum channel we calculate the entanglement fidelity and discuss the criteria for entanglement in liquid state NMR quantum information. A brief discussion on the perspectives for NMR quantum computing is presented at the end. (author)
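    A small sketch of the kinds of quantities the abstract discusses: a pseudo-Bell density matrix degraded by a depolarizing channel, its fidelity with the ideal Bell state, and its von Neumann entropy (the depolarization strength p is an assumed toy value, not a measured one, and this is not the paper's quadrupole-specific treatment):

```python
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)    # |Phi+> in the 2-qubit basis
rho_ideal = np.outer(bell, bell.conj())        # ideal Bell density matrix

p = 0.2                                        # assumed depolarization strength
rho = (1 - p) * rho_ideal + p * np.eye(4) / 4  # depolarized pseudo-Bell state

# Fidelity with the ideal state: <Phi+| rho |Phi+> = (1 - p) + p/4.
fidelity = np.real(bell.conj() @ rho @ bell)

# Von Neumann entropy S(rho) = -sum_i lam_i log2 lam_i.
evals = np.linalg.eigvalsh(rho)
entropy = -np.sum(evals * np.log2(np.clip(evals, 1e-12, None)))
```

The pure Bell state has zero entropy; depolarization both lowers the fidelity and raises the entropy, which is why entropy analysis is a natural diagnostic for the measured pseudo-Bell matrices.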

  19. Quantum-Information Processing with Semiconductor Macroatoms

    CERN Document Server

    Biolatti, E; Zanardi, P; Rossi, F; Biolatti, Eliana; Iotti, Rita C.; Zanardi, Paolo; Rossi, Fausto

    2000-01-01

    An all optical implementation of quantum information processing with semiconductor macroatoms is proposed. Our quantum hardware consists of an array of semiconductor quantum dots and the computational degrees of freedom are energy-selected interband optical transitions. The proposed quantum-computing strategy exploits exciton-exciton interactions driven by ultrafast sequences of multi-color laser pulses. Contrary to existing proposals based on charge excitations, the present all-optical implementation does not require the application of time-dependent electric fields, thus allowing for a sub-picosecond, i.e. decoherence-free, operation time-scale in realistic state-of-the-art semiconductor nanostructures.

  20. Quantum information processing with noisy cluster states

    CERN Document Server

    Tame, M S; Kim, M S; Vedral, V

    2005-01-01

    We provide an analysis of basic quantum information processing protocols under the effect of intrinsic non-idealities in cluster states. These non-idealities are based on the introduction of randomness in the entangling steps that create the cluster state and are motivated by the unavoidable imperfections faced in creating entanglement using condensed-matter systems. Aided by the use of an alternative and very efficient method to construct cluster state configurations, which relies on the concatenation of fundamental cluster structures, we address quantum state transfer and various fundamental gate simulations through noisy cluster states. We find that a winning strategy to limit the effects of noise is the management of small clusters processed via just a few measurements. Our study also reinforces recent ideas related to the optical implementation of a one-way quantum computer.

  1. Perception and information processing

    DEFF Research Database (Denmark)

    Scholderer, Joachim

    2010-01-01

    Consumer researchers are interested in the responses of people to commercial stimuli. Usually, these stimuli are products and services, including all attributes, issues, persons, communications, situations, and behaviours related to them. Perception is the first bottleneck in this process: as consumers, we can only respond to a stimulus if our senses are actually stimulated by it. Psychologically speaking, a stimulus only exists for us once we have formed an internal representation of it. The objective of this chapter is to introduce the systems that are involved in this processing of perceptual information and to characterise the operations they perform. To avoid confusion, it should be stressed that the term "perception" is often used in a colloquial sense in consumer research. In concepts like perceived quality, perceived value, or perceived risk, the modifier "perceived" simply highlights...

  2. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubinina

    2015-06-01

    The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed, given the complexity and differentiation of existing methods as well as the specific language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern modeling techniques that have found practical application are studied through a visualization model of a retailer's activity. The theoretical analysis of modeling methods showed that the UFO-toolkit method developed by Ukrainian scientists is, owing to its integrated systemological capabilities, the most suitable for structural and object analysis of retailers' business processes. A visualized simulation model of the business process "sales as-is" of retailers was designed using a combination of UFO-elements, with the aim of further practical formalization and optimization of the given business process.

  3. The function of credibility in information processing for risk perception.

    Science.gov (United States)

    Trumbo, Craig W; McComas, Katherine A

    2003-04-01

    This study examines how credibility affects the way people process information and how they subsequently perceive risk. Three conceptual areas are brought together in this analysis: the psychometric model of risk perception, Eagly and Chaiken's heuristic-systematic information processing model, and Meyer's credibility index. Data come from a study of risk communication in the circumstance of state health department investigations of suspected cancer clusters (five cases, N = 696). Credibility is assessed for three information sources: state health departments, citizen groups, and industries involved in each case. Higher credibility for industry and the state directly predicts lower risk perception, whereas high credibility for citizen groups predicts greater risk perception. A path model shows that perceiving high credibility for industry and state-and perceiving low credibility for citizen groups-promotes heuristic processing, which in turn is a strong predictor of lower risk perception. Alternately, perceiving industry and the state to have low credibility also promotes greater systematic processing, which consistently leads to perception of greater risk. Between one-fifth and one-third of the effect of credibility on risk perception is shown to be indirectly transmitted through information processing.
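    The indirect (mediated) path structure described above can be sketched as two OLS regressions on synthetic data; the path coefficients below are invented so that roughly a quarter to a third of the total effect is transmitted indirectly, loosely echoing the abstract's finding (this is not the study's data or analysis code):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 696  # same sample size as the study; the data here is synthetic

# Assumed true paths: credibility -> heuristic processing (a),
# heuristic processing -> risk (b), and a direct path (c').
credibility = rng.normal(size=n)
heuristic = 0.5 * credibility + rng.normal(scale=0.8, size=n)
risk = -0.4 * heuristic - 0.45 * credibility + rng.normal(scale=0.8, size=n)

def ols_slopes(y, X):
    """OLS slope coefficients for y ~ X (intercept included, then dropped)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta[1:]

a = ols_slopes(heuristic, credibility[:, None])[0]          # path a
b, c_prime = ols_slopes(risk, np.column_stack([heuristic, credibility]))

indirect = a * b                   # effect carried through processing style
total = c_prime + indirect
proportion_mediated = indirect / total
```

The product-of-coefficients estimate `a * b` is the standard way to quantify how much of the credibility effect travels through the processing-style mediator.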

  4. Attachment and the Processing of Social Information in Adolescence

    Science.gov (United States)

    Dykas, Matthew J.; Cassidy, Jude

    2007-01-01

    A key proposition of attachment theory is that experience-based cognitive representations of attachment, often referred to as internal working models of attachment, influence the manner in which individuals process attachment-relevant social information (Bowlby, 1969/1982, 1973, 1980; Bretherton & Munholland, 1999; Main, Kaplan, & Cassidy, 1985).…

  5. Interactivity, Information Processing, and Learning on the World Wide Web.

    Science.gov (United States)

    Tremayne, Mark; Dunwoody, Sharon

    2001-01-01

    Examines the role of interactivity in the presentation of science news on the World Wide Web. Proposes and tests a model of interactive information processing that suggests that characteristics of users and Web sites influence interactivity, which influences knowledge acquisition. Describes use of a think-aloud method to study participants' mental…

  6. Information Processing in Adolescents with Bipolar I Disorder

    Science.gov (United States)

    Whitney, Jane; Joormann, Jutta; Gotlib, Ian H.; Kelley, Ryan G.; Acquaye, Tenah; Howe, Meghan; Chang, Kiki D.; Singh, Manpreet K.

    2012-01-01

    Background: Cognitive models of bipolar I disorder (BD) may aid in identification of children who are especially vulnerable to chronic mood dysregulation. Information-processing biases related to memory and attention likely play a role in the development and persistence of BD among adolescents; however, these biases have not been extensively…

  7. Inside the Search Process: Information Seeking from the User's Perspective.

    Science.gov (United States)

    Kuhlthau, Carol C.

    1991-01-01

    Discussion of the information search process (ISP) from the user's perspective focuses on a model of the ISP derived from longitudinal studies of high school and college students. Cognitive and affective aspects of the ISP are discussed, and their implications for future research are suggested. (31 references) (LRW)

  8. Motivated information processing, social tuning, and group creativity

    NARCIS (Netherlands)

    Bechtoldt, Myriam N.; De Dreu, Carsten K. W.; Nijstad, Bernard A.; Choi, Hoon-Seok

    2010-01-01

    The extent to which groups are creative has wide implications for their overall performance, including the quality of their problem solutions, judgments, and decisions. To further understanding of group creativity, we integrate the motivated information processing in groups model (De Dreu, Nijstad,

  11. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook will pr...

  12. A Process Model for Establishing Business Process Crowdsourcing

    OpenAIRE

    Nguyen Hoang Thuan; Pedro Antunes; David Johnstone

    2017-01-01

    Crowdsourcing can be an organisational strategy to distribute work to Internet users and harness innovation, information, capacities, and variety of business endeavours. As crowdsourcing is different from other business strategies, organisations are often unsure as to how to best structure different crowdsourcing activities and integrate them with other organisational business processes. To manage this problem, we design a process model guiding how to establish business process crowdsourcing....

  13. The processing of information from sensors in intelligent systems

    Science.gov (United States)

    Kokovin, V. A.; Sytin, A. N.

    2017-01-01

    The article describes the processing of information obtained from sensors in intelligent systems. The paper analyzes the need for advanced processing using a parallel-operation calculator, which reduces the response time to input events. A speculative processing algorithm is realized in an FPGA with streaming control based on a dataflow model. This solution can be used in applications related to the telecommunication networks of distributed control systems.

  14. Utility-based early modulation of processing distracting stimulus information.

    Science.gov (United States)

    Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas

    2014-12-10

    Humans are selective information processors who efficiently filter out goal-inappropriate stimulus information to maintain control over their actions. Nonetheless, stimuli, which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors"), frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases we applied EEG recording in a stimulus identification task, involving successive distractor-target presentation, and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information.

  15. Learning to rank for information retrieval and natural language processing

    CERN Document Server

    Li, Hang

    2014-01-01

    Learning to rank refers to machine learning techniques for training a model in a ranking task. Learning to rank is useful for many applications in information retrieval, natural language processing, and data mining. Intensive studies have been conducted on its problems recently, and significant progress has been made. This lecture gives an introduction to the area including the fundamental problems, major approaches, theories, applications, and future work.The author begins by showing that various ranking problems in information retrieval and natural language processing can be formalized as tw
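    As a minimal concrete instance of the pairwise formalization of ranking that the lecture describes, here is a RankNet-style ranker trained by stochastic gradient steps on a logistic pairwise loss (the toy data and hyperparameters are assumptions for illustration, not the book's examples):

```python
import numpy as np

def rank_pairwise(X, y, lr=0.1, epochs=200):
    """Minimal RankNet-style pairwise ranker: learn w so that
    score(x_i) > score(x_j) whenever y_i > y_j, by gradient steps
    on the logistic pairwise loss -log sigmoid(w . (x_i - x_j))."""
    w = np.zeros(X.shape[1])
    pairs = [(i, j) for i in range(len(y)) for j in range(len(y)) if y[i] > y[j]]
    for _ in range(epochs):
        for i, j in pairs:
            d = X[i] - X[j]
            p = 1.0 / (1.0 + np.exp(-(w @ d)))  # P(item i ranked above item j)
            w += lr * (1.0 - p) * d             # gradient of -log p w.r.t. w
    return w

# Toy data: relevance is 1 exactly when the first feature is positive.
rng = np.random.default_rng(3)
X = rng.normal(size=(30, 4))
y = (X[:, 0] > 0).astype(int)
w = rank_pairwise(X, y)
scores = X @ w  # ranking by descending score should place relevant items first
```

Casting ranking as classification of item pairs is what lets ordinary supervised learning machinery be reused for retrieval problems.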

  16. Information Flow in the Launch Vehicle Design/Analysis Process

    Science.gov (United States)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  17. Working Memory Capacity and Redundant Information Processing Efficiency

    Directory of Open Access Journals (Sweden)

    Michael John Endres

    2015-05-01

    Full Text Available Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (target) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.
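    The LBA model referenced above can be sketched in a few lines: each response alternative is a linear ballistic accumulator with a uniformly distributed start point and a normally distributed drift rate, and the first accumulator to reach threshold determines the response and its time. All parameter values below are hypothetical, not estimates from the study.

    ```python
    import numpy as np

    def lba_trial(drifts, A=0.5, b=1.0, s=0.25, t0=0.2, rng=None):
        """One LBA trial: returns (winning accumulator, response time)."""
        rng = rng or np.random.default_rng()
        starts = rng.uniform(0, A, size=len(drifts))   # uniform start points
        rates = rng.normal(drifts, s)                  # trial-to-trial drift noise
        rates = np.where(rates > 0, rates, 1e-6)       # keep rates positive
        times = (b - starts) / rates                   # linear rise to threshold b
        winner = int(np.argmin(times))                 # first to threshold responds
        return winner, t0 + times[winner]

    rng = np.random.default_rng(42)
    # A higher drift rate for accumulator 0 mimics facilitated (efficient) search.
    trials = [lba_trial([1.5, 0.5], rng=rng) for _ in range(2000)]
    accuracy = float(np.mean([w == 0 for w, _ in trials]))
    mean_rt = float(np.mean([rt for _, rt in trials]))
    ```

    Fitting the model, as in the study, would mean adjusting the drift, threshold, and start-point parameters until simulated accuracy and response-time distributions match the observed ones.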

  18. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  19. Random Matrices for Information Processing – A Democratic Vision

    DEFF Research Database (Denmark)

    Cakmak, Burak

    The thesis studies three important applications of random matrices to information processing. Our main contribution is that we consider probabilistic systems involving more general random matrix ensembles than the classical ensembles with iid entries, i.e. models that account for statistical dependence between the entries. Specifically, the involved matrices are invariant or fulfill a certain asymptotic freeness condition as their dimensions grow to infinity. Informally speaking, all latent variables contribute to the system model in a democratic fashion – there are no preferred latent variables…

  20. Open source clinical portals: a model for healthcare information systems to support care processes and feed clinical research. An Italian case of design, development, reuse, and exploitation.

    Science.gov (United States)

    Locatelli, Paolo; Baj, Emanuele; Restifo, Nicola; Origgi, Gianni; Bragagia, Silvia

    2011-01-01

    Open source is a still unexploited opportunity for healthcare organizations and technology providers to answer a growing demand for innovation and to combine economic benefits with a new way of managing hospital information systems. This chapter will present the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan, with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. This also represents a rare case of open source technology and reuse in the healthcare sector, as the system's porting is now taking place at the Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that they can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes.

  1. Quantum Information Processing using Nonlinear Optical Effects

    DEFF Research Database (Denmark)

    Andersen, Lasse Mejling

    This PhD thesis treats applications of nonlinear optical effects for quantum information processing. The two main applications are four-wave mixing in the form of Bragg scattering (BS) for quantum-state-preserving frequency conversion, and sum-frequency generation (SFG) in second-order nonlinear … of the converted idler depends on the other pump. This allows for temporal-mode multiplexing. When the effects of nonlinear phase modulation (NPM) are included, the phases of the natural input and output modes are changed, reducing the separability. These effects are to some degree mediated by pre… to obtain a 100 % conversion efficiency is to use multiple stages of frequency conversion, but this setup suffers from the combined effects of NPM. This problem is circumvented by using asymmetrically pumped BS, where one pump is continuous wave. For this setup, NPM is found to only lead to linear phase…

  2. Preattentive Processing of Numerical Visual Information.

    Science.gov (United States)

    Hesse, Philipp N; Schmitt, Constanze; Klingenhoefer, Steffen; Bremmer, Frank

    2017-01-01

    Humans can perceive and estimate approximate numerical information, even when accurate counting is impossible, e.g., due to short presentation time. If the number of objects to be estimated is small, typically around 1-4 items, observers are able to give very fast and precise judgments with high confidence, an effect that is called subitizing. Due to its speed and effortless nature, subitizing has usually been assumed to be preattentive, putting it into the same category as other low-level visual features like color or orientation. More recently, however, a number of studies have suggested that subitizing might be dependent on attentional resources. In our current study we investigated the potentially preattentive nature of visual numerical perception in the subitizing range by means of EEG. We presented peripheral, task-irrelevant sequences of stimuli consisting of a certain number of circular patches while participants were engaged in a demanding, non-numerical detection task at the fixation point, drawing attention away from the number stimuli. Within a sequence of stimuli of a given number of patches (called "standards") we interspersed some stimuli of different numerosity ("oddballs"). We compared the evoked responses to visually identical stimuli that had been presented in two different conditions, serving as standard in one condition and as oddball in the other. We found significant visual mismatch negativity (vMMN) responses over parieto-occipital electrodes. In addition to the event-related potential (ERP) analysis, we performed a time-frequency analysis (TFA) to investigate whether the vMMN was accompanied by additional oscillatory processes. We found a concurrent increase in evoked theta power of similar strength over both hemispheres. Our results provide clear evidence for a preattentive processing of numerical visual information in the subitizing range.

  3. Quantum information processing with optical vortices

    Energy Technology Data Exchange (ETDEWEB)

    Khoury, Antonio Z. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil)

    2012-07-01

    Full text: In this work we discuss several proposals for quantum information processing using the transverse structure of paraxial beams. Different techniques for production and manipulation of optical vortices have been employed and combined with polarization transformations in order to investigate fundamental properties of quantum entanglement as well as to propose new tools for quantum information processing. As an example, we have recently proposed and demonstrated a controlled NOT (CNOT) gate based on a Michelson interferometer in which the photon polarization is the control bit and the first order transverse mode is the target. The device is based on a single lens design for an astigmatic mode converter that transforms the transverse mode of paraxial optical beams. In analogy with Bell's inequality for two-qubit quantum states, we propose an inequality criterion for the non-separability of the spin-orbit degrees of freedom of a laser beam. A definition of separable and non-separable spin-orbit modes is used in consonance with the one presented in Phys. Rev. Lett. 99, 2007. As the usual Bell's inequality can be violated for entangled two-qubit quantum states, we show both theoretically and experimentally that the proposed spin-orbit inequality criterion can be violated for non-separable modes. The inequality is discussed both in the classical and quantum domains. We propose a polarization to orbital angular momentum teleportation scheme using entangled photon pairs generated by spontaneous parametric down conversion. By making a joint detection of the polarization and angular momentum parity of a single photon, we are able to detect all the Bell-states and perform, in principle, perfect teleportation from a discrete to a continuous system using minimal resources. The proposed protocol implementation demands experimental resources that are currently available in quantum optics laboratories. (author)
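    The spin-orbit CNOT gate described in this abstract can be modeled abstractly as the standard two-qubit CNOT unitary, with the polarization qubit as control and the first-order transverse mode as target; the mapping of basis states to physical modes below is an illustrative assumption, not the paper's notation.

    ```python
    import numpy as np

    # Polarization qubit (H=0, V=1) is the control; first-order
    # transverse mode (even=0, odd=1) is the target.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    def basis(control, target):
        """|control, target> as a 4-vector, e.g. basis(1, 0) = |V, even>."""
        v = np.zeros(4, dtype=complex)
        v[2 * control + target] = 1.0
        return v

    # A V-polarized photon flips its transverse mode; an H-polarized one does not.
    out = CNOT @ basis(1, 0)   # |V, even> -> |V, odd>
    ```

    In the interferometric implementation, this unitary is realized physically by the astigmatic mode converter acting conditionally on polarization, rather than by matrix multiplication.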

  4. Natural language processing and advanced information management

    Science.gov (United States)

    Hoard, James E.

    1989-01-01

    Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides a natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object-oriented database (OODB). OODBs have, apparently, an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system and an inherent capacity to use them in an effective way.

  5. Information Support of Processes in Warehouse Logistics

    Directory of Open Access Journals (Sweden)

    Gordei Kirill

    2013-11-01

    Full Text Available In the conditions of globalization and worldwide economic ties, the role of information support of business processes is increasing in various branches and fields of activity, and warehouse activity is no exception. Such information support is realized in warehouse logistic systems. In relation to a territorial administrative unit, the warehouse logistic system takes the form of a complex social and economic structure that controls the economic flows covering the intermediary, trade, and transport organizations, as well as enterprises of other branches and spheres. The spatial movement of inventory items places new demands on the participants of merchandising. Warehousing (in the sense of storage) is one of the operations of logistic activity, required for the organization of a material flow; regarding warehousing as "management of the spatial movement of stocks" is therefore justified. Understood in this way, warehousing sheds its traditional perception as the mere holding of stocks, a business expense. This aspiration is reflected in logistic systems working by principles such as "just in time" and "lean production". The role of warehouses as mere places of storage is thus transformed into an understanding of warehousing as an innovative logistic system.

  6. INNOVATION PROCESS MODELLING

    Directory of Open Access Journals (Sweden)

    JANUSZ K. GRABARA

    2011-01-01

    Full Text Available Modelling phenomena in accordance with the structural approach enables one to simplify the observed relations and to present the classification grounds. An example may be a model of organisational structure identifying the logical relations between particular units and presenting the division of authority and work.

  7. Is Analytic Information Processing a Feature of Expertise in Medicine?

    Science.gov (United States)

    McLaughlin, Kevin; Rikers, Remy M.; Schmidt, Henk G.

    2008-01-01

    Diagnosing begins by generating an initial diagnostic hypothesis by automatic information processing. Information processing may stop here if the hypothesis is accepted, or analytical processing may be used to refine the hypothesis. This description portrays analytic processing as an optional extra in information processing, leading us to…

  8. New certification process starts. Pt. 2. Application of Building Information Modeling (BIM) for the certification according to DGNB; Zertifizierungsprozess auf ''neuen Beinen''. T. 2. Anwendung von Building Informationen (BIM) zur Zertifizierung nach DGNB

    Energy Technology Data Exchange (ETDEWEB)

    Essig, Bernd [SCHOLZE Consulting GmbH, Leinfelden-Echterdingen (Germany). Geschaeftsbereich Ingenieur-Beratungsleistungen in Facility-, Informations-, Qualitaets-, Nachhaltigkeitsmanagement und Energieberatung; Ernst, Tatjana [SCHOLZE Consulting GmbH, Leinfelden-Echterdingen (Germany)

    2013-04-01

    The overall life cycle plays a significant role in certification by the German Sustainable Building Council (DGNB; Stuttgart, Federal Republic of Germany). So that certification becomes much more than mere documentation, comprehensive data are requested already in an early phase of planning. Early investigation of alternatives may have a significant influence on the life cycle of a building. But where do the auditors obtain these amounts of data without overburdening the partners involved in the planning process with questions? Building Information Modelling contributes to the future enhancement of process quality through innovative information management.

  9. Towards the understanding of network information processing in biology

    Science.gov (United States)

    Singh, Vijay

    Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from the first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can be identified using coarse graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse graining approaches identify features that are essential for certain processes performed by underlying biological networks. We find that long-range connections in the brain allow for global scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with temporal scales. Our observations indicate that the rules in multivariate signal processing are quite different from traditional single unit signal processing.

  10. Analytic information processing style in epilepsy patients.

    Science.gov (United States)

    Buonfiglio, Marzia; Di Sabato, Francesco; Mandillo, Silvia; Albini, Mariarita; Di Bonaventura, Carlo; Giallonardo, Annateresa; Avanzini, Giuliano

    2017-08-01

    Relevant to the study of epileptogenesis is learning processing, given the pivotal role that neuroplasticity assumes in both mechanisms. Recently, evoked potential analyses showed a link between analytic cognitive style and altered neural excitability in both migraine and healthy subjects, regardless of cognitive impairment or psychological disorders. In this study we evaluated the analytic/global and visual/auditory perceptual dimensions of cognitive style in patients with epilepsy. Twenty-five cryptogenic temporal lobe epilepsy (TLE) patients, matched with 25 idiopathic generalized epilepsy (IGE) sufferers and 25 healthy volunteers, were recruited and participated in three cognitive style tests: the "Sternberg-Wagner Self-Assessment Inventory", the C. Cornoldi test series called AMOS, and the Mariani Learning Style Questionnaire. Our results demonstrate a significant association between analytic cognitive style and both IGE and TLE, with respectively a predominant auditory and visual analytic style (ANOVA: p values <0.0001). These findings should encourage further research to investigate information processing style and its neurophysiological correlates in epilepsy.

  11. Contexts for concepts: Information modeling for semantic interoperability

    NARCIS (Netherlands)

    Oude Luttighuis, P.H.W.M.; Stap, R.E.; Quartel, D.

    2011-01-01

    Conceptual information modeling is a well-established practice, aimed at preparing the implementation of information systems, the specification of electronic message formats, and the design of information processes. Today's ever more connected world, however, poses new challenges for conceptual information modeling…

  12. Modeling of Visual Information Processing in Retinal Prosthesis%视网膜假体视觉信息处理模型

    Institute of Scientific and Technical Information of China (English)

    裴智军; 乔清理; 艾慧坚

    2011-01-01

    Visual prostheses offer a way to partially repair visual impairment. Modeling and simulating how the biological retina encodes visual information with spike trains is a key issue for the development of a retinal prosthesis. Based on the state-of-the-art retinal physiological mechanism, a two-layered visual information model is put forward. It includes the Outer Plexiform Layer (OPL) for information extraction and the Inner Plexiform Layer (IPL) for information encoding. Using spatiotemporal filtering, static nonlinear rectification, and Poisson spike generation, we establish the direct relation between input stimulus images and output spike trains. The simulation, performed in MATLAB with its graphics modules, yields the contour edge images and spike sequences corresponding to the stimulus images. A theoretically feasible model for developing an artificial retina for the retinally blind is given.
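    The OPL/IPL cascade described above is essentially a linear-nonlinear-Poisson pipeline, which can be sketched as follows; the edge-extracting kernel, rate constants, and stimulus are illustrative assumptions rather than the model's actual parameters.

    ```python
    import numpy as np

    def opl_filter(image):
        """Information extraction: spatial filtering yields a contour map.
        A Laplacian kernel stands in for the model's spatiotemporal filter."""
        k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], float)
        out = np.zeros_like(image)
        for i in range(1, image.shape[0] - 1):
            for j in range(1, image.shape[1] - 1):
                out[i, j] = np.sum(k * image[i - 1:i + 2, j - 1:j + 2])
        return out

    def ipl_encode(edge_map, max_rate=50.0, dt=0.01, steps=100, rng=None):
        """Information encoding: static rectification, then Poisson spikes."""
        rng = rng or np.random.default_rng()
        rates = np.clip(edge_map, 0.0, None)              # static nonlinearity
        rates = max_rate * rates / max(rates.max(), 1e-9)  # firing rates in Hz
        # Each pixel spikes with probability rate*dt in each time step.
        return rng.random((steps,) + rates.shape) < rates * dt

    image = np.zeros((16, 16))
    image[4:12, 4:12] = 1.0                               # bright square stimulus
    edges = opl_filter(image)                             # contour edge image
    spikes = ipl_encode(edges, rng=np.random.default_rng(0))  # spike trains
    ```

    Uniform regions of the stimulus produce no output, while the square's contour drives spiking, mirroring the edge images and spike sequences the simulation reports.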

  13. Business Process Information Systems Work Program On BPPAUDNI In Indonesia

    Directory of Open Access Journals (Sweden)

    Bahar

    2015-08-01

    Full Text Available The very large scope of the service area; the use of conventional socialization and communication media with a limited range in space and time, such as brochures, newsletters, magazines, newspapers, books, radio, and television, as well as seminars held directly at stakeholder locations; business process management information that has not been well established; and a poorly ordered data management system have caused the Non-Formal and Informal Early Childhood Education Development Center (BP-PAUDNI) to be poorly socialized. Partnerships are not carried out effectively, especially with stakeholders located in areas outside urban centers; responses in the communication process are ineffective; and no centralized data bank is available that provides comprehensive information on BP-PAUDNI activities and can be easily accessed by the general public and related stakeholders. This paper describes a business process management model for implementing the work program of Non-Formal and Informal Early Childhood Education (PAUDNI), as a foundation for building a Web-based information system supported by communication network technology.

  14. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

    Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information, and by automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer, to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into…

  15. Aerospace Materials Process Modelling

    Science.gov (United States)

    1988-08-01

    The Tresca friction coefficient is commonly determined by forging a ring of fixed geometry. One measures the … elsewhere. The relative reductions of the inner diameter are plotted on the TVM chart (fig. 2a). The friction coefficients … validated material database. Information such as constitutive equations, intrinsic workability maps, effective heat-transfer coefficients, interface…

  16. Information Integration, Retention, and Levels of Information Processing.

    Science.gov (United States)

    Levin, Irwin P.

    A combination of information integration methodology and measures of retention was used to investigate how subjects differentially attend to and weight information in judgmental tasks. Subjects were shown sets of test scores for hypothetical students and were asked to rate the performance of each student or predict each student's performance on a…

  17. BPMN Impact on Process Modeling

    OpenAIRE

    Polak, Przemyslaw

    2013-01-01

    Recent years have seen a huge rise in the popularity of BPMN in the area of business process modeling, especially among business analysts. This notation has characteristics that distinguish it significantly from previously popular process modeling notations, such as EPC. The article contains an analysis of some important characteristics of BPMN and provides the author's conclusions on the impact that the popularity and specificity of BPMN can have on the practice of process modeling…

  18. Radiolysis Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time scale up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2−, e−(aq), H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment; the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^5 and to preserve most of the predictions for major species. This allows a systematic approach to model simplification and offers guidance in designing experiments for validation.
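    The kind of kinetics calculation such a radiolysis model performs can be sketched with a single-species balance: constant radiolytic production of H2O2 and first-order loss, integrated to steady state. The rate constants below are hypothetical placeholders, not values from the report.

    ```python
    def h2o2_balance(production=1e-9, loss=1e-4, dt=1.0, steps=200000):
        """Integrate d[H2O2]/dt = P - k*c with explicit Euler steps.
        P (mol/L/s) mimics radiolytic production; k (1/s) lumps all
        consumption reactions into a single first-order sink."""
        c = 0.0
        for _ in range(steps):
            c += dt * (production - loss * c)
        return c

    c_ss = h2o2_balance()
    # The analytic steady state is P/k = 1e-9 / 1e-4 = 1e-5 (nominally mol/L).
    ```

    The full model couples ~100 such balances; the sensitivity analysis in the report asks which of those reactions can be dropped while keeping the computed [H2O2] essentially unchanged.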

  19. A Process Model of Quantum Mechanics

    OpenAIRE

    Sulis, William

    2014-01-01

    A process model of quantum mechanics utilizes a combinatorial game to generate a discrete and finite causal space upon which can be defined a self-consistent quantum mechanics. An emergent space-time M and continuous wave function arise through a non-uniform interpolation process. Standard non-relativistic quantum mechanics emerges under the limit of infinite information (the causal space grows to infinity) and infinitesimal scale (the separation between points goes to zero). The model has…

  20. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
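    The convection-diffusion type of model described above can be illustrated in one dimension: axial convection, axial diffusion, and a first-order reaction sink, discretized with an upwind scheme. The parameter values, discretization, and boundary conditions are illustrative assumptions, not taken from the book.

    ```python
    import numpy as np

    # dc/dt = -u*dc/dz + D*d2c/dz2 - k*c, inlet c = 1, zero-gradient outlet.
    def column_profile(u=0.1, D=1e-3, k=0.05, L=1.0, n=50, dt=0.01, steps=5000):
        dz = L / n
        c = np.zeros(n)
        for _ in range(steps):
            left = np.concatenate(([1.0], c[:-1]))    # ghost inlet node (c = 1)
            right = np.concatenate((c[1:], [c[-1]]))  # zero-gradient outlet
            c = c + dt * (-u * (c - left) / dz        # upwind convection
                          + D * (right - 2 * c + left) / dz ** 2
                          - k * c)                    # first-order reaction sink
        return c

    c = column_profile()   # axial profile decaying along the column
    ```

    An average-concentration model, by contrast, would replace the radial detail with cross-section averages of velocity and concentration; the one-dimensional balance above already works at that averaged level.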

  1. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.

  2. Refreshing Information Literacy: Learning from Recent British Information Literacy Models

    Science.gov (United States)

    Martin, Justine

    2013-01-01

    Models play an important role in helping practitioners implement and promote information literacy. Over time models can lose relevance with the advances in technology, society, and learning theory. Practitioners and scholars often call for adaptations or transformations of these frameworks to articulate the learning needs in information literacy…

  3. The Infopriv model for information privacy

    OpenAIRE

    2012-01-01

    D.Phil. (Computer Science) The privacy of personal information is crucial in today's information systems. Traditional security models are mainly concerned with the protection of information inside a computer system. These models assume that the users of a computer system are trustworthy and will not disclose information to unauthorised parties. However, this assumption does not always apply to information privacy since people are the major cause of privacy violations. Alternative models are…

  4. Approaches to Chemical and Biochemical Information and Signal Processing

    Science.gov (United States)

    Privman, Vladimir

    2012-02-01

    We outline models and approaches for error control required to prevent buildup of noise when "gates" and other "network elements" based on (bio)chemical reaction processes are utilized to realize stable, scalable networks for information and signal processing. We also survey challenges and possible future research. [1] Control of Noise in Chemical and Biochemical Information Processing, V. Privman, Israel J. Chem. 51, 118-131 (2010). [2] Biochemical Filter with Sigmoidal Response: Increasing the Complexity of Biomolecular Logic, V. Privman, J. Halamek, M. A. Arugula, D. Melnikov, V. Bocharova and E. Katz, J. Phys. Chem. B 114, 14103-14109 (2010). [3] Towards Biosensing Strategies Based on Biochemical Logic Systems, E. Katz, V. Privman and J. Wang, in: Proc. Conf. ICQNM 2010 (IEEE Comp. Soc. Conf. Publ. Serv., Los Alamitos, California, 2010), pages 1-9.

  5. Parsimonious Language Models for Information Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Robertson, Stephen; Zaragoza, Hugo

    2004-01-01

    We systematically investigate a new approach to estimating the parameters of language models for information retrieval, called parsimonious language models. Parsimonious language models explicitly address the relation between levels of language models that are typically used for smoothing. As such, …
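    The estimation scheme for parsimonious language models can be sketched as an EM iteration that keeps only the probability mass a document's terms have beyond the background (collection) model; the toy corpus and the mixing weight below are assumptions for illustration.

    ```python
    from collections import Counter

    def parsimonious_lm(doc_tokens, background, lam=0.5, iters=50):
        """EM estimation: concentrate the document model on terms the
        background model does not already explain."""
        tf = Counter(doc_tokens)
        n = len(doc_tokens)
        p = {t: c / n for t, c in tf.items()}          # maximum-likelihood start
        for _ in range(iters):
            # E-step: expected term counts attributed to the document model
            e = {t: tf[t] * lam * p[t] / (lam * p[t] + (1 - lam) * background[t])
                 for t in tf}
            total = sum(e.values())
            p = {t: e[t] / total for t in tf}          # M-step: renormalize
        return p

    # Toy collection statistics; common words get high background probability.
    background = {"the": 0.2, "of": 0.1, "retrieval": 0.001, "model": 0.002}
    doc = ["the", "retrieval", "model", "the", "of", "retrieval"]
    p = parsimonious_lm(doc, background)
    # Content words such as "retrieval" keep most of the mass, while
    # frequent background words such as "the" are suppressed.
    ```

    Compared with plain linear smoothing, the resulting document models are smaller and more topical, which is the "parsimony" the paper's title refers to.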

  6. Standardization of information systems development processes and banking industry adaptations

    CERN Document Server

    Tanrikulu, Zuhal

    2011-01-01

    This paper examines the current system development processes of three major Turkish banks in terms of compliance to internationally accepted system development and software engineering standards to determine the common process problems of banks. After an in-depth investigation into system development and software engineering standards, related process-based standards were selected. Questions were then prepared covering the whole system development process by applying the classical Waterfall life cycle model. Each question is made up of guidance and suggestions from the international system development standards. To collect data, people from the information technology departments of three major banks in Turkey were interviewed. Results have been aggregated by examining the current process status of the three banks together. Problematic issues were identified using the international system development standards.

  7. Modeling Software Processes and Artifacts

    NARCIS (Netherlands)

    van den Berg, Klaas; Bosch, Jan; Mitchell, Stuart

    1997-01-01

    The workshop on Modeling Software Processes and Artifacts explored the application of object technology in process modeling. After the introduction and the invited lecture, a number of participants presented their position papers. First, an overview is given on some background work, and the aims, as

  8. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension…). Reconciliation phases were positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  9. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing… in the scientific literature. Reliable mathematical models of such multi-catalytic schemes can exploit the potential benefit of these processes. In this way, the best outcome of the process can be obtained by understanding the types of modification that are required for process optimization. In this way the model parameters that drive the main dynamic behavior can be identified, and thus a better understanding of this type of process gained. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid… An effective evaluation…

  10. Introduction to spiking neural networks: Information processing, learning and applications.

    Science.gov (United States)

    Ponulak, Filip; Kasinski, Andrzej

    2011-01-01

    The concept that neural information is encoded in the firing rate of neurons has been the dominant paradigm in neurobiology for many years. This paradigm has also been adopted by the theory of artificial neural networks. Recent physiological experiments demonstrate, however, that in many parts of the nervous system, neural code is founded on the timing of individual action potentials. This finding has given rise to the emergence of a new class of neural models, called spiking neural networks. In this paper we summarize basic properties of spiking neurons and spiking networks. Our focus is, specifically, on models of spike-based information coding, synaptic plasticity and learning. We also survey real-life applications of spiking models. The paper is meant to be an introduction to spiking neural networks for scientists from various disciplines interested in spike-based neural processing.
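    The timing-based codes discussed in the abstract are commonly illustrated with the leaky integrate-and-fire (LIF) neuron, the simplest spiking model. The sketch below is generic and not taken from the paper; the function name and parameter values are illustrative.

```python
def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0, r_m=1.0):
    """Forward-Euler simulation of a leaky integrate-and-fire neuron.

    Returns the spike times: the code is carried by *when* the neuron
    fires rather than by an average firing rate.
    """
    v = v_rest
    spike_times = []
    for step, i_ext in enumerate(input_current):
        # Membrane equation: tau * dv/dt = -(v - v_rest) + R * I(t)
        v += (dt / tau) * (-(v - v_rest) + r_m * i_ext)
        if v >= v_thresh:              # threshold crossing -> emit a spike
            spike_times.append(step * dt)
            v = v_reset                # reset the membrane potential
    return spike_times
```

    Stronger input current makes the membrane potential reach threshold sooner, so the spike train both starts earlier and fires more often.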

  11. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
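    The reactor point kinetic equations with delayed neutron groups, listed among the Simulink examples above, can equally be integrated outside Simulink. Below is a hedged Python stand-in with a single delayed neutron group; the parameter values and names are illustrative, not taken from the paper.

```python
def point_kinetics(rho, n0=1.0, beta=0.0065, lam=0.08, big_lambda=1e-4,
                   dt=1e-4, steps=50000):
    """Forward-Euler integration of the point kinetics equations with
    one delayed neutron group:

        dn/dt = ((rho - beta) / Lambda) * n + lam * C
        dC/dt = (beta / Lambda) * n - lam * C

    Returns the relative neutron population after steps * dt seconds.
    """
    n = n0
    c = beta / (big_lambda * lam) * n0   # equilibrium precursor level
    for _ in range(steps):
        dn = ((rho - beta) / big_lambda) * n + lam * c
        dc = (beta / big_lambda) * n - lam * c
        n += dt * dn
        c += dt * dc
    return n
```

    With rho = 0 the population stays constant; a small positive reactivity below prompt critical (rho < beta) produces the familiar prompt jump followed by a slow rise governed by the delayed neutrons.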

  12. Modeling nuclear processes by Simulink

    Science.gov (United States)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  13. Building information modelling (BIM): now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2012-12-01

    Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with a focus on its core concepts, applications in the project life cycle and benefits for project stakeholders with the help of case studies. The paper also elaborates on risks and barriers to BIM implementation and future trends.

  14. Building information modelling (BIM): now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2015-10-01

    Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with a focus on its core concepts, applications in the project life cycle and benefits for project stakeholders with the help of case studies. The paper also elaborates on risks and barriers to BIM implementation and future trends.

  15. A new model of information behaviour based on the Search Situation Transition schema

    Directory of Open Access Journals (Sweden)

    Nils Pharo

    2004-01-01

    This paper presents a conceptual model of information behaviour. The model is part of the Search Situation Transition method schema. The method schema is developed to discover and analyse the interplay between phenomena traditionally analysed as factors influencing either information retrieval or information seeking. In this paper the focus is on the model's five main categories: the work task, the searcher, the social/organisational environment, the search task, and the search process. In particular, the search process and its sub-categories, search situation and transition, and the relationship between these are discussed. To justify the method schema an empirical study was designed according to the schema's specifications. In the paper a subset of the study is presented, analysing the effects of work tasks on Web information searching. Findings from this small-scale study indicate a strong relationship between the work task goal and the level of relevance used for judging resources during search processes.

  16. Essays on Imperfect Information Processing in Economics

    NARCIS (Netherlands)

    S.S. Ficco (Stefano)

    2007-01-01

    Economic agents generally operate in uncertain environments and, prior to making decisions, invest time and resources to collect useful information. Consumers compare the prices charged by different firms before purchasing a product. Politicians gather information from different sources…

  17. Information management and design & engineering processes

    NARCIS (Netherlands)

    Lutters, Diederick; ten Brinke, E.; Streppel, A.H.; Kals, H.J.J.

    2000-01-01

    In analysing design and manufacturing tasks and their mutual interactions, it appears that the underlying information of these tasks is of the utmost importance. If this information is managed in a formalized, structured way, it can serve as a basis for the control of design and manufacturing processes…

  18. NLP Meets the Jabberwocky: Natural Language Processing in Information Retrieval.

    Science.gov (United States)

    Feldman, Susan

    1999-01-01

    Focuses on natural language processing (NLP) in information retrieval. Defines the seven levels at which people extract meaning from text/spoken language. Discusses the stages of information processing; how an information retrieval system works; advantages to adding full NLP to information retrieval systems; and common problems with information…

  19. Human Information Processing Guidelines for Decision-Aiding Displays.

    Science.gov (United States)

    1981-12-01

    Pachella, R. G., "The Interpretation of Reaction Time in Information Processing Research," and "…Finite Number of Inputs," in B. Kantowitz (Ed.), Human Information Processing: Tutorials in Performance and Cognition. Hillsdale, New Jersey: Lawrence Erlbaum Associates, 1974.

  20. Information Systems’ Portfolio: Contributions of Enterprise and Process Architecture

    Directory of Open Access Journals (Sweden)

    Silvia Fernandes

    2017-09-01

    We are witnessing a need for a quick and intelligent reaction from organizations to the level and speed of change in business processes. New information technologies and systems (IT/IS) are challenging business models and products. One of the great shake-ups comes from online and/or mobile apps and platforms. These are having a tremendous impact in launching innovative and competitive services through the combination of digital and physical features. This leads organizations to actively rethink the enterprise information systems' portfolio, its management and suitability. One relevant way for enterprises to manage their IT/IS in order to cope with those challenges is enterprise and process architecture. A decision-making culture based on processes helps to understand and define the different elements that shape an organization and how those elements inter-relate inside and outside it. IT/IS portfolio management involves an increasing need for modeling data and process flows, for better discernment and action in their selection and alignment with business goals. The new generation of enterprise architecture (NGEA) helps to design intelligent processes that answer quickly and creatively to new and challenging trends. This has to be open, agile and context-aware to allow well-designed services that match users' expectations. This study includes two real cases/problems from companies that required quick solutions, and solutions are presented in line with this architectural approach.

  1. Optimal Disturbance Accommodation with Limited Model Information

    CERN Document Server

    Farokhi, F; Johansson, K H

    2011-01-01

    The design of an optimal dynamic disturbance-accommodation controller with limited model information is considered. We adapt the family of limited model information control design strategies, defined earlier by the authors, to handle dynamic controllers. This family of limited model information design strategies constructs subcontrollers distributively by accessing only local plant model information. The closed-loop performance of the dynamic controllers that they can produce is studied using a performance metric called the competitive ratio, which is the worst-case ratio of the cost of a control design strategy to the cost of the optimal control design with full model information.
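    In symbols (our notation, not necessarily the authors'), the competitive ratio of a limited-model-information design strategy over a plant family can be written as

```latex
r_{\mathcal{P}}(\Gamma) \;=\; \sup_{P \in \mathcal{P}}
  \frac{J_{P}\bigl(\Gamma(P)\bigr)}{J_{P}\bigl(K^{*}(P)\bigr)},
```

    where Gamma(P) is the controller the strategy produces from local model information only, K*(P) is the optimal controller designed with full model information, and J_P is the closed-loop cost on plant P.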

  2. Probing models of information spreading in social networks

    CERN Document Server

    Zoller, J

    2014-01-01

    We apply signal processing analysis to information spreading in scale-free networks. To reproduce typical behaviors obtained from the analysis of information spreading in the world wide web, we use a modified SIS model in which synergy effects and influential nodes are taken into account. This model depends on a single free parameter that characterizes the memory time of the spreading process. We show that by means of fractal analysis it is possible, from aggregated and easily accessible data, to gain information on the memory time of the underlying mechanism driving the information spreading process.
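    The abstract does not specify the model's modification beyond its single memory-time parameter, so the following is only a generic sketch: a discrete-time SIS process on a preferential-attachment (scale-free) graph, with the infectious period `tau` standing in for the memory time. All names and values are illustrative.

```python
import random

def barabasi_albert(n, m, seed=0):
    """Preferential-attachment graph with a scale-free degree distribution."""
    rng = random.Random(seed)
    targets = list(range(m))
    repeated = []                  # node list weighted by degree
    adj = {i: set() for i in range(n)}
    for v in range(m, n):
        for t in set(targets):     # attach the new node to m targets
            adj[v].add(t)
            adj[t].add(v)
        repeated.extend(targets)
        repeated.extend([v] * m)
        targets = [rng.choice(repeated) for _ in range(m)]
    return adj

def sis_epidemic(adj, beta=0.3, tau=3, steps=50, seed=0):
    """Discrete-time SIS spreading; `tau` plays the role of the memory
    time: an infected node stays infectious for `tau` steps, then
    becomes susceptible again."""
    rng = random.Random(seed)
    clock = {v: 0 for v in adj}    # remaining infectious time per node
    clock[0] = tau                 # seed the spreading at node 0
    history = []
    for _ in range(steps):
        nxt = {}
        for v in adj:
            if clock[v] > 0:
                nxt[v] = clock[v] - 1
            else:
                infectious = sum(clock[u] > 0 for u in adj[v])
                prob = 1 - (1 - beta) ** infectious
                nxt[v] = tau if rng.random() < prob else 0
        clock = nxt                # synchronous update
        history.append(sum(c > 0 for c in clock.values()))
    return history
```

    The returned prevalence time series is the kind of aggregated, easily accessible signal to which fractal analysis could then be applied.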

  3. Client Information Processing: A Reaction to Heppner and Krauskopf.

    Science.gov (United States)

    Strohmer, Douglas C.

    1987-01-01

    Reacts to Heppner and Krauskopf's article on an information processing approach to personal problem solving. Commends the approach for relating to counseling much research on memory, social judgment, coping, information processing, and problem solving. Asserts that it remains unclear how information processing/problem-solving view of client…

  4. MODELLING PURCHASING PROCESSES FROM QUALITY ASPECTS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-12-01

    Management has a fundamental task to identify and direct the primary and specific processes within the purchasing function, applying up-to-date information infrastructure. ISO 9001:2000 defines a process as a set of interrelated or interacting activities transforming inputs into outputs, and the "process approach" as the systematic identification and management of the processes employed within the organization, and particularly of the relationships among those processes. To direct a quality management system using the process approach, the organization has to determine the map of its general (basic) processes. Primary processes are determined on the grounds of their interrelationship and impact on satisfying customers' needs. To make a proper choice of general business processes, it is necessary to determine the entire business flow, beginning with the customer demand and ending with the delivery of the product or service provided. In the next step the process model is converted into a data model, which is essential for implementation of the information system enabling automation, monitoring, measuring, inspection, analysis and improvement of key purchasing processes. This paper presents the methodology and some results of an investigation into the development of an IS for the purchasing process from the aspect of quality.

  5. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard, defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time-change of a homogeneous Lévy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared…

  6. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard, defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore, we analyze specifications obtained via a simple deterministic time-change of a homogeneous Lévy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on all the single names included in the iTraxx Europe index. The performances are compared with those of the classical CIR…

  7. Exploration into Stages in the Information Search Process in Online Information Retrieval: Communication between Users and Intermediaries.

    Science.gov (United States)

    Kuhlthau, Carol; And Others

    1992-01-01

    Describes a study that examined interactions between users and intermediaries prior to online searching to examine communication about the user's stage in the search process. A model of information seeking that views the information search process as a series of cognitive stages is presented, and implications for user-intermediary interactions are…

  8. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes for calculating thermodynamic equilibrium with respect to phase interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered directly, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it…
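    The optimisation nodes described above can be summarised, in our notation rather than the thesis's, as minimising a homogeneous thermodynamic state function over extensive canonical variables subject to linear balance constraints:

```latex
\min_{x \,\in\, \mathbb{R}^{n}_{\geq 0}} \; f(x)
\quad \text{subject to} \quad A x = b ,
```

    where f is, for instance, the Gibbs energy (or the negative entropy, depending on the chosen canonical variable set), x collects the extensive state variables of all phases, and Ax = b expresses the linear balance (conservation) equations. The gradient of f then carries a physical interpretation, consistent with the abstract's reading of analytical state function derivatives as physical properties.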

  9. Modelling of CWS combustion process

    Science.gov (United States)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion process of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary regimes of CWS combustion are provided. During modelling, the problem of defining the possible equilibrium composition of products that can be obtained as a result of CWS combustion processes at different temperatures is solved.

  10. PUBLIC RELATIONS AS AN INFORMATION PROCESS PHENOMENON

    Directory of Open Access Journals (Sweden)

    TKACH L. M.

    2016-06-01

    Formulation of the problem. When public relations is examined as a phenomenon of information management, we deal with questions of the content of knowledge, the nature of the relationship of PR with its environment, the ability to manage people's perception of and attitude toward events in the environment, and the means of ensuring the priority of information over other resources. Goal. To investigate the concept of "public relations" as treated by foreign and domestic experts; to consider the typology of the public and the "laws" of public opinion; to define the basic principles according to which relations with the public should be built; and to identify PR activity as a kind of social communication. Conclusions. Public relations, on the basis of advanced information and communication technologies, create fundamentally new opportunities for information control and influence on public consciousness.

  11. Carbon The First Frontier of Information Processing

    CERN Document Server

    Patel, A

    2002-01-01

    Information is often encoded as an aperiodic chain of building blocks. Modern digital computers use bits as the building blocks, but in general the choice of building blocks depends on the nature of the information to be encoded. What are the optimal building blocks for encoding structural information? This can be analysed by replacing the operations of addition and multiplication of conventional arithmetic with translation and rotation. It is argued that at the molecular level, the best component for encoding structural information is carbon. Living organisms discovered this billions of years ago, and used carbon as the backbone for constructing proteins which function according to their structure. Structural analysis of polypeptide chains shows that 20 building blocks are necessary to fold them into arbitrary shapes. Properties of amino acids suggest that the present genetic code was preceded by a more primitive one, coding for 10 amino acids using two nucleotide bases.

  12. Directory of Energy Information Administration Models 1994

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    This directory revises and updates the 1993 directory and includes 15 models of the National Energy Modeling System (NEMS). Three other new models in use by the Energy Information Administration (EIA) have also been included: the Motor Gasoline Market Model (MGMM), the Distillate Market Model (DMM), and the Propane Market Model (PPMM). This directory contains a description of each model, including title, acronym and purpose, followed by more detailed information on characteristics, uses and requirements. Sources for additional information are identified. Included in this directory are 37 EIA models active as of February 1, 1994.

  13. Optimizing the HIV/AIDS informed consent process in India

    Directory of Open Access Journals (Sweden)

    Shrotri A

    2004-08-01

    Background: While the basic ethical issues regarding consent may be universal to all countries, the consent procedures required by international review boards, which include detailed scientific and legal information, may not be optimal when administered within certain populations. The time and the technicalities of the process itself intimidate individuals in societies where literacy and awareness about medical and legal rights are low. Methods: In this study, we examined pregnant women's understanding of group education and counseling (GEC) about HIV/AIDS provided within an antenatal clinic in Maharashtra, India. We then enhanced the GEC process with the use of culturally appropriate visual aids and assessed the subsequent changes in women's understanding of informed consent issues. Results: We found the use of visual aids during group counseling sessions increased women's overall understanding of key issues regarding informed consent from 38% to 72%. Moreover, if these same visuals were reinforced during individual counseling, women's overall comprehension rose to 96%. Conclusions: This study demonstrates that complex constructs such as informed consent can be conveyed in populations with little education and within busy government hospital settings, and that the standard model may not be sufficient to ensure truly informed consent.

  14. Open Standards for Sensor Information Processing

    Energy Technology Data Exchange (ETDEWEB)

    Pouchard, Line Catherine [ORNL; Poole, Stephen W [ORNL; Lothian, Josh [ORNL

    2009-07-01

    This document explores sensor standards, sensor data models, and computer sensor software in order to determine the specifications and data representation best suited for analyzing and monitoring computer system health using embedded sensor data. We review IEEE 1451, the OGC Sensor Model Language and Transducer Model Language (TML), lm-sensors, and the Intelligent Platform Management Interface (IPMI).

  15. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modelling, design and specification of information systems, and multimedia information modelling…

  16. Information paths within the new product development process

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    2007-01-01

    The present study explores information acquisition within the new product development (NPD) process. The effect of the front-end and environmental turbulence on the inter-stage connectedness of information within the NPD process is examined. An agent-based simulation is applied as the data… The findings show that the front-end is not driving the information acquisition through the stages of the NPD process, and that environmental turbulence disconnects stages from the information paths in the NPD process. This implies that information is at the same time a key to success and a key to entrapment in the NPD process.

  17. Conceptual Model of Multidimensional Marketing Information System

    Science.gov (United States)

    Kriksciuniene, Dalia; Urbanskiene, Ruta

    This article analyses why the information systems at an enterprise do not always satisfy the expectations of marketing management specialists. Computerized systems more and more successfully serve the information needs of those areas of enterprise management where they can create an information equivalent of real management processes. Yet their inability to effectively fulfill marketing needs indicates gaps not only in the ability to structure marketing processes, but in the conceptual development of marketing information systems (MkIS) as well.

  18. 77 FR 26911 - Processed Raspberry Promotion, Research, and Information Order

    Science.gov (United States)

    2012-05-08

    ... Service 7 CFR Part 1208 RIN 0581-AC79 Processed Raspberry Promotion, Research, and Information Order... Processed Raspberry Promotion, Research, and Information Order (Order). The program will be implemented..., producers of raspberries for processing and importers of processed raspberries will pay an assessment of...

  19. Building Information Modeling in engineering teaching

    DEFF Research Database (Denmark)

    Andersson, Niclas; Andersson, Pernille Hammar

    2010-01-01

    The application of Information and Communication Technology (ICT) in construction supports business as well as project processes by providing integrated systems for communication, administration, quantity takeoff, time scheduling, cost estimating and progress control, among other things. The rapid technological development of ICT systems and the increased application of ICT in industry significantly influence the management and organisation of construction projects, and consequently, ICT has implications for the education of engineers and the preparation of students for their future professional careers… ICT in this case is represented by adopting Building Information Modelling, BIM, for construction management purposes. Course evaluations, a questionnaire and discussions with students confirm a genuinely positive attitude towards the role-play simulation and interaction with industry professionals. The students…

  20. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  1. Complementarity of Historic Building Information Modelling and Geographic Information Systems

    Science.gov (United States)

    Yang, X.; Koehl, M.; Grussenmeyer, P.; Macher, H.

    2016-06-01

    In this paper, we discuss the potential of integrating semantically rich models from Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build detailed 3D historic models. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time, and the non-architectural information that is necessary for the construction and management of buildings. GIS has potential in handling and managing spatial data, especially in exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model from the complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in a BIM and GIS environment, how to build the enriched historic model, and why to construct different levels of detail. By addressing these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.

  2. Understanding the Information Research Process of Experienced Online Information Researchers to Inform Development of a Scholars Portal

    Directory of Open Access Journals (Sweden)

    Martha Whitehead

    2009-06-01

    Full Text Available Objective – The main purpose of this study was to understand the information research process of experienced online information researchers in a variety of disciplines, gather their ideas for improvement and, as part of this, to validate a proposed research framework for use in future development of Ontario’s Scholars Portal. Methods – This was a qualitative research study in which sixty experienced online information researchers participated in face-to-face workshops that included a collaborative design component. The sessions were conducted and recorded by usability specialists who subsequently analyzed the data and identified patterns and themes. Results – Key themes included the similarities of the information research process across all disciplines, the impact of interdisciplinarity, the social aspect of research and opportunities for process improvement. There were many specific observations regarding current and ideal processes. Implications for portal development and further research included: supporting a common process while accommodating user-defined differences; supporting citation chaining practices with new opportunities for data linkage and granularity; enhancing keyword searching with various types of intervention; exploring trusted social networks; exploring new mental models for data manipulation while retaining traditional objects; improving citation and document management. Conclusion – The majority of researchers in the study had almost no routine in their information research processes, had developed few techniques to assist themselves and had very little awareness of the tools available to help them. There are many opportunities to aid researchers in the research process that can be explored when developing scholarly research portals. That development will be well guided by the framework ‘discover, gather, synthesize, create, share.’

  3. Integrating Geographical Information Systems, Fuzzy Logic and Analytical Hierarchy Process in Modelling Optimum Sites for Locating Water Reservoirs. A Case Study of the Debub District in Eritrea

    Directory of Open Access Journals (Sweden)

    Rodney G. Tsiko

    2011-03-01

    Full Text Available The aim of this study was to model water reservoir site selection for a real-world application in the administrative district of Debub, Eritrea. This is a region where scarcity of water is a fundamental problem. Erratic rainfall, drought and unfavourable hydro-geological characteristics exacerbate the region’s water supply problems. Consequently, the population of Debub is facing severe water shortages, and building reservoirs has been promoted as a possible solution to meet the future demand for water. This was the most powerful motivation to identify candidate sites for locating water reservoirs. A number of conflicting qualitative and quantitative criteria exist for evaluating alternative sites. Decisions regarding criteria are often accompanied by ambiguities and vagueness. This makes fuzzy logic a more natural approach to this kind of Multi-criteria Decision Analysis (MCDA) problem. This paper proposes a combined two-stage MCDA methodology. The first stage involved utilizing the most simplistic type of data aggregation technique, known as Boolean Intersection or logical AND, to identify areas restricted by environmental and hydrological constraints and therefore excluded from further study. The second stage involved integrating fuzzy logic with the Analytic Hierarchy Process (AHP) to identify optimum and back-up candidate water reservoir sites in the area designated for further study.
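
The two-stage methodology described above can be sketched in a few lines: a Boolean (logical AND) constraint mask followed by a weighted overlay of fuzzy criterion scores. All cell values, weights and layer meanings below are illustrative assumptions, not the study's data:

```python
# Sketch of a two-stage MCDA: Boolean constraint masking, then a
# fuzzy-AHP weighted overlay. Values and weights are made up.

def boolean_mask(constraints):
    """Stage 1: logical AND across binary constraint layers (1 = allowed)."""
    return [all(cell) for cell in zip(*constraints)]

def fuzzy_score(factors, weights):
    """Stage 2: weighted sum of fuzzy membership values in [0, 1]."""
    return [sum(w * f for w, f in zip(weights, cell)) for cell in zip(*factors)]

# Three candidate cells, two constraint layers (e.g. slope, protected areas)
constraints = [[1, 1, 0], [1, 0, 1]]
mask = boolean_mask(constraints)

# Two fuzzy criteria (e.g. rainfall, distance to demand), AHP-derived weights
factors = [[0.9, 0.4, 0.8], [0.6, 0.7, 0.2]]
weights = [0.7, 0.3]
scores = fuzzy_score(factors, weights)

# Suitability: fuzzy score where allowed, 0 where excluded by stage 1
suitability = [s if m else 0.0 for s, m in zip(scores, mask)]
print(suitability)
```

In a real GIS setting the lists would be raster layers and the weights would come from pairwise AHP comparisons, but the aggregation logic is the same.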

  4. Acetylcholine molecular arrays enable quantum information processing

    Science.gov (United States)

    Tamulis, Arvydas; Majauskaite, Kristina; Talaikis, Martynas; Zborowski, Krzysztof; Kairys, Visvaldas

    2017-09-01

    We have found self-assembly of four neurotransmitter acetylcholine (ACh) molecular complexes in a water environment by using geometry optimization with the DFT B97d method. These complexes organize into regular arrays of ACh molecules possessing electronic spins, i.e. quantum information bits. These spin arrays could potentially be controlled by the application of a non-uniform external magnetic field. A proper sequence of resonant electromagnetic pulses would then drive all the spin groups into the 3-spin entangled state and process large-scale quantum information bits.

  5. Internet-based intelligent information processing systems

    CERN Document Server

    Tonfoni, G; Ichalkaranje, N S

    2003-01-01

    The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap

  6. The Application of Multiobjective Evolutionary Algorithms to an Educational Computational Model of Science Information Processing: A Computational Experiment in Science Education

    Science.gov (United States)

    Lamb, Richard L.; Firestone, Jonah B.

    2017-01-01

    Conflicting explanations and unrelated information in science classrooms increase cognitive load and decrease efficiency in learning. This reduced efficiency ultimately limits one's ability to solve reasoning problems in the science. In reasoning, it is the ability of students to sift through and identify critical pieces of information that is of…

  8. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model of raw-material supply for processing enterprises belonging to a vertically integrated structure for the production and processing of dairy raw materials is developed. Its distinguishing feature is an orientation towards a cumulative effect for the integrated structure, which serves as the criterion function; this function is maximized by optimizing capacities, volumes of raw-material deliveries and their quality characteristics, costs of industrial processing of raw materials, and demand for dairy production.

  9. Model Checking of Boolean Process Models

    CERN Document Server

    Schneider, Christoph

    2011-01-01

    In the field of Business Process Management, formal models for the control flow of business processes have been designed for more than 15 years. Which methods are best suited to verify the bulk of these models? The first step is to select a formal language which fixes the semantics of the models. We adopt the language of Boolean systems as the reference language for Boolean process models. Boolean systems form a simple subclass of coloured Petri nets. Their characteristics are low tokens to model explicitly states with a subsequent skipping of activations, and arbitrary logical rules of type AND, XOR, OR etc. to model the split and join of the control flow. We apply model checking as a verification method for the safeness and liveness of Boolean systems. Model checking of Boolean systems uses the elementary theory of propositional logic; no modal operators are needed. Our verification builds on a finite complete prefix of a certain T-system attached to the Boolean system. It splits the processes of the Boolean sy...
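
The AND/XOR/OR split-and-join rules mentioned above can be illustrated with a minimal evaluator. The boolean-token encoding is an assumption for illustration, not the paper's formalism:

```python
# Illustrative sketch: tokens on incoming branches are booleans, and a
# join gateway fires according to its logical rule.

def join(kind, inputs):
    """Evaluate whether a join gateway fires given its input tokens."""
    if kind == "AND":
        return all(inputs)        # every branch must deliver a token
    if kind == "XOR":
        return sum(inputs) == 1   # exactly one branch delivers a token
    if kind == "OR":
        return any(inputs)        # at least one branch delivers a token
    raise ValueError(kind)

print(join("AND", [True, True]),
      join("XOR", [True, False]),
      join("OR", [False, False]))  # → True True False
```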

  10. Structural Information Retention in Visual Art Processing.

    Science.gov (United States)

    Koroscik, Judith Smith

    The accuracy of non-art college students' longterm retention of structural information presented in Leonardo da Vinci's "Mona Lisa" was tested. Seventeen female undergraduates viewed reproductions of the painting and copies that closely resembled structural attributes of the original. Only 3 of the 17 subjects reported having viewed a reproduction…

  12. Consolidated Environmental Resource Database Information Process (CERDIP)

    Science.gov (United States)

    2015-11-19

    [Only fragments of this record survive: a table-of-contents line ("6.1 STEP 1: Define the Area of Interest"), an acronym list (GIS Geographic Information System; IGO Inter-Governmental Organizations; IUCN International Union for the Conservation of Nature; KML Keyhole…), and a partial sentence describing destruction as "a tactic in their attempt to erase the cultural heritage of societies they desire to dominate," with a comparison to conflict minerals.]

  13. Hierarchy of Information Processing in the Brain

    DEFF Research Database (Denmark)

    Deco, Gustavo; Kringelbach, Morten L

    2017-01-01

    A general theory of brain function has to be able to explain local and non-local network computations over space and time. We propose a new framework to capture the key principles of how local activity influences global computation, i.e., describing the propagation of information and thus the bro...

  14. Modeling interdependencies between business and communication processes in hospitals.

    Science.gov (United States)

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  15. INFORMATIONAL MODEL OF MENTAL ROTATION OF FIGURES

    Directory of Open Access Journals (Sweden)

    V. A. Lyakhovetskiy

    2016-01-01

    Full Text Available Subject of Study. The subject of research is the information structure of objects' internal representations and the operations over them that people use to solve the problem of mental rotation of figures. To analyze this informational structure we considered not only the classical dependencies of correct answers on the angle of rotation, but also other dependencies obtained recently in cognitive psychology. Method. The technical computing language Matlab R2010b was used to develop an information model of mental rotation of figures. Model parameters such as the number of bits in the internal representation, the error probability in a single bit, the discrete rotation angle, the comparison threshold, and the degree of difference during rotation can be changed. Main Results. The model qualitatively reproduces such psychological dependencies as the linear increase of the time of correct answers and of the number of errors with the angle of rotation for identical figures, and the "flat" dependence of the time of correct answers and the number of errors on the angle of rotation for mirror-like figures. The simulation results suggest that mental rotation is an iterative process of finding a match between two figures, each step of which can lead to a significant distortion of the internal representation of the stored objects. Matching is carried out within internal representations that have no high invariance to rotation angle. Practical Significance. The results may be useful for understanding the role of learning (including learning with a teacher) in the development of effective information representations and operations on them in artificial intelligence systems.
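
A minimal sketch of the iterative-matching idea described above: bit-string representations compared step by step, where each discrete rotation step may corrupt bits. The encoding and all parameters are illustrative assumptions, not the authors' Matlab model:

```python
import random

def rotate_once(bits, p_err, rng):
    """One discrete rotation step: circular shift plus possible bit errors."""
    shifted = bits[-1:] + bits[:-1]
    return [b ^ (rng.random() < p_err) for b in shifted]

def mental_rotation(source, target, p_err=0.0, max_steps=8, seed=0):
    """Return the number of rotation steps until source matches target, or None."""
    rng = random.Random(seed)
    current = list(source)
    for step in range(max_steps + 1):
        if current == list(target):
            return step
        current = rotate_once(current, p_err, rng)
    return None

# A target that is the source rotated by two positions: 2 steps needed
source = [1, 0, 1, 1, 0, 0]
target = source[-2:] + source[:-2]
print(mental_rotation(source, target))  # → 2 (noiseless case)
```

With a nonzero `p_err`, some runs fail to match within `max_steps`, mirroring the model's prediction that errors grow with rotation angle.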

  17. Internet User Behaviour Model Discovery Process

    OpenAIRE

    Dragos Marcel VESPAN

    2007-01-01

    The Academy of Economic Studies has more than 45,000 students and about 5,000 computers with Internet access that are connected to the AES network. Students can access the Internet on these computers through a proxy server, which stores information about the way the Internet is accessed. In this paper, we describe the process of discovering Internet user behaviour models by analyzing raw proxy server data, and we emphasize the importance of such models for the e-learning environment.
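
As a hedged illustration of the first step of such a discovery process, the sketch below aggregates raw proxy records into per-user access counts; the log format and names are assumptions, not the AES proxy's actual format:

```python
from collections import Counter

# Hypothetical whitespace-separated proxy log: client IP, user, URL
raw_log = """\
10.0.0.7 student42 http://example.org/index.html
10.0.0.7 student42 http://example.org/news
10.0.0.9 student17 http://example.org/index.html
"""

def access_counts(log_text):
    """Count accesses per (user, host) pair."""
    counts = Counter()
    for line in log_text.strip().splitlines():
        ip, user, url = line.split()
        host = url.split("/")[2]          # crude host extraction
        counts[(user, host)] += 1
    return counts

counts = access_counts(raw_log)
print(counts[("student42", "example.org")])  # → 2
```

From such aggregates, session lengths, visit frequencies and site categories can then be mined to build behaviour models.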

  18. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  19. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summed up in this article. These problems are characteristic of various areas of human activity, in particular problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  20. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables of the command process and models corresponding to its key functions. These models include simulation analysis and probabilistic risk assessment models.

  1. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    Many production processes are carried out in stages. At the end of each stage, the production engineer can analyze the intermediate results and correct process parameters (variables) of the next stage. Both analysis of the process and correction to process parameters at the next stage should… and having three or more stages. The methods are applied to process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables… of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multi-variate variation in the data.

  2. Influence Business Process On The Quality Of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Meiryani

    2015-01-01

    Full Text Available Abstract The purpose of this study was to determine the influence of business processes on the quality of accounting information systems. The study was theoretical research that considered the role of business processes in the quality of accounting information systems, using secondary data collection. The results showed that business processes have a significant effect on the quality of accounting information systems.

  3. Towards Web-based representation and processing of health information

    Directory of Open Access Journals (Sweden)

    Oldfield Eddie

    2009-01-01

    Full Text Available Abstract Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and of the method used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome) indicators in the representation of health information. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description), the data source description, the statistical methodology used for analysis, and the geometric and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services

  4. A Study on Improving Information Processing Abilities Based on PBL

    Science.gov (United States)

    Kim, Du Gyu; Lee, JaeMu

    2014-01-01

    This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…

  5. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 3. Future to be Asset Sustainment Process Model.

    Science.gov (United States)

    2007-11-02

    …Models), contains the To-Be Retail Asset Sustainment Process Model, displaying the activities and functions related to the improved processes for receipt… of a logistics process model for a more distant future asset sustainment scenario unconstrained by today’s logistics information systems limitations… It also contains a process model reflecting the Reengineering Team’s vision of the future asset sustainment process.

  6. How to Improve Listening Skills--in Light of the Information Processing Theory

    Institute of Scientific and Technical Information of China (English)

    LU Tian-tian

    2013-01-01

    This paper uses Gagné’s information processing theory to analyze the listening process, so as to provide a pedagogical model for L2 learners to solve listening problems as well as to cast some light on the teaching of listening.

  7. An Information Processing Perspective on Divergence and Convergence in Collaborative Learning

    Science.gov (United States)

    Jorczak, Robert L.

    2011-01-01

    This paper presents a model of collaborative learning that takes an information processing perspective of learning by social interaction. The collaborative information processing model provides a theoretical basis for understanding learning principles associated with social interaction and explains why peer-to-peer discussion is potentially more…

  9. Information Acquisition & Processing in Scanning Probe Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Kalinin, Sergei V [ORNL; Jesse, Stephen [ORNL; Proksch, Roger [Asylum Research, Santa Barbara, CA

    2008-01-01

    Much of the imaging and spectroscopy capability of the existing 20,000+ scanning probe microscopes worldwide relies on specialized data processing that links the microsecond (and sometimes faster) time scale of cantilever motion to the millisecond (and sometimes slower) time scale of image acquisition and feedback. In most SPMs, the cantilever is excited to oscillate sinusoidally and the time-averaged amplitude and/or phase are used as imaging or control signals. Traditionally, the step of converting the rapid motion of the cantilever into an amplitude or phase is performed by phase-sensitive homodyne or phase-locked-loop detection. The emergence of fast configurable data processing electronics in the last several years has allowed the development of non-sinusoidal data acquisition and processing methods. Here, we briefly review the principles and limitations of phase-sensitive detectors and discuss some of the emergent technologies based on rapid spectroscopic measurements in the frequency and time domains.
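
The homodyne (phase-sensitive) detection step described above can be sketched as quadrature demodulation: multiply the signal by reference cosine/sine waves at the drive frequency and average, which acts as the low-pass filter. The signal parameters below are illustrative assumptions:

```python
import math

def lock_in(samples, freq, rate):
    """Return (amplitude, phase) of the component at `freq` Hz."""
    n = len(samples)
    i = sum(s * math.cos(2 * math.pi * freq * k / rate) for k, s in enumerate(samples))
    q = sum(s * math.sin(2 * math.pi * freq * k / rate) for k, s in enumerate(samples))
    i, q = 2 * i / n, 2 * q / n          # averaging acts as the low-pass filter
    return math.hypot(i, q), math.atan2(-q, i)

# Synthetic "cantilever" signal: 1 kHz, amplitude 2.5, phase 0.7 rad,
# sampled at 100 kHz for exactly 10 cycles
rate, freq, amp, phase = 100_000.0, 1_000.0, 2.5, 0.7
samples = [amp * math.cos(2 * math.pi * freq * k / rate + phase)
           for k in range(1000)]

a, p = lock_in(samples, freq, rate)
print(round(a, 3), round(p, 3))  # → 2.5 0.7
```

Averaging over an integer number of cycles makes the double-frequency terms cancel, which is why the recovered amplitude and phase are exact here.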

  10. Goal Based Testing: A Risk Informed Process

    Science.gov (United States)

    Everline, Chester; Smith, Clayton; Distefano, Sal; Goldin, Natalie

    2014-01-01

    A process for life demonstration testing is developed, which can reduce the number of resources required by conventional sampling theory while still maintaining the same degree of rigor and confidence level. This process incorporates state-of-the-art probabilistic thinking and is consistent with existing NASA guidance documentation. This view of life testing changes the paradigm of testing a system for many hours to show confidence that a system will last for the required number of years to one that focuses efforts and resources on exploring how the system can fail at end-of-life and building confidence that the failure mechanisms are understood and well mitigated.
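
For contrast with the paper's risk-informed approach, the conventional sampling-theory baseline it argues against can be computed directly: the zero-failure test time needed to demonstrate a reliability target at a given confidence, assuming an exponential life model. The numbers are illustrative assumptions, not NASA requirements:

```python
import math

def zero_failure_test_hours(mtbf_target, confidence, units):
    """Per-unit test time such that surviving with zero failures
    demonstrates MTBF >= mtbf_target at the given confidence level
    (exponential model: confidence = 1 - exp(-units * T / mtbf))."""
    return -mtbf_target * math.log(1.0 - confidence) / units

# Demonstrating a 100,000-hour MTBF at 90% confidence with 3 test units
hours = zero_failure_test_hours(100_000, 0.90, 3)
print(round(hours))  # → 76753 hours per unit
```

The result (nearly nine years of continuous testing per unit) illustrates why the abstract's shift toward understanding and mitigating end-of-life failure mechanisms can save resources.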

  11. Gathering Information from Transport Systems for Processing in Supply Chains

    Science.gov (United States)

    Kodym, Oldřich; Unucka, Jakub

    2016-12-01

    The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and information usage in the information systems of logistics firms to support selected processes at the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed. Compliance with existing standards is mentioned. Security of information over the full life cycle is an integral part of the presented system. The design of a fully equipped system based on synthesized functional nodes is presented.

  12. Numerical support, information processing and attitude change

    NARCIS (Netherlands)

    de Dreu, C.K.W.; de Vries, N.K.

    1993-01-01

    In two experiments we studied the prediction that majority support induces stronger convergent processing than minority support for a persuasive message, the more so when recipients are explicitly forced to pay attention to the source's point of view; this in turn affects the amount of attitude chan

  13. Modeling and Analysis of Information Product Maps

    Science.gov (United States)

    Heien, Christopher Harris

    2012-01-01

    Information Product Maps are visual diagrams used to represent the inputs, processing, and outputs of data within an Information Manufacturing System. A data unit, drawn as an edge, symbolizes a grouping of raw data as it travels through this system. Processes, drawn as vertices, transform each data unit input into various forms prior to delivery…
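
The vertex/edge structure described above can be sketched as a small directed graph; the map contents below are made up for illustration, not taken from the thesis:

```python
from collections import defaultdict

# Information Product Map: vertices are processes, edges are data units.
# Edge list format: (from_process, to_process, data_unit)
edges = [
    ("collect", "validate", "raw_orders"),
    ("validate", "aggregate", "clean_orders"),
    ("aggregate", "report", "daily_totals"),
]

graph = defaultdict(list)
for src, dst, data in edges:
    graph[src].append((dst, data))

def downstream_data(start):
    """All data units reachable from a starting process."""
    seen, stack = [], [start]
    while stack:
        node = stack.pop()
        for dst, data in graph[node]:
            seen.append(data)
            stack.append(dst)
    return seen

print(downstream_data("validate"))  # → ['clean_orders', 'daily_totals']
```

Traversals like this support the kind of analysis the abstract mentions: tracing how a raw data unit is transformed on its way to the delivered information product.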

  14. Unsupervised Neural Network Quantifies the Cost of Visual Information Processing.

    Science.gov (United States)

    Orbán, Levente L; Chartier, Sylvain

    2015-01-01

    Untrained, "flower-naïve" bumblebees display behavioural preferences when presented with visual properties such as colour, symmetry, spatial frequency and others. Two unsupervised neural networks were implemented to understand the extent to which these models capture elements of bumblebees' unlearned visual preferences towards flower-like visual properties. The computational models, which are variants of Independent Component Analysis and Feature-Extracting Bidirectional Associative Memory, use images of test-patterns that are identical to ones used in behavioural studies. Each model works by decomposing images of floral patterns into meaningful underlying factors. We reconstruct the original floral image using the components and compare the quality of the reconstructed image to the original image. Independent Component Analysis matches behavioural results substantially better across several visual properties. These results are interpreted to support a hypothesis that the temporal and energetic costs of information processing by pollinators served as a selective pressure on floral displays: flowers adapted to pollinators' cognitive constraints.

  15. Development of a framework for information acquisition and processing in cyber-physical systems

    NARCIS (Netherlands)

    Li, Y.; Song, Y.; Horvath, I.; Opiyo, E.Z.; Zhang, G.

    2014-01-01

    In the designing and modeling of CPSs, the information acquisition and processing processes are often application dependent and process oriented. Those information management frameworks are simple and effective for small scale systems. However, many functions developed are not reusable or cannot be

  16. Human machine interaction: The special role for human unconscious emotional information processing

    NARCIS (Netherlands)

    Noort, M.W.M.L. van den; Hugdahl, K.; Bosch, M.P.C.

    2005-01-01

    The nature of (un)conscious human emotional information processing remains a great mystery. On the one hand, classical models view human conscious emotional information processing as computation among the brain's neurons but fail to address its enigmatic features. On the other hand, quantum processe

  17. The Role of Unconscious Information Processing in the Acquisition and Learning of Instructional Messages

    Science.gov (United States)

    Kuldas, Seffetullah; Bakar, Zainudin Abu; Ismail, Hairul Nizam

    2012-01-01

    This review investigates how the unconscious information processing can create satisfactory learning outcomes, and can be used to ameliorate the challenges of teaching students to regulate their learning processes. The search for the ideal model of human information processing as regards achievement of teaching and learning objectives is a…

  19. Information Integration - the process of integration, evolution and versioning

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time-consuming job. Automating this process would assist the user in this task. Integration

  20. Quantum Information Processing in Disordered and Complex Quantum Systems

    CERN Document Server

    De, Aditi Sen; Sen, Ujjwal; Ahufinger, Veronica; Briegel, Hans J.; Sanpera, Anna; Lewenstein, Maciej

    2005-01-01

    We investigate quantum information processing and manipulations in disordered systems of ultracold atoms and trapped ions. First, we demonstrate generation of entanglement and local realization of quantum gates in a quantum spin glass system. Entanglement in such systems attains significantly high values, after quenched averaging, and has a stable positive value for arbitrary times. Complex systems with long range interactions, such as ion chains or dipolar atomic gases, can be modeled by neural network Hamiltonians. In such systems, we find the characteristic time of persistence of quenched averaged entanglement, and also find the time of its revival.

  1. Application of Event-Driven Process Chain Modeling of Military Information System

    Institute of Scientific and Technical Information of China (English)

    于晓浩; 刘俊先; 陈涛; 罗雪山

    2011-01-01

    Effective system modeling is a prerequisite for the rational design of military information systems. By analyzing the characteristics of the Event-driven Process Chain (EPC) and Petri nets, a combined modeling method is proposed. When EPC is used to describe the system process, the EPC model is extended with organization and data-flow descriptions, and a method for verifying the well-structuredness of workflows that combine control flow and data flow is analyzed. A modeling tool was implemented, and a case study illustrates the modeling process and its application.
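The EPC-to-Petri-net verification idea above can be illustrated with a minimal token game over a place/transition net; the two-step workflow and all transition names below are invented for illustration, not taken from the paper.

```python
# Minimal place/transition Petri net token game, one way to check that a
# workflow derived from an EPC can actually run from start to finish.

class PetriNet:
    def __init__(self):
        self.transitions = {}  # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, marking, name):
        inputs, _ = self.transitions[name]
        return all(marking.get(p, 0) > 0 for p in inputs)

    def fire(self, marking, name):
        inputs, outputs = self.transitions[name]
        m = dict(marking)
        for p in inputs:
            m[p] -= 1
        for p in outputs:
            m[p] = m.get(p, 0) + 1
        return m

# Hypothetical EPC fragment "event -> function -> event" mapped onto the net.
net = PetriNet()
net.add_transition("receive_order", ["order_arrived"], ["order_checked"])
net.add_transition("plan_mission", ["order_checked"], ["mission_planned"])

marking = {"order_arrived": 1}
for t in ["receive_order", "plan_mission"]:
    assert net.enabled(marking, t)
    marking = net.fire(marking, t)

print(marking)  # one token ends up in the final place
```

A real EPC verification would also check the extended organization and data-flow annotations; the token game only exercises the control flow.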

  2. Assimilating host model information into a limited area model

    Directory of Open Access Journals (Sweden)

    Nils Gustafsson

    2012-01-01

    Full Text Available We propose to add an extra source of information to the data-assimilation of the regional HIgh Resolution Limited Area Model (HIRLAM), constraining larger scales to the host model providing the lateral boundary conditions. An extra term, Jk, measuring the distance to the large-scale vorticity of the host model, is added to the cost-function of the variational data-assimilation. Vorticity is chosen because it is a good representative of the large-scale flow and because vorticity is a basic control variable of the HIRLAM variational data-assimilation. Furthermore, by choosing only vorticity, the remaining model variables, divergence, temperature, surface pressure and specific humidity, will be allowed to adapt to the modified vorticity field in accordance with the internal balance constraints of the regional model. The error characteristics of the Jk term are described by the horizontal spectral densities and the vertical eigenmodes (eigenvectors and eigenvalues) of the host model vorticity forecast error fields, expressed in the regional model geometry. The vorticity field, provided by the European Centre for Medium-range Weather Forecasts (ECMWF) operational model, was assimilated into the HIRLAM model during an experiment period of 33 d in winter, with positive impact on forecast verification statistics for upper-air variables and mean sea level pressure. The review process was handled by Editor-in-Chief Harald Lejenäs

  3. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. From this model, after embedding the mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the base for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  4. Food Security Information Platform Model Based on Internet of Things

    Directory of Open Access Journals (Sweden)

    Lei Zhang

    2015-06-01

    Full Text Available According to the tracking and tracing requirements of food supply chain management and of quality and safety, this study built a food security information platform using Internet of Things technology. With reference to the EPC standard, using RFID technology, adopting an SOA model and building on the SCOR core processes, the platform establishes traceability information for the whole process from source to consumption. It provides food information, strengthens food identity verification, prevents food misidentification for consumers and government food safety regulators, and provides good practices for food safety traceability.
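The whole-chain traceability described above can be sketched as an EPC-code-to-event history; the EPC URN, stage names and events below are illustrative assumptions, not the platform's actual schema.

```python
# Toy traceability store: each EPC code maps to the ordered list of
# supply-chain events recorded for it, from source to consumption.

from collections import defaultdict

events = defaultdict(list)

def record(epc, stage, detail):
    """Append one supply-chain event to the history of an EPC code."""
    events[epc].append((stage, detail))

def trace(epc):
    """Return the full recorded history for one EPC code."""
    return events[epc]

# Hypothetical SGTIN-style EPC code and three stages of its life cycle.
epc = "urn:epc:id:sgtin:0614141.107346.2017"
record(epc, "farm", "harvested lot A")
record(epc, "processing", "packaged line 3")
record(epc, "retail", "received store 42")

for stage, detail in trace(epc):
    print(stage, detail)
```

A production system would persist these events and verify identities at each hand-off; the dictionary here only shows the lookup pattern.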

  5. Quantum-Classical Hybrid for Information Processing

    Science.gov (United States)

    Zak, Michail

    2011-01-01

    Based upon quantum-inspired entanglement in quantum-classical hybrids, a simple algorithm for instantaneous transmissions of non-intentional messages (chosen at random) to remote distances is proposed. The idea is to implement instantaneous transmission of conditional information on remote distances via a quantum-classical hybrid that preserves superposition of random solutions, while allowing one to measure its state variables using classical methods. Such a hybrid system reinforces the advantages, and minimizes the limitations, of both quantum and classical characteristics. Consider n observers, and assume that each of them gets a copy of the system and runs it separately. Although they run identical systems, the outcomes of even synchronized runs may be different because the solutions of these systems are random. However, the global constraint must be satisfied. Therefore, if the observer #1 (the sender) made a measurement of the acceleration v(sub 1) at t = T, then the receiver, by measuring the corresponding acceleration v(sub 1) at t = T, may get a wrong value because the accelerations are random, and only their ratios are deterministic. Obviously, the transmission of this knowledge is instantaneous as soon as the measurements have been performed. In addition to that, the distance between the observers is irrelevant because the x-coordinate does not enter the governing equations. However, the Shannon information transmitted is zero. None of the senders can control the outcomes of their measurements because they are random. The senders cannot transmit intentional messages. Nevertheless, based on the transmitted knowledge, they can coordinate their actions based on conditional information. If the observer #1 knows his own measurements, the measurements of the others can be fully determined. It is important to emphasize that the origin of entanglement of all the observers is the joint probability density that couples their actions. There is no centralized source

  6. Biological Information Processing in Single Microtubules

    Science.gov (United States)

    2014-03-05

    communication is believed to be primarily via Hodgkin-Huxley spiking and secondarily via electrotonic conduction via gap junctions, and modulated by...neurotransmitters as Hodgkin and Huxley predicted. Resonant energy transfer and synchronous wireless communication would be incorporated into this model, and the

  7. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing, enabling the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study shows how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  8. Silicon Quantum Dots for Quantum Information Processing

    Science.gov (United States)

    2013-11-01

    Neill-Concelman CNOT Controlled-not gate CPU Central processing unit DC Direct current DCE dichloroethylene EBL Electron beam lithography ESR Electron spin...Electron Beam Lithography of Gates and Alignment Markers The main electron beam lithography (EBL) machine used for fabricating the devices in this thesis...vided additional support for patterning parameters such as drawing sequences, drawing directions and area dose, and controlling the EBL writing

  9. Living is information processing; from molecules to global systems

    CERN Document Server

    Farnsworth, Keith D; Gershenson, Carlos

    2012-01-01

    We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function - to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, information at lower levels to appear at higher levels in complex systems (emergence). We show how information patterns are united by the creation of mutual context, generating persistent consequences, to result in `functional information'. This constructive process forms arbitrarily large complexes of information, the combined effects of which include the functions of life. Molecules and simple organisms have already been measured in terms of functional information content; we show how quantification may be extended to each le...

  10. Topic modelling in the information warfare domain

    CSIR Research Space (South Africa)

    De Waal, A

    2013-11-01

    Full Text Available In this paper the authors provide context to Topic Modelling as an Information Warfare technique. Topic modelling is a technique that discovers latent topics in an unstructured and unlabelled collection of documents. The topic structure can be searched...

  11. Can the Analytical Hierarchy Process Model Be Effectively Applied in the Prioritization of Information Assurance Defense In-Depth Measures? --A Quantitative Study

    Science.gov (United States)

    Alexander, Rodney T.

    2017-01-01

    Organizational computing devices are increasingly becoming targets of cyber-attacks, and organizations have become dependent on the safety and security of their computer networks and their organizational computing devices. Business and government often use defense in-depth information assurance measures such as firewalls, intrusion detection…
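The core AHP computation such a study applies, deriving priority weights from a pairwise comparison matrix, can be sketched with power iteration in pure Python; the matrix entries below are illustrative Saaty-scale judgements over three hypothetical defense-in-depth measures, not data from the study.

```python
# Derive AHP priority weights as the normalized principal eigenvector of a
# pairwise comparison matrix, computed by power iteration.

def priorities(matrix, iters=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        # multiply by the matrix, then renormalize so the weights sum to 1
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Illustrative reciprocal judgements: firewall vs intrusion detection vs
# encryption on the 1-9 Saaty scale (m[i][j] = 1 / m[j][i]).
m = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = priorities(m)
print([round(x, 3) for x in w])  # weights sum to 1; firewall ranks first
```

For a full AHP study one would also compute the consistency ratio of each matrix before trusting the weights.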

  12. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behavior, and thus an economic-mathematical model is drawn up, with the following stages: formulation of the problem, analysis of the process being modeled, production of the model, and design verification, validation and implementation of the model. This article presents an economic model whose modeling uses mathematical equations and the MatLab software package, which helps us approximate the effective solution. As input data we consider the net cost, the direct and total cost, and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with graphic representation and interpretation of the results achieved in terms of our specific problem.
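The cost relationships the abstract mentions can be sketched as follows, assuming the common decomposition total cost = direct cost + indirect cost and unit (net) cost = total cost / quantity; the article's exact formula is not reproduced here, so both the decomposition and the figures are illustrative assumptions.

```python
# Hedged sketch of the link between direct, total and unit (net) cost.

def total_cost(direct, indirect):
    """Total cost as the sum of direct and indirect cost (assumed form)."""
    return direct + indirect

def unit_cost(direct, indirect, quantity):
    """Net cost per unit: total cost spread over the produced quantity."""
    return total_cost(direct, indirect) / quantity

direct, indirect, quantity = 120_000.0, 30_000.0, 500
print(total_cost(direct, indirect))           # 150000.0
print(unit_cost(direct, indirect, quantity))  # 300.0
```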

  13. Theory Creation, Modification, and Testing: An Information-Processing Model and Theory of the Anticipated and Unanticipated Consequences of Research and Development

    Science.gov (United States)

    Perla, Rocco J.; Carifio, James

    2011-01-01

    Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporates and formalizes both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…

  14. Modular quantum-information processing by dissipation

    Science.gov (United States)

    Marshall, Jeffrey; Campos Venuti, Lorenzo; Zanardi, Paolo

    2016-11-01

    Dissipation can be used as a resource to control and simulate quantum systems. We discuss a modular model based on fast dissipation capable of performing universal quantum computation, and simulating arbitrary Lindbladian dynamics. The model consists of a network of elementary dissipation-generated modules and it is in principle scalable. In particular, we demonstrate the ability to dissipatively prepare all single-qubit gates, and the controlled-not gate; prerequisites for universal quantum computing. We also show a way to implement a type of quantum memory in a dissipative environment, whereby we can arbitrarily control the loss in both coherence, and concurrence, over the evolution. Moreover, our dissipation-assisted modular construction exhibits a degree of inbuilt robustness to Hamiltonian and, indeed, Lindbladian errors, and as such is of potential practical relevance.

  15. Information model construction of MES oriented to mechanical blanking workshop

    Science.gov (United States)

    Wang, Jin-bo; Wang, Jin-ye; Yue, Yan-fang; Yao, Xue-min

    2016-11-01

    Manufacturing Execution System (MES) is one of the crucial technologies for implementing informatization management in manufacturing enterprises, and the construction of its information model is the basis of MES database development. Based on an analysis of the manufacturing process information in a mechanical blanking workshop and the information requirements of each MES function module, the IDEF1X method was adopted to construct the information model of an MES oriented to the mechanical blanking workshop. A detailed description of the data structures included in each MES function module and their logical relationships is given from the point of view of information relationships, which lays the foundation for the design of the MES database.

  16. Quantum information processing and relativistic quantum fields

    Science.gov (United States)

    Benincasa, Dionigi M. T.; Borsten, Leron; Buck, Michel; Dowker, Fay

    2014-04-01

    It is shown that an ideal measurement of a one-particle wave packet state of a relativistic quantum field in Minkowski spacetime enables superluminal signalling. The result holds for a measurement that takes place over an intervention region in spacetime whose extent in time in some frame is longer than the light-crossing time of the packet in that frame. Moreover, these results are shown to apply not only to ideal measurements but also to unitary transformations that rotate two orthogonal one-particle states into each other. In light of these observations, possible restrictions on the allowed types of intervention are considered. A more physical approach to such questions is to construct explicit models of the interventions as interactions between the field and other quantum systems such as detectors. The prototypical Unruh-DeWitt detector couples to the field operator itself and so most likely respects relativistic causality. On the other hand, detector models which couple to a finite set of frequencies of field modes are shown to lead to superluminal signalling. Such detectors do, however, provide successful phenomenological models of atom-qubits interacting with quantum fields in a cavity but are valid only on time scales many orders of magnitude larger than the light-crossing time of the cavity.

  17. [EMDR and adaptive information processing. Psychotherapy as a stimulation of the self-reparative psychological processes].

    Science.gov (United States)

    Fernandez, Isabel; Giovannozzi, Gabriella

    2012-01-01

    Based on the concept of the traumatic event, the model of adaptive information processing is described to illustrate how EMDR is applied to reprocess trauma and resolve post-traumatic psychopathology. The eight phases of EMDR treatment are presented, together with the way an EMDR session is conducted and the contribution and innovation that EMDR represents in the field of therapy of post-traumatic states and its applicability to other symptomatic conditions.

  18. Modeling of the Hydroentanglement Process

    Directory of Open Access Journals (Sweden)

    Ping Xiang

    2006-11-01

    Full Text Available Mechanical performance of hydroentangled nonwovens is determined by the degree of fiber entanglement, which depends on parameters of the fibers, fiberweb, forming surface, water jet and the process speed. This paper develops a computational fluid dynamics model of the hydroentanglement process. Extensive comparison with experimental data showed that the degree of fiber entanglement is linearly related to flow vorticity in the fiberweb, which is induced by the impinging water jets. The fiberweb is modeled as a porous material of uniform porosity, and the actual geometry of the forming wires is accounted for in the model. Simulation results are compared with experimental data for a Perfojet® sleeve and four woven forming surfaces. Additionally, the model is used to predict the effect of fiberweb thickness on the degree of fiber entanglement for different forming surfaces.

  19. [Documentation and information processing in clinical anesthesia].

    Science.gov (United States)

    Hofmockel, R; Benad, G; Rakow, W

    1995-01-01

    Whereas anaesthesia recording was already introduced by Codman and Cushing in 1894 and is now stipulated by law, work registration in anaesthesia and work comparison between departments of anaesthesia have only been built up in the last ten years. Based on the rapid development of computer technology specific solutions were found, ranging from simple software developments to complex on-line monitor systems. The basic data bank published by the German Society of Anaesthesiology and Intensive Medicine represents a minimum of data, which can be added to at will. We report on our new anaesthesia record and corresponding software with computer supported manual data registration, which was developed at our clinic, taking into account state-of-the-art data processing and quality assurance. A new development, which includes on-line data processing with automatic data registration followed by graphic or numeric data print-out, is now presented by Datex. In combination with manual data input of all parameters of work registration, complete automatic anaesthesia recording can be achieved. The authors report on their first own experience with this system.

  20. A Model for Information Integration Using Service Oriented Architecture

    Directory of Open Access Journals (Sweden)

    C. Punitha Devi

    2014-06-01

    Full Text Available Business agility remains the keyword that drives business in different directions, enabling a 360-degree shift in the business process. To achieve agility, an organization should work on real-time information and data. The need for instant access to information appears to be a perennial requirement of all organizations and enterprises. Access to information does not come directly from a single query but from a complex process termed information integration. Information integration has existed for the past two decades and has been progressing ever since; its challenges and issues persist as the information integration problem evolves. This paper addresses the issues in the approaches, techniques and models pertaining to information integration and identifies the need for a complete model. As SOA is the architectural style that is changing business patterns today, this paper proposes a service oriented model for information integration. The model mainly focuses on giving a complete structure for information integration that is adaptable to any environment and open in nature. Here, information is converted into services, and the information services are integrated through service oriented integration to provide the integrated information, also as a service.

  1. Modified Claus process probabilistic model

    Energy Technology Data Exchange (ETDEWEB)

    Larraz Mora, R. [Chemical Engineering Dept., Univ. of La Laguna (Spain)

    2006-03-15

    A model is proposed for the simulation of an industrial Claus unit with a straight-through configuration and two catalytic reactors. Process plant design evaluations based on deterministic calculations do not take into account the uncertainties that are associated with the different input variables. A probabilistic simulation method was applied in the Claus model to obtain an impression of how some of these inaccuracies influence plant performance. (orig.)

  2. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality, and we also focus on the specific components.

  3. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing...... are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...

  4. A Policy Model for Secure Information Flow

    Science.gov (United States)

    Adetoye, Adedayo O.; Badii, Atta

    When a computer program requires legitimate access to confidential data, the question arises whether such a program may illegally reveal sensitive information. This paper proposes a policy model to specify what information flow is permitted in a computational system. The security definition, which is based on a general notion of information lattices, allows various representations of information to be used in the enforcement of secure information flow in deterministic or nondeterministic systems. A flexible semantics-based analysis technique is presented, which uses the input-output relational model induced by an attacker’s observational power, to compute the information released by the computational system. An illustrative attacker model demonstrates the use of the technique to develop a termination-sensitive analysis. The technique allows the development of various information flow analyses, parametrised by the attacker’s observational power, which can be used to enforce what declassification policies.
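The lattice-based flow policy described above can be illustrated with the simplest two-point lattice, Low below High; the labels and helper names below are assumptions for illustration, not the paper's formalism.

```python
# Two-point security lattice: information may flow only upward (Low -> High).

from functools import reduce

LEVELS = {"Low": 0, "High": 1}

def leq(a, b):
    """True if label a is below-or-equal to b, i.e. a may flow to b."""
    return LEVELS[a] <= LEVELS[b]

def join(a, b):
    """Least upper bound of two labels."""
    return a if LEVELS[a] >= LEVELS[b] else b

def label_of_expr(labels):
    # An expression's label is the join of its operands' labels.
    return reduce(join, labels, "Low")

# Assigning an expression that touches High data to a Low sink is rejected.
expr = label_of_expr(["Low", "High"])
print(leq(expr, "Low"))   # False: this flow would leak High data
print(leq(expr, "High"))  # True: flowing to a High sink is fine
```

Richer lattices (e.g. per-principal labels) plug in the same way: only `LEVELS`, `leq` and `join` change, and a declassification policy would relax `leq` at designated points.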

  5. The Institutionalization of Scientific Information: A Scientometric Model (ISI-S Model).

    Science.gov (United States)

    Vinkler, Peter

    2002-01-01

    Introduces a scientometric model (ISI-S model) for describing the institutionalization process of scientific information. ISI-S describes the information and knowledge systems of scientific publications as a global network of interdependent information and knowledge clusters that are dynamically changing by their content and size. (Author/LRW)

  6. Delayed early proprioceptive information processing in schizophrenia.

    Science.gov (United States)

    Arnfred, Sidse M; Hemmingsen, Ralf P; Parnas, Josef

    2006-12-01

    It was first suggested that disordered proprioception was a core feature of schizophrenia by Sandor Rado in 1953. Using a recently designed proprioceptive event-related potential paradigm based on a change of load, we studied 12 unmedicated male out-patients with schizophrenia and 24 controls. In the patients, the early contralateral parietal activity was delayed and later central activity had increased amplitude, but gating was unaffected. The results could be understood within the "deficiency of corollary discharge" model of schizophrenia but not within the "filtering" theory. Further studies, including psychiatric controls, are necessary to verify the specificity of the abnormality.

  7. Delayed early proprioceptive information processing in schizophrenia

    DEFF Research Database (Denmark)

    Arnfred, Sidse M; Hemmingsen, RP; Parnas, Josef;

    2006-01-01

    It was first suggested that disordered proprioception was a core feature of schizophrenia by Sandor Rado in 1953. Using a recently designed proprioceptive event-related potential paradigm based on a change of load, we studied 12 unmedicated male out-patients with schizophrenia and 24 controls....... In the patients, the early contralateral parietal activity was delayed and later central activity had increased amplitude, but gating was unaffected. The results could be understood within the "deficiency of corollary discharge" model of schizophrenia but not within the "filtering" theory. Further studies...

  9. Quantum information processing with mesoscopic photonic states

    DEFF Research Database (Denmark)

    Madsen, Lars Skovgaard

    2012-01-01

    The thesis is built up around a versatile optical experimental setup based on a laser, two optical parametric amplifiers, a few sets of modulators and two sets of homodyne detectors, which together with passive linear optics generate, process and characterize various types of Gaussian quantum states...... in the mixture of coherent states. Further we investigate the robustness of the discord of a broader range of states and suggest a toolbox of states which can be used to test if a protocol is discord based, before performing a rigid proof. Gaussian quantum key distribution can be implemented with current....... Using this setup we have experimentally and theoretically investigated Gaussian quantum discord, continuous variable quantum key distribution and quantum polarization. The Gaussian discord broadens the definition of non-classical correlations from entanglement, to all types of correlations which cannot

  10. Quantum Information Processing and Relativistic Quantum Fields

    CERN Document Server

    Benincasa, Dionigi M T; Buck, Michel; Dowker, Fay

    2014-01-01

    It is shown that an ideal measurement of a one-particle wave packet state of a relativistic quantum field in Minkowski spacetime enables superluminal signalling. The result holds for a measurement that takes place over an intervention region in spacetime whose extent in time in some frame is longer than the light-crossing time of the packet in that frame. Moreover, these results are shown to apply not only to ideal measurements but also to unitary transformations that rotate two orthogonal one-particle states into each other. In light of these observations, possible restrictions on the allowed types of intervention are considered. A more physical approach to such questions is to construct explicit models of the interventions as interactions between the field and other quantum systems such as detectors. The prototypical Unruh-DeWitt detector couples to the field operator itself and so most likely respects relativistic causality. On the other hand, detector models which couple to a finite set of frequencies of ...

  11. Multiscale information modelling for heart morphogenesis

    Science.gov (United States)

    Abdulla, T.; Imms, R.; Schleich, J. M.; Summers, R.

    2010-07-01

    Science is made feasible by the adoption of common systems of units. As research has become more data intensive, especially in the biomedical domain, it requires the adoption of a common system of information models, to make explicit the relationship between one set of data and another, regardless of format. This is being realised through the OBO Foundry to develop a suite of reference ontologies, and NCBO Bioportal to provide services to integrate biomedical resources and functionality to visualise and create mappings between ontology terms. Biomedical experts tend to be focused at one level of spatial scale, be it biochemistry, cell biology, or anatomy. Likewise, the ontologies they use tend to be focused at a particular level of scale. There is increasing interest in a multiscale systems approach, which attempts to integrate between different levels of scale to gain understanding of emergent effects. This is a return to physiological medicine with a computational emphasis, exemplified by the worldwide Physiome initiative, and the European Union funded Network of Excellence in the Virtual Physiological Human. However, little work has been done on how information modelling itself may be tailored to a multiscale systems approach. We demonstrate how this can be done for the complex process of heart morphogenesis, which requires multiscale understanding in both time and spatial domains. Such an effort enables the integration of multiscale metrology.

  12. Research of Home Information Technology Adoption Model

    Institute of Scientific and Technical Information of China (English)

    Ao Shan; Ren Weiyin; Lin Peishan; Tang Shoulian

    2008-01-01

    The Information Technology at Home has caught the attention of various industries such as IT, Home Appliances, Communication, and Real Estate. Based on information technology acceptance theories and family consumption behavior theories, this study summarized and analyzed four key belief variables, i.e. Perceived Value, Perceived Risk, Perceived Cost and Perceived Ease of Use, which influence the acceptance of home information technology. The study also summarizes three groups of external variables: social, industrial, and family influence factors. The social influence factors include Subjective Norm; the industry factors include the Unification of Home Information Technological Standards, the Perfection of the Home Information Industry Value Chain, and the Competitiveness of the Home Information Industry; and the family factors include Family Income, Family Life Cycle and Family Educational Level. The study discusses the relationships among these external variables and the cognitive variables, and provides a Home Information Technology Acceptance Model based on the Technology Acceptance Model and the characteristics of home information technology consumption.

  13. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip;

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent...

  14. Probabilistic Modeling in Dynamic Information Retrieval

    OpenAIRE

    Sloan, M. C.

    2016-01-01

    Dynamic modeling is used to design systems that are adaptive to their changing environment and is currently poorly understood in information retrieval systems. Common elements in the information retrieval methodology, such as documents, relevance, users and tasks, are dynamic entities that may evolve over the course of several interactions, which is increasingly captured in search log datasets. Conventional frameworks and models in information retrieval treat these elements as static, or only...

  15. Modeling and Analysing Information Processing in Safety Operation of Quayside Container Crane

    Institute of Scientific and Technical Information of China (English)

    潘洋; 梁承姬; 郑惠强

    2013-01-01

    To analyze the mechanism of safe operation during quayside container crane (QC) loading/unloading, a set of descriptive models based on Information Theory was built and quantitative calculations were made. The loading/unloading process was first summarized into 11 operational stages. Treating the human brain as a communication channel, a model of information transmission and conversion was established for each stage of the QC driver's work, based on the communication system model. A stimulus-response flow diagram of the whole loading/unloading process was then drawn on the principle of the Gantt chart, and the information content I(xi) of each stimulus event, the source entropy H(X), and the information transmission HT were calculated using the theory and methods of Information Theory. Finally, targeted accident prevention measures for each stage were proposed according to the descriptive models and the quantitative analysis.
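    The quantities named in this abstract, the information content I(xi) and the source entropy H(X), follow the standard Shannon definitions. A minimal sketch, using an illustrative probability distribution rather than values from the paper:

```python
import math

def information_content(p):
    """Shannon information content of an event with probability p, in bits: I(x) = -log2 p."""
    return -math.log2(p)

def source_entropy(probs):
    """Source entropy H(X) = -sum(p * log2 p) over a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative distribution over four stimulus events (hypothetical values).
probs = [0.5, 0.25, 0.125, 0.125]

for i, p in enumerate(probs, start=1):
    print(f"I(x{i}) = {information_content(p):.3f} bits")

print(f"H(X) = {source_entropy(probs):.3f} bits")  # prints H(X) = 1.750 bits
```

    The entropy is the probability-weighted average of the individual information contents, so rare events (small p) contribute large I(xi) but little to H(X).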

  16. Building information modeling based on intelligent parametric technology

    Institute of Scientific and Technical Information of China (English)

    ZENG Xudong; TAN Jie

    2007-01-01

    In order to push the information organization process of the building industry, promote sustainable architectural design and enhance the competitiveness of China's building industry, the author studies building information modeling (BIM) based on intelligent parametric modeling technology. Building information modeling is a new technology in the field of computer aided architectural design, which contains not only geometric data, but also the great amount of engineering data throughout the lifecycle of a building. The author also compares BIM technology with two-dimensional CAD technology, and demonstrates the advantages and characteristics of intelligent parametric modeling technology. Building information modeling, which is based on intelligent parametric modeling technology, will certainly replace traditional computer aided architectural design and become the new driving force to push forward China's building industry in this information age.

  17. Complete integrability of information processing by biochemical reactions

    Science.gov (United States)

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-11-01

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.

  18. FINANCIAL MARKET MODEL WITH INFLUENTIAL INFORMED INVESTORS

    OpenAIRE

    AXEL GRORUD; MONIQUE PONTIER

    2005-01-01

    We develop a financial model with an "influential informed" investor who has additional information and influences asset prices by means of his strategy. The price dynamics are supposed to be driven by a Brownian motion; the informed investor's strategies affect the risky asset trends and the interest rate. Our paper could be seen as an extension of Cuoco and Cvitanic's work [4] since, like these authors, we solve the informed influential investor's optimization problem. But our main result...

  19. Seeking and processing information for health decisions among elderly Chinese Singaporean women.

    Science.gov (United States)

    Chang, Leanne; Basnyat, Iccha; Teo, Daniel

    2014-01-01

    Information behavior includes activities of active information seeking, passive acquisition of information, and information use. Guided by the Elaboration Likelihood Model, this study explored elderly Singaporean women's health information behavior to understand how they sought, evaluated, and used health information in their everyday lives. Twenty-two in-depth interviews were conducted with elderly Chinese women aged 61 to 79. Qualitative analysis of the interview data yielded three meta-themes: information-seeking patterns, trustworthiness of health information, and the peripheral route of decision making. Results revealed that elderly women took both systematic and heuristic approaches to processing information but relied on interpersonal networks to negotiate health choices.

  20. Network Model Building (Process Mapping)

    OpenAIRE

    Blau, Gary; Yih, Yuehwern

    2004-01-01

    12 slides. Provider notes: see Project Planning Video (Windows Media). Posted at the bottom are Gary Blau's slides. Before watching, please note that "process mapping" and "modeling" are mentioned in the video and notes; here they are meant to refer to the NSCORT "project plan".

  1. An integrated approach for modelling of aircraft maintenance processes

    Directory of Open Access Journals (Sweden)

    D. Yu. Kiselev

    2015-01-01

    Full Text Available The paper deals with modeling of the processes of maintenance and repair of aircraft. The role of information in improving the effectiveness of maintenance systems is described. The methodology for functional modelling of maintenance processes is given. A simulation model is used for modelling possible changes.

  2. Photonic Architecture for Scalable Quantum Information Processing in Diamond

    Directory of Open Access Journals (Sweden)

    Kae Nemoto

    2014-08-01

    Full Text Available Physics and information are intimately connected, and the ultimate information processing devices will be those that harness the principles of quantum mechanics. Many physical systems have been identified as candidates for quantum information processing, but none of them are immune from errors. The challenge remains to find a path from the experiments of today to a reliable and scalable quantum computer. Here, we develop an architecture based on a simple module comprising an optical cavity containing a single negatively charged nitrogen vacancy center in diamond. Modules are connected by photons propagating in a fiber-optical network and collectively used to generate a topological cluster state, a robust substrate for quantum information processing. In principle, all processes in the architecture can be deterministic, but current limitations lead to processes that are probabilistic but heralded. We find that the architecture enables large-scale quantum information processing with existing technology.

  3. Directory of Energy Information Administration models 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This directory revises and updates the Directory of Energy Information Administration Models 1995, DOE/EIA-0293(95), Energy Information Administration (EIA), U.S. Department of Energy, July 1995. Four models have been deleted in this directory as they are no longer being used: (1) Market Penetration Model for Ground-Water Heat Pump Systems (MPGWHP); (2) Market Penetration Model for Residential Rooftop PV Systems (MPRESPV-PC); (3) Market Penetration Model for Active and Passive Solar Technologies (MPSOLARPC); and (4) Revenue Requirements Modeling System (RRMS).

  4. A New Technique For Information Processing of CLIC Technical Documentation

    CERN Document Server

    Tzermpinos, Konstantinos

    2013-01-01

    The scientific work presented in this paper could be described as a novel, systemic approach to the organization of CLIC documentation, that is, the processing of various sets of archived data found on various CERN archiving services in a more friendly and organized way. From a physics standpoint, this is equivalent to an initial system characterized by high entropy which, after some transformation of energy and matter, produces a final system of reduced entropy. However, this reduction in entropy can be considered valid for open systems only, which are sub-systems of grander isolated systems in which the total entropy will always increase. Thus, using basic elements from information theory, systems theory and thermodynamics, the unorganized data pending organization into a higher form is modeled as an initial open sub-system with increased entropy which, after the processing of information, will produce a final system with decreased entropy. This systemic approach to the ...

  5. Living is information processing: from molecules to global systems.

    Science.gov (United States)

    Farnsworth, Keith D; Nelson, John; Gershenson, Carlos

    2013-06-01

    We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function: to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, information at lower levels to appear at higher levels in complex systems (emergence). We show how information patterns are united by the creation of mutual context, generating persistent consequences, to result in 'functional information'. This constructive process forms arbitrarily large complexes of information, the combined effects of which include the functions of life. Molecules and simple organisms have already been measured in terms of functional information content; we show how quantification may be extended to each level of organisation up to the ecological. In terms of a computer analogy, life is both the data and the program, and its biochemical structure is the way the information is embodied. This idea supports the seamless integration of life at all scales with the physical universe. The innovation reported here is essentially to integrate these ideas, basing information on the 'general definition' of information, rather than simply the statistics of information, thereby explaining how functional information operates throughout life.

  6. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    Full Text Available This paper presents a new approach to translate between Building Information Modeling (BIM and Building Energy Modeling (BEM that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1 the BIM-based Modelica models generated from Revit2Modelica and (2 BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1 enables BIM models to be translated into ModelicaBEM models, (2 enables system interface development based on the MVD for thermal simulation, and (3 facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  7. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  8. Incorporating evolutionary processes into population viability models.

    Science.gov (United States)

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence.

  9. Modeling of the reburning process

    Energy Technology Data Exchange (ETDEWEB)

    Rota, R.; Bonini, F.; Servida, A.; Morbidelli, M.; Carra, S. [Politecnico di Milano, Milano (Italy). Dip. di Chimica Fisica Applicata

    1997-07-01

    Reburning has become a popular method of abating NO{sub x} emission in power plants. Its effectiveness is strongly affected by the interaction between gas phase chemistry and combustion chamber fluid dynamics. Both the mixing of the reactant streams and the elementary reactions in the gas phase control the overall kinetics of the process. This work developed a model coupling a detailed kinetic mechanism to a simplified description of the fluid dynamics of the reburning chamber. The model was checked with reference to experimental data from the literature. Detailed kinetic modeling was found to be essential to describe the reburning process, since the fluid dynamics of the reactor have a strong influence on reactions within. 20 refs., 9 figs., 3 tabs.

  10. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the application of security policy models, we are concerned with the implementation of a certain policy, taking into consideration the required standards, through the attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. On this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer, who strives to keep technologies up to date in order to meet all-inclusive business continuity planning policies. For such a policy to be effective, it has to be fully embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo City. This sampling was based on the representativeness of these industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of the security policy: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. Analysis of the Cronbach's alpha structure of these security items attests not only that the capacity of those industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  11. Information criteria for astrophysical model selection

    CERN Document Server

    Liddle, A R

    2007-01-01

    Model selection is the problem of distinguishing competing models, perhaps featuring different numbers of parameters. The statistics literature contains two distinct sets of tools: those based on information theory, such as the Akaike Information Criterion (AIC), and those based on Bayesian inference, such as the Bayesian evidence and the Bayesian Information Criterion (BIC). The Deviance Information Criterion combines ideas from both heritages; it is readily computed from Monte Carlo posterior samples and, unlike the AIC and BIC, allows for parameter degeneracy. I describe the properties of the information criteria and, as an example, compute them from WMAP3 data for several cosmological models. I find that at present the information theory and Bayesian approaches give significantly different conclusions from that data.
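    The AIC and BIC mentioned above have simple closed forms in terms of the maximized log-likelihood ln L_max, the number of free parameters k, and the number of data points N. A minimal sketch with illustrative (made-up) fit results:

```python
import math

def aic(ln_L_max, k):
    """Akaike Information Criterion: AIC = 2k - 2 ln L_max."""
    return 2 * k - 2 * ln_L_max

def bic(ln_L_max, k, n):
    """Bayesian Information Criterion: BIC = k ln N - 2 ln L_max."""
    return k * math.log(n) - 2 * ln_L_max

# Hypothetical results for two competing models fitted to the same N = 100 points.
ln_L_a, k_a = -120.0, 3   # simpler model, slightly worse fit
ln_L_b, k_b = -118.5, 6   # more parameters, slightly better fit

print(f"Model A: AIC = {aic(ln_L_a, k_a):.1f}, BIC = {bic(ln_L_a, k_a, 100):.1f}")
print(f"Model B: AIC = {aic(ln_L_b, k_b):.1f}, BIC = {bic(ln_L_b, k_b, 100):.1f}")
# Lower values are preferred; note that BIC penalizes Model B's extra parameters more
# heavily than AIC does once N is moderately large (ln N > 2).
```

    Both criteria reward goodness of fit through ln L_max and penalize model complexity through k; they differ only in the strength of that penalty, which is one reason the two approaches can reach different conclusions on the same data.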

  12. Improving the process of process modelling by the use of domain process patterns

    Science.gov (United States)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  13. Growth and Visual Information Processing in Infants in Southern Ethiopia

    Science.gov (United States)

    Kennedy, Tay; Thomas, David G.; Woltamo, Tesfaye; Abebe, Yewelsew; Hubbs-Tait, Laura; Sykova, Vladimira; Stoecker, Barbara J.; Hambidge, K. Michael

    2008-01-01

    Speed of information processing and recognition memory can be assessed in infants using a visual information processing (VIP) paradigm. In a sample of 100 infants 6-8 months of age from Southern Ethiopia, we assessed relations between growth and VIP. The 69 infants who completed the VIP protocol had a mean weight z score of -1.12 plus or minus…

  14. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Full Text Available Over the last three decades, China’s agriculture sector has been transformed from the traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs. Information processing and dissemination have played a critical role in this transformation process. Many studies in relation to agriculture information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT based information dissemination models in China and to share the knowledge and experience in applying emerging ICTs in disseminating agriculture information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China’s agricultural information dissemination systems and different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed. Success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China’s experience and best practice in their endeavor of applying emerging ICTs in agriculture information dissemination and knowledge transfer.

  15. Information architecture: Standards adoption and retirement process service action plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this Service Action Plan is to announce, as well as provide, a high-level outline of a new Departmental process for the adoption and retirement of information technology standards. This process supports the implementation of a Department of Energy (DOE) Information Architecture. This plan was prepared with the Department of Energy information technology standards customers and stakeholders in mind. The process described in this plan will be serviced primarily by staff from the Office of the Deputy Assistant Secretary for Information Management with assistance from designated program and site Information Technology Standards Points of Contact. We welcome any comments regarding this new Departmental process and encourage the proposal of information technology standards for adoption or retirement.

  17. Preliminary Investigation on Information Processing for the Weak Observation Problem

    Directory of Open Access Journals (Sweden)

    Hu Wei-dong

    2014-08-01

    Full Text Available Detection systems increasingly face a common information processing problem owing to low SNR, low data rate, low resolution, and low information dimensions; this is called the weak observation problem. This paper analyzes its origin and proposes the concept of information assembling, which uses the repetition and prediction properties of the object of interest. A probabilistic cloud inference method based on Bayesian theory is then proposed to address weak observation problems such as state estimation. Finally, several new requirements for sensor design, information processing, and system control, three crucial factors in information system design, are discussed.

  18. Design of a Multi-Dimensional Information Network Data Warehouse Model for Online Graph Processing

    Institute of Scientific and Technical Information of China (English)

    聂章艳; 李川; 唐常杰; 徐洪宇; 张永辉; 杨宁

    2014-01-01

    With the emergence of information networks, information has evolved from simple numerical data into complex graph-structured networks, and how to organize and store graph-based information network data has become an urgent problem. Using a dimensional modeling approach, this paper proposes a multi-dimensional information network data warehouse model (MINDM), which aims to provide the data foundation for online graph processing. The MINDM consists of an edge fact table, a node fact table, an information-dimension link attribute table, and a topology-dimension node attribute table. The experimental results show that the MINDM eliminates redundancy, reduces average query time, and saves storage space compared with a universal relation table. On a real dataset of 12,500 ACM papers, the model's query time remains stable at a few tens of milliseconds, about an order of magnitude less than that of the universal relation table, which takes hundreds of milliseconds for the same processing. In terms of space, as the number of papers grows, the storage cost of the proposed model increases much more slowly than that of the universal relation table.
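    The four-table layout this abstract describes (edge fact, node fact, information-dimension attributes, topology-dimension attributes) can be sketched as a relational schema. The column names below are illustrative assumptions, not taken from the paper:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical MINDM-style four-table layout (all column names are assumptions).
cur.executescript("""
CREATE TABLE node_fact (
    node_id     INTEGER PRIMARY KEY,
    node_type   TEXT NOT NULL            -- e.g. 'paper', 'author'
);
CREATE TABLE edge_fact (
    edge_id     INTEGER PRIMARY KEY,
    src_node_id INTEGER NOT NULL REFERENCES node_fact(node_id),
    dst_node_id INTEGER NOT NULL REFERENCES node_fact(node_id),
    edge_type   TEXT NOT NULL            -- e.g. 'cites', 'written_by'
);
-- Information dimension: descriptive attributes linked to nodes.
CREATE TABLE info_dim_link (
    node_id     INTEGER NOT NULL REFERENCES node_fact(node_id),
    attr_name   TEXT NOT NULL,           -- e.g. 'title', 'year'
    attr_value  TEXT
);
-- Topology dimension: structural attributes of nodes.
CREATE TABLE topo_dim_node (
    node_id     INTEGER NOT NULL REFERENCES node_fact(node_id),
    in_degree   INTEGER,
    out_degree  INTEGER
);
""")

# Tiny example: two papers and one citation edge.
cur.execute("INSERT INTO node_fact VALUES (1, 'paper'), (2, 'paper')")
cur.execute("INSERT INTO edge_fact VALUES (1, 1, 2, 'cites')")
cur.execute("SELECT COUNT(*) FROM edge_fact WHERE edge_type = 'cites'")
print(cur.fetchone()[0])  # prints 1
```

    Separating the graph topology (fact tables) from node attributes (dimension tables) is what lets this design avoid the redundancy of a single wide universal relation table.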

  19. Managing Event Information Modeling, Retrieval, and Applications

    CERN Document Server

    Gupta, Amarnath

    2011-01-01

    With the proliferation of citizen reporting, smart mobile devices, and social media, an increasing number of people are beginning to generate information about events they observe and participate in. A significant fraction of this information contains multimedia data to share the experience with their audience. A systematic information modeling and management framework is necessary to capture this widely heterogeneous, schemaless, potentially humongous information produced by many different people. This book is an attempt to examine the modeling, storage, querying, and applications of such an

  20. Information processing speed in obstructive sleep apnea syndrome: a review.

    Science.gov (United States)

    Kilpinen, R; Saunamäki, T; Jehkonen, M

    2014-04-01

    To provide a comprehensive review of studies on information processing speed in patients with obstructive sleep apnea syndrome (OSAS) as compared to healthy controls and normative data, and to determine whether continuous positive airway pressure (CPAP) treatment improves information processing speed. A systematic review was performed on studies drawn from Medline and PsycINFO (January 1990-December 2011) and identified from lists of references in these studies. After inclusion criteria, 159 articles were left for abstract review, and after exclusion criteria 44 articles were fully reviewed. The number of patients in the studies reviewed ranged from 10 to 157 and the study samples consisted mainly of men. Half of the studies reported that patients with OSAS showed reduced information processing speed when compared to healthy controls. Reduced information processing speed was seen more often (75%) when compared to norm-referenced data. Psychomotor speed seemed to be particularly liable to change. CPAP treatment improved processing speed, but the improvement was marginal when compared to placebo or conservative treatment. Patients with OSAS are affected by reduced information processing speed, which may persist despite CPAP treatment. Information processing is usually assessed as part of other cognitive functioning, not as a cognitive domain per se. However, it is important to take account of information processing speed when assessing other aspects of cognitive functioning. This will make it possible to determine whether cognitive decline in patients with OSAS is based on lower-level or higher-level cognitive processes or both.