Fagg, G.E.; Moore, K. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.]; Geist, A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.]
SNIPE is a metacomputing system that aims to provide a reliable, secure, fault-tolerant environment for long-term distributed computing applications and data stores across the global Internet. The system combines global naming with replication of both processing and data to support large-scale information processing applications, yielding better availability and reliability than typical cluster computing and/or distributed computing environments currently provide.
Policy makers must consider the workforce, technology, cost, and legal implications of their legislative proposals. AHIMA, AAMT, CHIA, and MTIA urge lawmakers to craft regulatory solutions that enforce HIPAA and support advancements in modern health information processing practices that improve the quality and reduce the cost of healthcare. We also urge increased investment in health information workforce development and in the implementation of new technologies that advance critical healthcare outcomes: timely, accurate, accessible, and secure information to support patient care. It is essential that state legislatures reinforce the importance of improving information processing solutions for healthcare and not take actions that will produce unintended and detrimental consequences.
Modjaev, A. D.; Leonova, N. M.
In recent years, a new scientific branch connected with the management of the social sphere, called "social cybernetics", has been developing intensively. Within this branch, the theory and methods of social-sphere management are being formed, with considerable attention paid to management in real time. However, the solution of such management tasks is largely constrained by the absence, or insufficiently deep study, of the relevant sections of the theory and methods of management. The article discusses the use of cybernetic principles in solving control problems in social systems. Applied to educational activity, a model of composite interrelated objects representing the behaviour of students at various stages of the educational process is introduced. Statistical processing of experimental data obtained during the actual learning process is carried out. When the number of features is increased, additionally taking into account the degree and nature of variability in students' current progress across various types of study, new properties of student groupings are discovered. L-clusters were identified that reflect the behaviour of learners with similar characteristics during lectures. It was established that the characteristics of the clusters contain information about the dynamics of learners' behaviour, allowing them to be used in additional lessons. Approaches to solving the problem of adaptive control based on the identified dynamic characteristics of the learners are outlined.
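The abstract does not name the grouping algorithm behind the L-clusters; a minimal sketch of the general idea, assuming plain k-means over two invented progress features (mean current grade and its variability across lectures), might look like this. All data and feature names below are hypothetical.

```python
# Hypothetical sketch: grouping students into clusters by features of
# their current progress. Plain k-means is used purely for illustration;
# the article's actual method and feature set are not specified here.
import random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)           # random initial centers
    for _ in range(iters):
        # Assign each point to its nearest center.
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # Recompute centers as the mean of each group.
        centers = [
            tuple(sum(xs) / len(xs) for xs in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

# Each student: (mean current grade, variability of grades across lectures)
students = [(4.5, 0.2), (4.4, 0.3), (2.1, 1.1), (2.3, 0.9), (3.0, 0.5)]
centers, groups = kmeans(students, k=2)
print(len(groups))  # → 2
```

Each resulting group collects learners with similar progress dynamics, which is the kind of structure the article exploits for adaptive control.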
Sensemaking is a popular and useful organizational behavior concept that is gaining visibility in the field of information systems. However, it remains relatively unknown compared to more established information systems concepts such as technology acceptance and resistance. To enhance and propel greater use of sensemaking in information systems, this article offers a systematic explanation of sensemaking, focusing on its concept, process, strengths, and shortcomings, and discussing ways forward for information systems in contemporary business environments.
Oksana S. Savelyeva
The questions of ensuring openness and transparency of the educational process, and of monitoring the provision of educational services and the quality of learning within the unified information environment of Odessa National Polytechnic University, are considered. It is proposed to treat the organization of the educational process as its major component, that is, a system of activities covering the distribution of the academic load between departments, the recruitment of teachers, the formation of class schedules, consultations, final control, and state certification. A set of parameters is analyzed and formed, and the main components of the functional subsystem "The organization of educational process", one of the components of the university's information environment, are identified. Building the system hierarchically ensures effective management of the subsystems of the organization of the educational process and of the interaction between its participants, and allows the system to change quickly when necessary.
Kleinschmidt, Elko; de Brentani, Ulrike; Salomo, Søren
Innovation in its essence is an information processing activity. Thus, a major factor impacting the success of new product development (NPD) programs, especially those responding to global markets, is the firm's ability to access, share and apply NPD information, which is often widely dispersed, functionally, geographically and culturally. To this end, an IT-communication strength is essential, one that is nested in an internal organizational environment that ensures its effective functioning. Using organizational information processing (OIP) theory as a framework, superior global NPD program...
Yuriy F. Telnov
The paper presents a technology for applying a dynamic intelligent process management system to the integrated information-educational environment of a university, providing community access in order to develop flexible education programs and teaching manuals based on a multi-agent and service-oriented architecture. The article describes a prototype of the dynamic intelligent process management system used for forming educational-methodical materials, and evaluates the efficiency of creating and using such a system.
Pereira, Raphael Gomes; Aguiar, Leandro Pfleger de [Siemens Company (Brazil)
With the recent expansion of globalization, the exploration of energy resources is crossing national boundaries, with worldwide companies exploring oil and gas fields wherever in the world they are available. Government bodies should treat information about those fields as a matter of national security interest, bringing adequate management and protection to all important and critical information and assets while preserving freedom and transparency in bidding processes. This creates a complex security context to be managed, where information disclosure might, for instance, compromise the integrity of public auction processes through the use of privileged information. Furthermore, given the threat of terrorism, the process itself becomes an attractive target for different kinds of attacks, motivated by the opportunity to exploit the well-known inability of large industries to manage their large and complex environments. The transformations under way in productive processes, such as growing TCP/IP protocol usage, the adoption of Windows operating systems in SCADA systems, and the integration of industrial and business networks, are factors that contribute to an imminent landscape of problems. This landscape demonstrates the need for organizations and countries engaged in energy resource exploration to renew their risk management areas, establishing a unique and integrated process to protect the information security infrastructure. This work presents a study of the challenges to be faced by organizations while rebuilding their internal processes to integrate the risk management and information security areas, along with a set of essential steps to establish an effective corporate governance of risk management and compliance aspects. Moreover, the work presents the necessary points of government involvement to improve all the regulatory aspects
Victor Ya. Tsvetkov
The article analyzes information space, the information field and the information environment. It shows that information space can be natural or artificial; that the information field is a substantive and processual object that articulates the property of space; and that the information environment is associated with some object, acts as the surrounding in relation to it, and is considered with regard to it. This makes it possible to define the information environment as a subset of information space, which gives its passive description. The information environment can also be defined as a subset of the information field, which corresponds to its active description.
Yuri V. Dragnev
The article examines the information environment as an integral element of information space in the professional development of the future teacher of physical culture. It notes that the strategic objective of the system of higher education is the training of future teachers of physical culture who are competent in the field of information technologies, with information competence and information culture as major components of professionalism in the modern information-oriented society.
Russo-Ponsaran, Nicole; McKown, Clark; Johnson, Jason; Russo, Jaclyn; Crossman, Jacob; Reife, Ilana
Social information processing (SIP) skills are critical for developing and maintaining peer relationships. Building on existing assessment techniques, the Virtual Environment for SIP (VESIP™), a simulation-based assessment that immerses children in social decision-making scenarios, was developed. This study presents preliminary evidence of VESIP's usefulness for measuring SIP skills in children with and without autism spectrum disorders (ASD). Twenty-one children with ASD and 29 control children participated. It was hypothesized that (a) children (8-12 years old), with and without ASD, would understand and interact effectively with VESIP; (b) VESIP scores would be reliable in both populations; and (c) children with ASD would score lower on SIP domains than typically developing peers. Results supported these hypotheses. Finally, response bias was also evaluated, showing that children with ASD have different problem-solving strategies than their peers. VESIP has great potential as a scalable assessment of SIP strengths and challenges in children with and without ASD. Autism Res 2018, 11: 305-317. © 2017 International Society for Autism Research, Wiley Periodicals, Inc. Children with autism spectrum disorders (ASDs) often struggle to interpret and respond to social situations. The present study suggests that an animated, simulation-based assessment approach is an effective way to measure how children with or without ASDs problem-solve challenging social situations. VESIP is an easy-to-use assessment tool that can help practitioners understand a child's particular strengths and weaknesses.
As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of a security software process are discussed, as are the necessity and present significance of using such a process. In coordination with the functional software, the process for security software and its testing are discussed in depth. The process includes requirement analysis, design, coding, debugging and testing, submission, and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper applies the above process to a power information platform.
Pieters, Julius Marie; Limbach, R.; de Jong, Anthonius J.M.
A systematic analysis of the design process of authors of (simulation-based) discovery learning environments was carried out. The analysis aimed at identifying the design activities of authors and categorizing the knowledge gaps that they experience. First, five existing studies were systematically
Vishnyakova, Dina; Gobeill, Julien; Oezdemir-Zaech, Fatma; Kreim, Olivier; Vachon, Therese; Clade, Thierry; Haenning, Xavier; Mikhailov, Dmitri; Ruch, Patrick
We present an electronic capture tool to process informed consents, which must be recorded when running a clinical trial. This tool aims at extracting information expressing the duration of the consent given by the patient to authorize the exploitation of biomarker-related information collected during clinical trials. The system integrates a language detection module (LDM) to route a document to the appropriate information extraction module (IEM). The IEM is based on language-specific sets of linguistic rules for the identification of relevant textual facts. The achieved accuracy of both the LDM and the IEM is 99%. The architecture of the system is described in detail.
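The LDM-plus-IEM architecture described above can be sketched as a small routing pipeline. The actual rule sets and language models of the system are not published here; the function-word detector and the two regular-expression rules below are invented for illustration only.

```python
# Minimal sketch of the described architecture: a language detection
# module (LDM) routes each consent document to a language-specific
# information extraction module (IEM) whose rules capture the stated
# consent duration. All patterns here are hypothetical placeholders.
import re

def detect_language(text):
    # Toy LDM: route on a few French function words. A real LDM would
    # use character n-gram statistics or a trained classifier.
    if re.search(r"\b(le|la|pendant|ans)\b", text, re.IGNORECASE):
        return "fr"
    return "en"

# One rule set per language (each IEM would hold many such rules).
IEM_RULES = {
    "en": re.compile(r"consent .*?for (\d+) years?", re.IGNORECASE),
    "fr": re.compile(r"consentement .*?pendant (\d+) ans", re.IGNORECASE),
}

def extract_duration(text):
    lang = detect_language(text)
    m = IEM_RULES[lang].search(text)
    return (lang, int(m.group(1))) if m else (lang, None)

print(extract_duration(
    "The patient gives consent to biomarker use for 10 years."
))  # → ('en', 10)
```

Routing first and extracting second keeps each rule set monolingual, which is what makes per-language linguistic rules tractable in such a design.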
Significant economic losses, large affected populations, and serious environmental damage caused by recurrent natural disaster events (NDEs) worldwide indicate insufficient emergency preparedness and response. The barrier of full-life-cycle data preparation and information support is one of the main reasons. This paper adopts the method of integrated environmental modeling, incorporates information from existing event protocols, languages, and models, analyzes observation demands from different event stages, and forms the abstract full life cycle natural disaster event metamodel (FLCNDEM) based on the meta-object facility. A task library and knowledge base for floods are then built to instantiate FLCNDEM, forming the FLCNDEM for floods (FLCNDEMF). FLCNDEMF is formalized according to the Event Pattern Markup Language, and a prototype system, Natural Disaster Event Manager, is developed to assist in template-based modeling and management. The flood at Liangzi (LZ) Lake in Hubei, China on 16 July 2010 is adopted to illustrate how to apply FLCNDEM in real scenarios. FLCNDEM-based modeling is realized, and candidate remote sensing (RS) datasets for different observing missions are provided for the LZ Lake flood. Taking the mission of flood area extraction as an example, the appropriate RS data are selected via the simplified general perturbation version 4 model, and the flood areas in different phases are calculated and displayed on the map. The phase-based modeling and visualization intuitively display the spatial-temporal distribution and the evolution process of the LZ Lake flood, which is of great significance for flood response. In addition, through the extension mechanism, FLCNDEM can also be applied in other environmental applications, providing important support for full-life-cycle information sharing and rapid response.
Modern digital media already permeate the physical world. The portability of information devices and the ubiquity of networks allow us to access information practically anyplace, creating digital overlays on reality. This also allows us to bring information we routinely archive in museums and
The photon's mobility makes optical quantum systems ideally suited for delegated quantum computation. I will present results on the realization of a measurement-based quantum network in a client-server environment, where quantum information is securely communicated and computed. Related to measurement-based quantum computing, I will discuss a recent experiment showing that quantum discord can be used as a resource for remote state preparation, which might shed new light on the requirements for quantum-enhanced information processing. Finally, I will briefly review recent photonic quantum simulation experiments on four frustrated Heisenberg-interaction spins and present an outlook on feasible simulation experiments with more complex interactions or random walk structures. As an outlook, I will discuss the current status of new quantum technology for improving the scalability of photonic quantum systems by using superconducting single-photon detectors and tailored light-matter interactions. (author)
Information Sharing Environment — This is a survey of federal departments and agencies who share terrorism information and are therefore considered part of the Information Sharing Environment. The...
Kosarev, Yu G; Gusev, V D
Works are presented on automation systems for editing and publishing operations based on methods of processing symbolic information and information contained in a training sample (ranking of objectives by promise, a classification algorithm for tones and noise). The book will be of interest to specialists in the automation of textual information processing, programming, and pattern recognition.
Information retrieval is a central and essential activity. It is indeed difficult to find a human activity that does not need to retrieve information in an environment that is increasingly digital: moving and navigating, learning, having fun, communicating, informing oneself, making a decision, and so on. Most human activities are intimately linked to our ability to search quickly and effectively for relevant information, and the stakes are sometimes extremely high: passing an exam, voting, finding a job, remaining autonomous, being socially connected, developing a critical spirit, or simply surviving
John Sweller; Susan Sweller
Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...
Leuchs, Gerd; Beth, Thomas
[Front-matter excerpt] 1.5 Simulation of Hamiltonians ... References ... 2 Quantum Information Processing and Error Correction with Jump Codes (G. Alber, M. Mussinger...)
H. Javaheri, T. Nasrabadi, M. H. Jafarian, G. R. Rowshan, H. Khoshnam
Municipal solid waste generation is among the most significant threats to global environmental health. Since an ideal site selection depends on several independent factors concerning land use, socio-economy and hydrogeology, the use of a multi-criteria evaluation method seems inevitable. Using a geographic information system as a tool, in combination with geographical information technology, equips spatial decision support systems for the appropriate siting of sanitary landfills. The present study applies a multi-criteria evaluation method known as weighted linear combination, using geographical information technology as a practical instrument to evaluate the suitability of the vicinity of Giroft city in Kerman province, Iran, for a landfill. Water permeability, slope, distance from rivers, depth of the underground water table, distance from residential areas, distance from generation centers, a general environmental criterion, and distance from roads are the criteria taken into consideration in the analysis. By superposing all of the raster-type layers, including the geomorphologic, hydrologic, humanistic and land use criteria of land suitability, the final zoning into appropriate, fairly appropriate and inappropriate districts is obtained. Considering the relative priority of each criterion in comparison with the others, a specific weight is assigned to each criterion according to its total influence on the whole decision-making process. The results of applying the presented methodology are landfill zones of varying land suitability. Finally, the zones are ranked in descending order to indicate the priority of the different options in the eyes of decision makers. The results of this study may help the policy makers of Giroft city with a variety of options to be considered as sanitary landfill locations.
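The weighted linear combination step described above can be sketched in a few lines: each criterion layer is normalized, then combined with weights that sum to one. The layers, cell values, and weights below are invented for demonstration; the study derives its own weights from the eight criteria it lists.

```python
# Illustrative sketch of weighted linear combination (WLC) site scoring.
# Each criterion raster (flattened here to a list of cells) is scaled to
# [0, 1]; the suitability of a cell is the weighted sum of its scores.
def normalize(layer):
    lo, hi = min(layer), max(layer)
    return [(v - lo) / (hi - lo) for v in layer]

def wlc(layers, weights):
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    norm = [normalize(layer) for layer in layers]
    return [sum(w * col[i] for w, col in zip(weights, norm))
            for i in range(len(layers[0]))]

# Three toy criterion layers over five candidate cells (values invented):
slope          = [2, 10, 25, 5, 15]           # degrees; lower is better
river_dist     = [100, 800, 1500, 400, 900]   # metres; higher is better
residence_dist = [500, 2000, 3000, 1000, 2500]

suitability = wlc(
    [[max(slope) - s for s in slope],  # invert slope so higher = better
     river_dist,
     residence_dist],
    weights=[0.4, 0.3, 0.3],
)
best = max(range(len(suitability)), key=suitability.__getitem__)
```

Note the inversion of "cost" criteria such as slope before combining, so that every normalized layer points in the same "higher is more suitable" direction; this is the standard preparation step in WLC.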
The design of the human environment is to be made with an understanding of human-human and human-environment relations and of the environmental behaviors of human beings, artifacts and natural things, overcoming their differences and contradictions. An information divide exists naturally due to various differences among human beings. Many problems in the area of nuclear energy seem to derive from various differences and contradictions in central-local interests, between the parties concerned and unconcerned, and also in human-artifact relations. In order to harmonize nuclear energy with society, it is necessary to eliminate differences and resolve contradictions by redesigning the environments of those problems in their context. Case studies are highly recommended, with continuous efforts to develop a more universal design methodology. Open access to information and data in science and technology is encouraged in the area of nuclear energy. (T. Tanaka)
Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)
I will briefly explain the definition and advantages of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets for both qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.
Glynn, Pierre D.; Voinov, Alexey A.; Shapiro, Carl D.; White, Paul A.
Our different kinds of minds and types of thinking affect the ways we decide, take action, and cooperate (or not). Derived from these types of minds, innate biases, beliefs, heuristics, and values (BBHV) influence behaviors, often beneficially, when individuals or small groups face immediate, local, acute situations that they and their ancestors faced repeatedly in the past. BBHV, though, need to be recognized and possibly countered or used when facing new, complex issues or situations especially if they need to be managed for the benefit of a wider community, for the longer-term and the larger-scale. Taking BBHV into account, we explain and provide a cyclic science-infused adaptive framework for (1) gaining knowledge of complex systems and (2) improving their management. We explore how this process and framework could improve the governance of science and policy for different types of systems and issues, providing examples in the area of natural resources, hazards, and the environment. Lastly, we suggest that an "Open Traceable Accountable Policy" initiative that followed our suggested adaptive framework could beneficially complement recent Open Data/Model science initiatives.
McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.
Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, a virtual object is projected into the real world with which researchers can interact. There are several limitations to purely VR or AR applications when taken within the context of remote planetary exploration. For example, in a purely VR environment, contents of the planet surface (e.g. rocks, terrain, or other features) must be created off-line from a multitude of images using image processing techniques to generate the 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames will lack 3D visual information, i.e. depth information. In this paper, we present a technique to utilize a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real time into the virtual environment. Notice the preservation of the object
The paper systematizes several theoretical viewpoints on the skill of scientific information processing and decomposes this skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction, and document analysis, were used to build a theoretical framework. Interviews and surveys of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.
Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.
Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity
Quantum processing and communication is emerging as a challenging technique at the beginning of the new millennium. This is an up-to-date insight into the current research of quantum superposition, entanglement, and the quantum measurement process - the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions - written by leading experts - bring together the latest results in quantum information as well as addressing all the relevant questions
This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order...
Dowdeswell, J. A; Scourse, James D
This volume examines the processes responsible for sedimentation in modern glaciomarine environments, and how such modern studies can be used as analogues in the interpretation of ancient glaciomarine sequences...
Patricia Hernández Salazar
Objective. To suggest the use of virtual learning environments as an Information Literacy (IL) alternative. Method. Analysis of the main elements of web sites. To this end, the article covers the relationship between IL and the virtual learning environment (by defining both phrases); the phases for creating virtual IL programs; the processes for elaborating didactic media; the applications that may support this plan; and the description of eleven examples of virtual learning environment IL experiences from four countries (Mexico, the United States of America, Spain and the United Kingdom); these examples fulfill the stated conditions. Results. Four comparative tables were obtained, examining five elements of each experience: objectives; target community; institution; country; and platform used. Conclusions. Any IL proposal should have a clear definition; IL experiences have to follow a systematic didactic process; the described experiences are based on an IL definition; the experiences analyzed are similar; and virtual learning environments can be used as IL alternatives.
Briggs, Andrew; Ferry, David; Stoneham, Marshall
Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.
D'Ariano, Giacomo Mauro
I review some recent advances in foundational research by the Pavia QUIT group. The general idea is that there is only Quantum Theory without quantization rules, and the whole of Physics, including space-time and relativity, is emergent from quantum-information processing. And since Quantum Theory itself is axiomatized solely on informational principles, the whole of Physics must be reformulated in information-theoretical terms: this is the "It from bit" of J. A. Wheeler. The review is divided into four parts: a) the informational axiomatization of Quantum Theory; b) how space-time and relativistic covariance emerge from quantum computation; c) the information-theoretical meaning of inertial mass and of ħ, and how the quantum field emerges; d) an observational consequence of the new quantum field theory: a mass-dependent refraction index of vacuum. I will conclude with the research lines to be followed in the immediate future.
Hartman, Carol R.; Burgess, Ann W.
This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)
Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.
Processes are an integral part of nearly all organizations, driving their daily operations and support activities. Increasingly, these business processes are supported by some information system, e.g. Workflow Management Systems (WfMSs), or Enterprise Resource Planning (ERP) systems. Once a process
Deitel, Harvey M.
An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment. Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers of the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte
Nielsen, Jørgen Lerche; Meyer, Kirsten
In this paper we discuss the opportunities and possibilities the new information environment offers for collaboration and participation in learning processes. The findings are based on four major sources: "Scenarios in computer-mediated and net-based education", CLIENT - Collaborative Learning in an International Environment, "Construction and Communication of Knowledge", and RUC-online. Because of trends in late modern society, traditional ways of acquiring knowledge are no longer efficient. Instead, students should collaboratively work on projects with a high degree of motivation. Competencies like...
Intended for IT managers and asset protection professionals, this work aims to bridge the gap between information security, information systems security and information warfare. It covers topics such as the role of the corporate security officer; corporate cybercrime; electronic commerce and the global marketplace; cryptography; and more.
Tahir Shah, K.
There are an estimated 100,000 genes in the human genome, of which 97% is non-coding. Bacteria, on the other hand, have little or no non-coding DNA. The non-coding region includes introns, ALU sequences, satellite DNA, and other segments not expressed as proteins. Why does it exist? Why has nature kept non-coding DNA over the long evolutionary period if it has no role in the development of complex life forms? Is the complexity of a species somehow correlated with the existence of apparently useless sequences? What kind of capability is encoded within such nucleotide sequences that is a necessary, but not a sufficient, condition for the evolution of complex life forms, keeping in mind the C-value paradox and the omnipresence of non-coding segments in higher eukaryotes and also in many archaea and prokaryotes? The physico-chemical description of biological processes is hardware oriented and does not highlight the algorithmic or information-processing aspect. However, an algorithm without its hardware implementation is as useless as hardware without the capability to run an algorithm. The nature and type of computation an information-processing hardware can perform depends only on its algorithm and the architecture that reflects the algorithm. Given that enormously difficult tasks such as high-fidelity replication, transcription, editing and regulation are all achieved within a long linear sequence, it is natural to think that some parts of a genome are involved in these tasks. If some complex algorithms are encoded within these parts, then it is natural to think that non-coding regions contain information-processing algorithms. A comparison between well-known automatic sequences and sequences constructed out of motifs found in all species proves the point: non-coding regions are a sort of ''hardwired'' program, i.e., they are linear representations of information-processing machines. Thus in our model, a non-coding region, e.g., an intron, contains a program (or equivalently, it is
Barato, Andre C; Hartich, David; Seifert, Udo
We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial. (paper)
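The central bound described in this abstract can be stated compactly. As an informal sketch in notation the abstract itself does not fix (l for the learning rate, i.e. the rate of conditional Shannon entropy reduction, and σ for the thermodynamic entropy production rate):

```latex
l \le \sigma ,
\qquad
\eta \equiv \frac{l}{\sigma} \le 1
```

Here η is the informational efficiency mentioned above: a bacterium in a very slowly changing environment has η ≪ 1, dissipating much more than it learns.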
Lala, J. H.
Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.
Hemmatjo, Rasoul; Motamedzade, Majid; Aliabadi, Mohsen; Kalatpour, Omid; Farhadian, Maryam
Fire service workers often perform multiple duties in emergency conditions, with such duties mostly conducted in various ambient temperatures. The aim of the current study was to assess firefighters' physiological responses, information processing, and working memory prior to and following simulated firefighting activities in three different hot environments. Seventeen healthy male firefighters performed simulated firefighting tasks in three separate conditions, namely (1) low heat (LH; 29-31°C, 55-60% relative humidity), (2) moderate heat (MH; 32-34°C, 55-60% relative humidity), and (3) severe heat (SH; 35-37°C, 55-60% relative humidity). It took about 45-50 minutes for each firefighter to finish all defined firefighting activities and the paced auditory serial addition test (PASAT). At the end of all three experimental conditions, heart rate (HR) and tympanic temperature (TT) increased, while PASAT scores as a measure of information processing performance decreased relative to baseline. HR and TT were significantly higher at the end of the experiment in the SH (159.41 ± 4.25 beats/min; 38.22 ± 0.10°C) compared with the MH (156.59 ± 3.77 beats/min; 38.20 ± 0.10°C) and LH (154.24 ± 4.67 beats/min; 38.17 ± 0.10°C) conditions (p < 0.05). Furthermore, there was a measurable difference in PASAT scores between LH and SH (p < 0.05), indicating that severe heat impairs information processing and working memory during firefighting activity.
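The PASAT used in this study has a simple mechanical scoring rule: the participant must report the sum of the two most recently presented digits. A minimal scoring sketch (the digit sequence and responses below are hypothetical illustrations, not study data):

```python
def pasat_score(digits, responses):
    """Score a PASAT run.

    digits:    the presented single-digit sequence, in order.
    responses: the participant's answers, aligned with digits[1:];
               the k-th expected answer is digits[k] + digits[k+1].
    Returns the number of correct responses.
    """
    expected = [a + b for a, b in zip(digits, digits[1:])]
    return sum(1 for e, r in zip(expected, responses) if e == r)

# Hypothetical run: five digits presented at a fixed pace, four answers given.
digits = [3, 7, 2, 5, 9]
responses = [10, 9, 7, 13]   # correct answers would be 10, 9, 7, 14
print(pasat_score(digits, responses))  # -> 3
```

A lower score after heat exposure, as reported above, would correspond to fewer correct sums at the same presentation pace.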
Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin
The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…
Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.
Christiane Gomes dos Santos
Full Text Available This research deals with the process of search, navigation and retrieval of information by people with blindness in the web environment, focusing on knowledge of the areas of information retrieval and information architecture in order to understand the strategies these people use to access information on the web. It aims to propose the construction of an accessibility verification instrument, a checklist, to be used to analyze the behavior of people with blindness when searching, navigating and retrieving information on sites and pages. It is an exploratory and descriptive study of a qualitative nature; the research methodology is a case study, carried out by simulating search, navigation and information retrieval with the speech synthesis system NonVisual Desktop Access in an assistive technologies laboratory, to substantiate the construction of the checklist for accessibility verification. The reliability of the performed research is considered, along with its importance for the evaluation of accessibility in the web environment, so as to improve access to information for people with limited reading and to be usable in the accessibility analysis of websites and pages.
Kruglanski, Michel; de Donder, Erwin; Messios, Neophytos; Hetey, Laszlo; Calders, Stijn; Evans, Hugh; Daly, Eamonn
SPENVIS is an ESA operational software package developed and maintained at BIRA-IASB since 1996. It provides standardized access to most of the recent models of the hazardous space environment through a user-friendly Web interface (http://www.spenvis.oma.be/). The system allows spacecraft engineers to perform a rapid analysis of environmental problems related to natural radiation belts, solar energetic particles, cosmic rays, plasmas, gases, magnetic fields and micro-particles. Various reporting and graphical utilities and extensive help facilities are included to allow engineers with relatively little familiarity with the models to produce reliable results. SPENVIS also contains an active, integrated version of the ECSS Space Environment Standard and access to in-flight data on the space environment. Although SPENVIS is in the first place designed to help spacecraft designers, it is also used by technical universities in their educational programs. In the framework of the ESA Space Situational Awareness Preparatory Programme, SPENVIS will be part of the initial set of precursor services of the Space Weather segment. SPENVIS includes several engineering models to assess the effects of the space environment on spacecraft, such as surface and internal charging, energy deposition, solar cell damage and SEU rates. The presentation will review how such models could be connected to in situ measurements or forecasting models of the space environment in order to produce post-event analyses or in-orbit effects alerts. The latest developments and models implemented in SPENVIS will also be presented.
Focuses on the information needs of users that are changing as a result of changes in the availability of information content in electronic form. Highlights the trend and nature of the physical form in which information content is currently being made available for users' access and use in electronic information environments. (Author/LRW)
Stewart, L. J.
The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.
Kita, Nobuyuki; Kita, Yasuyo; Yang, Hai-quan
For the safe operation of nuclear power plants, it is important to store various information about the plants for a long period and to visualize that stored information as desired. A system called Environment Server has been developed to realize this. In this paper, the general concept of Environment Server is explained, and its partial implementation for archiving the image information gathered by inspection mobile robots into a virtual world and visualizing it is described. An extension of Environment Server for supporting attention sharing is also briefly introduced. (author)
Елена Витальевна Комелина
Full Text Available The article describes a structural and logical scheme for training school teams that represents step-by-step professional development. It considers the categories of pedagogues, their spheres of action, and models of the informative educational school environment. A programme suite called "Complex of programmes for administration of academic activity", based on the idea of the competency-building approach, has been created, together with training of the employees who carry out its implementation in school activity.
business form in which information is entered by filling in blanks, or circling alternatives. The fields of the form correspond to the various pieces... power. Parallelism, rather than raw speed of the computing elements, seems to be the way that the brain gets such jobs done... all intelligent systems. The purpose of this paper is to characterize the weak methods and to explain how and why they arise in
Farooq, Omar; Nielsen, Christian
...they have more information. Our results also show that intellectual capital disclosures related to employees and strategic statements are the most important disclosures for analysts. Research limitations/implications: More relevant methods, such as surveys or interviews with management, may be used to improve the information content of intellectual capital disclosure. Analysts probably deduce the intellectual capital of a firm from interaction with management rather than from financial statements. Practical implications: Firms in the biotechnology sector can improve their information environment by disclosing more information...
applications in optical disk memory systems [9]. This device is constructed in a glass/SiO2/Si waveguide. The choice of a Si substrate allows for the... (contact mask) were formed in the photoresist deposited on all of the samples; we covered the unwanted gratings on each sample with cover glass slides... processing, let us consider TeO2 (v_s = 620 m/s) as a potential substrate for applications requiring large time delays. This consideration is despite
Heynderickx, D.; Quaghebeur, B.; Evans, H. D. R.
The ESA SPace ENVironment Information System (SPENVIS) provides standardized access to models of the hazardous space environment through a user-friendly WWW interface. The interface includes parameter input with extensive defaulting, definition of user environments, streamlined production of results (both in graphical and textual form), background information, and on-line help. It is available on-line at http://www.spenvis.oma.be/spenvis/. SPENVIS is designed to help spacecraft engineers perform rapid analyses of environmental problems and, with extensive documentation and tutorial information, allows engineers with relatively little familiarity with the models to produce reliable results. It has been developed in response to the increasing pressure for rapid-response tools for system engineering, especially in low-cost commercial and educational programmes. It is very useful in conjunction with radiation effects and electrostatic charging testing in the context of hardness assurance. SPENVIS is based on internationally recognized standard models and methods in many domains. It uses an ESA-developed orbit generator to produce the orbital point files necessary for many different types of problem. It has various reporting and graphical utilities, and extensive help facilities. The SPENVIS radiation module features models of the proton and electron radiation belts, as well as solar energetic particle and cosmic ray models. The particle spectra serve as input to models of ionising dose (SHIELDOSE), Non-Ionising Energy Loss (NIEL), and Single Event Upsets (CREME). Material shielding is taken into account for all these models, either as a set of user-defined shielding thicknesses, or in combination with a sectoring analysis that produces a shielding distribution from a geometric description of the satellite system. A sequence of models, from orbit generator to folding dose curves with a shielding distribution, can be run as one process, which minimizes user interaction and
The reasons for the current widespread arguments between designers of advanced technological systems like, for instance, nuclear power plants and opponents from the general public concerning levels of acceptable risk may be found in incompatible definitions of risk, in differences in risk perception and criteria for acceptance, etc. Of importance may, however, also be the difficulties met in presenting the basis for risk analysis, such as the conceptual system models applied, in an explicit and credible form. Application of modern information technology for the design of control systems and human-machine interfaces together with the trends towards large centralised industrial installations have made it increasingly difficult to establish an acceptable model framework, in particular considering the role of human errors in major system failures and accidents. Different aspects of this problem are discussed in the paper, and areas are identified where research is needed in order to improve not only the safety of advanced systems, but also the basis for their acceptance by the general public. (author)
Reina Estupinan, John-Henry
identification of decoherence-free states in the collective decoherence limit. These states belong to subspaces of the system's Hilbert space that do not become entangled with the environment, making them ideal elements for the engineering of 'noiseless' quantum codes. The relations between decoherence of the quantum register and computational complexity based on the new dynamical results obtained for the register density matrix are also discussed. This thesis concludes by summarising and pointing out future directions, and in particular, by discussing some biological resonant energy transfer processes that may be useful for the processing of information at a quantum level. (author)
Schmidt, Erik Meineche
BRICS is a research centre and international PhD school in theoretical computer science, based at the University of Aarhus, Denmark. The centre has recently become engaged in quantum information processing in cooperation with the Department of Physics, also at the University of Aarhus. This extended abstract surveys activities at BRICS with special emphasis on the activities in quantum information processing.
Bartosz Mackowiak; Mirko Wiederholt
Decision-makers often face limited liability and thus know that their loss will be bounded. We study how limited liability affects the behavior of an agent who chooses how much information to acquire and process in order to take a good decision. We find that an agent facing limited liability processes less information than an agent with unlimited liability. The informational gap between the two agents is larger in bad times than in good times and when information is more costly to process.
Mahoney, John R; Ellison, Christopher J; Crutchfield, James P [Complexity Sciences Center and Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616 (United States)], E-mail: email@example.com, E-mail: firstname.lastname@example.org, E-mail: email@example.com
We give a systematic expansion of the crypticity, a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite cryptic order: the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy, the mutual information between a process's infinite past and infinite future, which is finite and exact for finite-order cryptic processes. (fast track communication)
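In the authors' earlier notation (a sketch; these symbols are not defined in the abstract itself), the crypticity χ is the gap between the statistical complexity C_μ, the state information a predictive model must store, and the excess entropy E, which the abstract defines as the past-future mutual information:

```latex
\chi \equiv C_\mu - \mathbf{E},
\qquad
\mathbf{E} = I\!\left[\,\overleftarrow{X} ; \overrightarrow{X}\,\right]
```

Loosely, a process is k-cryptic when the systematic expansion of χ mentioned above already becomes exact at order k.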
Lutters, Diederick; Wijnker, T.C.; Kals, H.J.J.
A recently proposed reference model indicates the use of structured information as the basis for the control of design and manufacturing processes. The model is used as a basis to describe the integration of design and process planning. A differentiation is made between macro- and micro process
Piccinini, Gualtiero; Scarantino, Andrea
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.
safe from the detrimental effects of noise and losses. In the present work we investigate continuous variables Gaussian quantum information in noisy environments, studying the effects of various noise sources in the cases of a quantum metrological task, an error correction scheme and discord...
opportunities for research into constitutional issues, constitutional development and the relationship... Legal research is a fundamental skill in the legal profession. Although all areas of law do not require... In the print information environment lawyers use standard citation formats, e.g. X v Z 1999.
National Academies Press, 2016
Chemistry plays a critical role in daily life, impacting areas such as medicine and health, consumer products, energy production, the ecosystem, and many other areas. Communicating about chemistry in informal environments has the potential to raise public interest and understanding of chemistry around the world. However, the chemistry community…
DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.
McDaniels, T.; Steyn, D. G.; Johnson, M. S.; Small, M.; Leclerc, G.; Vignola, R.; Chan, K.; Grossmann, I.; Wong-Parodi, G.
Improving resilience to drought in complex social-environmental systems (SES) is extraordinarily important, particularly for rural tropical locations where small changes in climate regimes can have dramatic SES impacts. Efforts to build drought resilience must necessarily be planned and implemented within SES governance systems that involve linkages in water and land use administration from local to national levels. These efforts require knowledge and understanding that links climate and weather forecasts to regional and local hydrology, to social-economic and environmental systems, and to governance processes. In order to provide structure for such complex choices and investments, we argue that a focus on structured decision processes that involve linkages among science, technological perspectives, and public values conducted with agencies and stakeholders will provide a crucial framework for comparing and building insight for pursuing alternative courses of action to build drought resilience. This paper focuses on a regional case study in the seasonally-dry northwest region of Costa Rica, in watersheds rated as most threatened in the country in terms of drought. We present the overall framework guiding the transdisciplinary efforts to link scientific and technical understanding to public values, in order to foster civil society actions that lead to improved drought resilience. Initial efforts to characterize hydrological and climate regimes will be reported along with our approach to linking natural science findings, social inventories in terms of perspectives on SES, and the psychology and patterns of reliance on forecast information that provide the basis for characterizing public understanding. The overall linkage of technical and value information is focused on creating and comparing alternative actions that can potentially build resilience in short and long time frames by building decision making processes involving stakeholders, agencies and interested
I consider the interaction of a small quantum system (a qubit) with a structured environment consisting of many levels. The qubit will experience a decoherence process, which implies that part of its initial information will be encoded into correlations between system and environment. I investigate how this information is distributed on a given subset of levels as a function of its size, using the mutual information between both entities, in the spirit of the partial-information plots studied by Zurek and co-workers. In this case we can observe some differences, which arise from the fact that I am partitioning just one quantum system and not a collection of them. However, some similar features, like redundancy (in the sense that a given amount of information is shared by many subsets), which increases with the size of the environment, are also found here.
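The partial-information plots referred to here track the mutual information between the system S and an environment fragment F as the fragment grows. As a sketch in standard notation (symbols mine, not the paper's):

```latex
I(S\!:\!F) = H(S) + H(F) - H(S,F)
```

Redundancy in the sense described above then appears when many distinct fragments F of modest size already carry nearly the same I(S:F) about the qubit.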
In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three main basic cognitive processes contribute to realize this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the "single voice" of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever is focused by attention appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experiences, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention.
Arnfred, Sidse M H
… Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event-related potentials. Event-related EEG can be analyzed as conventional time … of the left somatosensory cortex, and it was suggested to be in accordance with two theories of schizophrenic information processing: the theory of deficiency of corollary discharge and the theory of weakening of the influence of past regularities. No gating deficiency was observed, and the imprecision … and amplitude attenuation was not a general phenomenon across the entire brain response. Summing up, in support of Rado's hypothesis, schizophrenia spectrum patients demonstrated abnormalities in proprioceptive information processing. Future work needs to extend the findings in larger un-medicated, non- …
Mentink, R.J.; van Houten, Frederikus J.A.M.; Kals, H.J.J.
The research presented in this paper proposes a concept for dynamic process management as part of an integrated approach to engineering process support. The theory of information management is the starting point for the development of a process management system based on evolution of information
Scott, Mark R.; Michel, Kelly D.
For two decades, the IAEA has recognized the need for a comprehensive and strongly integrated Knowledge Management system to support its Information Driven Safeguards activities. In the past, plans for the development of such a system have progressed slowly due to concerns over costs and feasibility. In recent years, Los Alamos National Laboratory has developed a knowledge management system that could serve as the basis for an IAEA Collaborative Environment (ICE). The ICE derivative knowledge management system described in this paper addresses the challenge of living in an era of information overload coupled with certain knowledge shortfalls. The paper describes and defines a system that is flexible, yet ensures coordinated and focused collaboration, broad data evaluation capabilities, architected and organized work flows, and improved communications. The paper and demonstration of ICE will utilize a hypothetical scenario to highlight the functional features that facilitate collaboration amongst and between information analysts and inspectors. The scenario will place these two groups into a simulated planning exercise for a safeguards inspection drawing upon past data acquisitions, inspection reports, analyst conclusions, and a coordinated walk-through of a 3-D model of the facility. Subsequent to the conduct of the simulated facility inspection, the detection of an anomaly and pursuit of follow up activities will illustrate the event notification, information sharing, and collaborative capabilities of the system. The use of a collaborative environment such as ICE to fulfill the complicated knowledge management demands of the Agency and facilitate the completion of annual State Evaluation Reports will also be addressed.
Cain, Jeff; Policastri, Anne
To create, implement, and assess the effectiveness of an optional Facebook activity intended to expose students to contemporary business issues not covered in the core content of a pharmacy management and leadership course and to perspectives of experts and thought leaders external to their university. An informal learning strategy was used to create a Facebook group page and guest experts were identified and invited to submit posts pertaining to business-related topics. Students were given instructions for joining the Facebook group but informed that participation was optional. A mixed-methods approach using a student questionnaire, results on examination questions, and a student focus group was used to assess this activity. The informal design with no posting guidelines and no participation requirement was well received by students, who appreciated the unique learning environment and exposure to external experts. Facebook provides an informal learning environment for presenting contemporary topics and the thoughts of guest experts not affiliated with a college or school, thereby exposing students to relevant "real world" issues.
The ultimate goal of the classicality program is to quantify the amount of quantumness of certain processes. Here, classicality is studied for a restricted type of process: quantum information processing (QIP). Under special conditions, one can force some qubits of a quantum computer into a classical state without affecting the outcome of the computation. The minimal set of conditions is described and its structure is studied. Some implications of this formalism are increased noise robustness, a proof of the quantumness of mixed-state quantum computing, and a step forward in understanding the very foundations of QIP.
Baker, K. S.; Pennington, D. D.
Information infrastructure that supports collaborative science is a complex system of people, organizational arrangements, and tools that require co-management. Contemporary studies are exploring how to establish and characterize effective collaborative information environments. Collaboration depends on the flow of information across the human and technical system components through mechanisms that create linkages, both conceptual and technical. This transcends the need for requirements solicitation and usability studies, highlighting synergistic interactions between humans and technology that can lead to emergence of group level cognitive properties. We consider the ramifications of placing priority on establishing new metaphors and new types of learning environments located near-to-data-origin for the field sciences. In addition to changes in terms of participant engagement, there are implications in terms of innovative contributions to the design of information systems and data exchange. While data integration occurs in the minds of individual participants, it may be facilitated by collaborative thinking and community infrastructure. Existing learning frameworks - from Maslow’s hierarchy of needs to organizational learning - require modification and extension if effective approaches to decentralized information management and systems design are to emerge. Case studies relating to data integration include ecological community projects: development of cross-disciplinary conceptual maps and of a community unit registry.
Mandic, D.; Barbic, B.; Linke, B.; Colak, I.
The original NEK design used several Process Computer Systems (PCS) for both process control and process supervision. The PCS were built by different manufacturers on different hardware and software platforms. Operational experience and new regulatory requirements imposed new technical and functional requirements on the PCS, such as: acquisition of new signals from the technological processes and environment; implementation of new application programs; significant improvement of the MMI (Man-Machine Interface); transfer of process data to locations other than the Main Control Room (MCR); and process data archiving with the capability to retrieve the same data for future analysis. These requirements could not be implemented within the old systems, so NEK decided to build a new Process Information System (PIS). During design and construction of PIS Project Phase I, in addition to the main foreign contractor, local architect-engineering and construction companies participated significantly. This paper presents the experience of NEK and its local partners. (author)
40 CFR § 68.65 Process safety information (Protection of Environment, revised as of 2010-07-01), Chemical Accident Prevention Provisions, Program 3 Prevention Program: … compilation of written process safety information before conducting any process hazard analysis required by …
Wickens, Christopher D.; Flach, John M.
Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).
Seelen, Werner v
In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of neural information processing as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely to organize behavior so as to ensure survival, and an understanding of the evolutionary genesis of the brain. The principles and strategies developed include the self-organization of neural systems, flexibility, the active interpretation of the world by means of construction and prediction, and the embedding of neural systems in the world, all of which form the framework of the presented description. Since, in brains, partial self-organization, lifelong adaptation and the use of various methods of processing incoming information are all interconnected, the authors have chosen not only neurobiology and evolution theory as a basis for the elaboration of such a framework, but also syst…
Pitts, Felix L.
The Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10^-9 at 10 hours. A background and description are given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.
As a planning activity, the objectives of the workshop were to list, prioritize and milestone the activities necessary to understand, interpret and control the mechanical behavior of candidate fusion reactor alloys. Emphasis was placed on flow and fracture processes which are unique to the fusion environment since the national fusion materials program must evaluate these effects without assistance from other reactor programs
Carolan, Fergal; Kyppö, Anna
This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students' acquisition of writing skills and the teacher's support practices in a digital writing environment. It presents writers' experiences related to various stages of process…
Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers as well as potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.
Urquhart, Christine; Tbaishat, Dina; Yeoman, Alison
This book adopts a holistic interpretation of information architecture, to offer a variety of methods, tools, and techniques that may be used when designing websites and information systems that support workflows and what people require when 'managing information'.
Shin, Hyun Kook; Park, Jeong Seok; Baek, Seung Min; Kim, Young Jin; Joo, Jae Yoon; Lee, Sang Mok; Jeong, Young Woo; Seo, Ho Jun; Kim, Do Youn; Lee, Tae Hoon
The Operational Information Processing Platform (OIPP) is a platform system designed to provide development and operation environments for plant operation and plant monitoring. It is based on the Plant Computer Systems (PCS) of the Yonggwang 3 and 4, Ulchin 3 and 4, and Yonggwang 5 and 6 Nuclear Power Plants (NPP). A UNIX-based workstation, real-time kernel and graphics design tool were selected and installed after reviewing the functions of the PCS. In order to construct a development environment for an open system architecture and a distributed computer system, an open computer system architecture was adopted in both hardware and software. For verification of the system design and evaluation of technical methodologies, the PCS running under the OIPP is being designed and implemented. In this system, the man-machine interface and system functions are being designed and implemented to evaluate the differences between the UCN 3 and 4 PCS and the OIPP. 15 tabs., 32 figs., 11 refs. (Author)
Artem D. Beresnev
Subject of research. An information infrastructure for the training environment, applying virtual-computer technology to small pedagogical systems (separate classes, author's courses), is created and investigated. Research technique. A life-cycle model of the information infrastructure for small pedagogical systems using virtual computers is constructed in the ARIS methodology. A technique for forming the information infrastructure with virtual computers on the basis of the process approach is offered. The model of an event chain, combined with the environment chart, is used as the basic model. For each function of the event chain the necessary set of information and program support means is defined. Application of the technique is illustrated on the example of information infrastructure design for the educational environment, taking into account the specific character of small pedagogical systems. Advantages of the designed information infrastructure are: maximum usage of open or free components; usage of standard protocols (mainly HTTP and HTTPS); maximum portability (application servers can be started on any widespread operating system); a uniform interface for management of various virtualization platforms; the possibility of inventorying the contents of a virtual computer without starting it; and flexible inventory management of the virtual computer by means of adjustable chains of rules. Approbation. The obtained results were tested at the training center "Institute of Informatics and Computer Facilities" (Tallinn, Estonia). Application of the technique within the course "Computer and Software Usage" halved the number of failures of information-infrastructure components requiring the intervention of a technical specialist, as well as the time needed to eliminate such malfunctions. In addition, pupils who gained broader experience with computers and software showed better results.
Chung, Chih-Hung; Angnakoon, Putthachat; Li, Jessica; Allen, Jeff
Purpose: The purpose of this study is to provide researchers with a better understanding of the cultural impact on information processing in virtual learning environment. Design/methodology/approach: This study uses a causal loop diagram to depict the cultural impact on information processing in the virtual human resource development (VHRD)…
Jelena Anđelković Labrović
Personal learning environments are a widespread way of learning, especially in the informal learning process. The aim of this research is to identify the elements of students' personal learning environments and the extent to which students use modern technology for learning as part of their non-formal learning. A mapping system was used for gathering data, and an analysis of percentages and frequency counts in SPSS was used for data analysis. The results show that students' personal learning environments include the following elements in 75% of all cases: Wikipedia, Google, YouTube and Facebook; an interesting fact is that all of them belong to the group of Web 2.0 tools and applications.
Natália Chaves Lessa Schots
Background: Process performance analysis is a key step in implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be given appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided the basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to accommodate the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.
Quantum Information Processing (QIP) is expected to bring revolutionary enhancement to various technological areas. However, today's QIP applications are far from being practical. The problem involves both hardware issues, i.e., quantum devices are imperfect, and software issues, i.e., the functionality of some QIP applications is not fully understood. Aiming to improve the practicality of QIP, in my PhD research I have studied various topics in quantum cryptography and ion trap quantum computation. In quantum cryptography, I first studied the security of position-based quantum cryptography (PBQC). I discovered a wrong assumption in the previous literature that the cheaters are not allowed to share entangled resources. I proposed entanglement attacks that could cheat all known PBQC protocols. I also studied the practicality of continuous-variable (CV) quantum secret sharing (QSS). While the security of CV QSS was considered by the literature only in the limit of infinite squeezing, I found that finitely squeezed CV resources could also provide finite secret sharing rate. Our work relaxes the stringent resources requirement of implementing QSS. In ion trap quantum computation, I studied the phase error of quantum information induced by dc Stark effect during ion transportation. I found an optimized ion trajectory for which the phase error is the minimum. I also defined a threshold speed, above which ion transportation would induce significant error. In addition, I proposed a new application for ion trap systems as universal bosonic simulators (UBS). I introduced two architectures, and discussed their respective strength and weakness. I illustrated the implementations of bosonic state initialization, transformation, and measurement by applying radiation fields or by varying the trap potential. When comparing with conducting optical experiments, the ion trap UBS is advantageous in higher state initialization efficiency and higher measurement accuracy. Finally, I
Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa
The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.
Jackson, Russell E; Calvillo, Dusti P
Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component to evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.
Haeffner, H.; Haensel, W.; Rapol, U.; Koerber, T.; Benhelm, J.; Riebe, M.; Chek-al-Kar, D.; Schmidt-Kaler, F.; Becher, C.; Roos, C.; Blatt, R.
Single Ca+ ions and crystals of Ca+ ions are confined in a linear Paul trap and are investigated for quantum information processing. Here we report on recent experimental advancements towards a quantum computer with such a system. Laser-cooled trapped ions are ideally suited systems for the investigation and implementation of quantum information processing, as one can gain almost complete control over their internal and external degrees of freedom. The combination of a Paul-type ion trap with laser cooling leads to unique properties of trapped cold ions, such as control of the motional state down to the zero-point of the trapping potential, a high degree of isolation from the environment and thus a very long time available for manipulations and interactions at the quantum level. The very same properties make single trapped atoms and ions well suited for storing quantum information in long-lived internal states, e.g. by encoding a quantum bit (qubit) of information within the coherent superposition of the S1/2 ground state and the metastable D5/2 excited state of Ca+. Recently we have achieved the implementation of simple algorithms with up to 3 qubits on an ion-trap quantum computer. We will report on methods to implement single-qubit rotations, the realization of a two-qubit universal quantum gate (the Cirac-Zoller CNOT gate), the deterministic generation of multi-particle entangled states (GHZ and W states), their full tomographic reconstruction, the realization of deterministic quantum teleportation, its quantum process tomography, and the encoding of quantum information in decoherence-free subspaces with coherence times exceeding 20 seconds. (author)
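At the logical level, the gates listed above compose in the usual circuit way. The following numpy sketch is purely illustrative (it is the abstract circuit, not the physical laser-pulse sequence applied to the ions): it builds a 3-qubit GHZ state from one Hadamard and two CNOT gates.

```python
import numpy as np

# Single-qubit and two-qubit gates as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def kron(*ops):
    """Tensor product of several operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

psi = np.zeros(8)
psi[0] = 1.0                      # start in |000>
psi = kron(H, I2, I2) @ psi       # Hadamard on qubit 0: (|000>+|100>)/sqrt(2)
psi = kron(CNOT, I2) @ psi        # CNOT(0 -> 1)
psi = kron(I2, CNOT) @ psi        # CNOT(1 -> 2)
# psi now has amplitude 1/sqrt(2) on |000> and |111>: a GHZ state.
```

The same construction extends to the W states and teleportation circuits mentioned in the abstract, but the experimental realization replaces each matrix by a calibrated laser pulse on the ion string.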
Maggini, Marco; Jain, Lakhmi
This handbook presents some of the most recent topics in neural information processing, covering both theoretical concepts and practical applications. The contributions include: Deep architectures Recurrent, recursive, and graph neural networks Cellular neural networks Bayesian networks Approximation capabilities of neural networks Semi-supervised learning Statistical relational learning Kernel methods for structured data Multiple classifier systems Self organisation and modal learning Applications to ...
Timucin, Dogan Aslan
Low computational accuracy is an important obstacle for optical processors which blocks their way to becoming a practical reality and a serious challenger for classical computing paradigms. This research presents a comprehensive solution approach to the problem of accuracy enhancement in discrete analog optical information processing systems. Statistical analysis of a generic three-plane optical processor is carried out first, taking into account the effects of diffraction, interchannel crosstalk, and background radiation. Noise sources included in the analysis are photon, excitation, and emission fluctuations in the source array, transmission and polarization fluctuations in the modulator, and photoelectron, gain, dark, shot, and thermal noise in the detector array. Means and mutual coherence and probability density functions are derived for both optical and electrical output signals. Next, statistical models for a number of popular optoelectronic devices are studied. Specific devices considered here are light-emitting and laser diode sources, an ideal noiseless modulator and a Gaussian random-amplitude-transmittance modulator, p-i-n and avalanche photodiode detectors followed by electronic postprocessing, and ideal free-space geometrical -optics propagation and single-lens imaging systems. Output signal statistics are determined for various interesting device combinations by inserting these models into the general formalism. Finally, based on these special-case output statistics, results on accuracy limitations and enhancement in optical processors are presented. Here, starting with the formulation of the accuracy enhancement problem as (1) an optimal detection problem and (2) as a parameter estimation problem, the potential accuracy improvements achievable via the classical multiple-hypothesis -testing and maximum likelihood and Bayesian parameter estimation methods are demonstrated. Merits of using proper normalizing transforms which can potentially stabilize
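The two estimation strategies named at the end can be illustrated on a toy model; all parameters below are hypothetical, not taken from the dissertation. A discrete set of admissible analog levels is read out through additive Gaussian noise, and the level is recovered either by maximum-likelihood estimation over repeated reads or by a minimum-distance (multiple-hypothesis) decision on a single read.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete analog optical channel: 8 admissible levels,
# read out with additive Gaussian noise (illustrative parameters).
levels = np.linspace(0.0, 1.0, 8)
sigma = 0.02
sent = levels[5]
reads = sent + sigma * rng.standard_normal(1000)

# (1) Maximum-likelihood estimation: for i.i.d. Gaussian noise the ML
# estimate of the level is the sample mean; its standard deviation
# shrinks as sigma / sqrt(N).
ml_estimate = reads.mean()

# (2) Multiple-hypothesis testing: with equal priors and Gaussian noise,
# the optimal decision for a single read is the nearest admissible level.
decided = levels[np.argmin(np.abs(levels - reads[0]))]
```

Both routes trade resources for accuracy: averaging costs repeated reads, while the hypothesis-testing decision relies on the level spacing being large relative to the noise.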
Introduction: This paper examines business external environment scanning for information in the context of Greece. Method: A questionnaire was developed to explore the relationships between the general and task business environment, perceived uncertainty, scanning strategy, and sources of information with respect to type of environment,…
Jayakumar, C.; Narayanan, A.
The advent of Internet technology and its adoption by organisations have resulted in the evolution of intranets. Intranets ultimately use the technology to meet the information and computational needs required to achieve organisational objectives and goals. Important services like e-mail and the Web are a handy solution for disseminating information in research and special libraries. Campus-wide networks and the creation of a networked society are ubiquitous, and an attempt has been made to extend information services to patrons by all possible means. The role of library and information professionals in the dissemination of information to the networked society is relevant and highly demanding. The right information for the right people at the right time is to be achieved with the available infrastructure. A few sample applications are described in this paper; the information needs of present and future networked information users have to be met. (author)
The material of the Jadwisin '93 seminar is a collection of 19 articles discussing aspects of nuclear energy and the natural environment. The lectures were presented in six sessions: 1) nuclear energy applications in medicine, agriculture, industry, food preservation and protection of the environment; 2) nuclear power in the world; 3) public attitudes towards different energy options, the example of Sweden; 4) nuclear power in neighbouring countries; 5) radiation and human health; 6) radioactive waste management and potential serious radiological hazards. The general conclusion of the seminar is as follows: in some cases nuclear power is a source of environmental pollution, but nuclear techniques are now often used, and will certainly be used more often in the future, for the protection of the environment and human health.
Neal, P A
Realization of the unique potential of a health maintenance organization is dependent on the availability of adequate, accurate, and timely information. The particular data needed are determined by the structure of the organization; the physician compensation plans; requirements for state, federal, or other reporting; and many other factors. The author introduces the concept and objectives of the HMO, and presents the management information systems necessary for planning and monitoring HMO performance: patient information, utilization information, and management information for the staff and nonstaff HMO.
Andersen, Ulrik Lund; Leuchs, G.; Silberhorn, C.
the continuous degree of freedom of a quantum system for encoding, processing or detecting information, one enters the field of continuous-variable (CV) quantum information processing. In this paper we review the basic principles of CV quantum information processing with main focus on recent developments in the field. We will be addressing the three main stages of a quantum information system: the preparation stage, where quantum information is encoded into CVs of coherent states and single-photon states; the processing stage, where CV information is manipulated to carry out a specified protocol; and a detection stage, where CV information is measured using homodyne detection or photon counting.
Infochemistry: Information Processing at the Nanoscale defines a new field of science and describes the processes, systems and devices at the interface between chemistry and information sciences. The book is devoted to the application of molecular species and nanostructures to advanced information processing. It includes the design and synthesis of suitable materials and nanostructures, their characterization, and finally applications of molecular species and nanostructures for information storage and processing purposes. Divided into twelve chapters, the book opens with three chapters that serve as an int
Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.
The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…
Reviewed by Yasin OZARSLAN
Collaboration in virtual learning environments brings meaningful learning interactions between learners in virtual environments. This book collects case studies of collaborative virtual learning environments, focusing on the nature of human interactions in virtual spaces and defining the types and qualities of learning processes in these spaces from the perspectives of learners, teachers, designers, and professional and academic developers in various disciplines, learning communities and universities from around the world. It addresses research cases on experiences, implementations, and applications of virtual learning environments. The book's broader audience is anyone interested in areas such as collaborative virtual learning environments, interactive technologies and virtual communities, social interaction and social competence, and distance education and collaborative learning. The book is edited by Donna Russell, an Assistant Professor at the University of Missouri-Kansas City and co-owner of Arete' Consulting, LLC. It consists of 358 pages covering 19 articles and provides context for the characteristics and implications of the varied virtual learning environments. Topics covered in this book are argumentative interactions and learning, collaborative learning and work in digital libraries, collaborative virtual learning environments, digital communities to enhance retention, distance education, interactive technologies and virtual communities, massively multi-user virtual environments, online graduate communities, online training programs, social interaction and social competence, and virtual story-worlds.
Hölscher, Christian; Munk, Matthias
…simultaneously recorded spike trains (Mark Laubach, Nandakumar S. Narayanan, and Eyal Y. Kimchi); Part III: Neuronal population information coding and plasticity in specific brain areas…
Majali, A.B.; Sabharwal, S.; Deshpande, R.S.; Sarma, K.S.S.; Bhardwaj, Y.K.; Dhanawade, B.R.
The increasing population and industrialization the world over are placing escalating demands for the development of newer technologies that are environment-friendly and minimize the pollution associated with development. Radiation technology can be of benefit in reducing the pollution levels associated with many processes. The sulphur vulcanization method for natural rubber latex results in the formation of considerable amounts of nitrosoamines, both in the product and in the factory environment. Radiation vulcanization of natural rubber latex has emerged as a commercially viable alternative for producing sulphur- and nitrosoamine-free rubber. A Co-60 γ-radiation based pilot plant has been functioning since April 1993 to produce radiation-vulcanized natural rubber latex (RVNRL) using acrylate monomers as sensitizer. The role of the sensitizer, viz. n-butyl acrylate, in the vulcanization process has been elucidated using the pulse radiolysis technique. The emission of toxic sulphur-containing gases is an inevitable part of the viscose-rayon process, and this industry is in search of ways to reduce the associated pollution levels. The irradiation of cellulose results in cellulose activation and a reduction in the degree of polymerization (DP). These effects can reduce the solvents required to dissolve the paper pulp, so there is keen interest in utilizing radiation technology in viscose-rayon production. We have utilized an electron beam (EB) accelerator for reducing the DP of paper pulp. Laboratory-scale tests have been carried out to standardize the conditions for production of pulp having the desired DP by EB irradiation. The use of irradiated paper pulp can result in a ∼40% reduction in the consumption of CS2 in the process, which can be beneficial in reducing the associated pollution. PTFE waste can be recycled into a low-molecular-weight microfine powder by irradiation. An EB-based process has been standardized to produce
Kirby, John R.; Das, J. P.
The simultaneous and successive processing model of cognitive abilities was compared to a traditional primary mental abilities model. Simultaneous processing was found to be primarily related to spatial ability and, to a lesser extent, to memory and inductive reasoning. Subjects were 104 fourth-grade urban males. (Author/GD)
Alessandria, V.; Rantsiou, K.; Cavallero, M. C.; Riva, S.; Cocolin, L.
Microbial contamination in food processing plants can play a fundamental role in food quality and safety. Describing the microbial consortia in the meat processing environment is important, since it is a first step in understanding possible routes of product contamination, and it may contribute to the development of sanitation programs for effective pathogen removal. The purpose of this study was to characterize the microbiota in the environment of meat processing plants: the microbiota of three different meat plants was studied by both traditional and molecular methods (PCR-DGGE) in two different periods. Different levels of contamination emerged between the three plants, as well as between the two sampling periods. Conventional methods of killing free-living bacteria through antimicrobial agents and disinfection are often ineffective against bacteria within a biofilm. The use of gas-discharge plasmas can potentially offer a good alternative to conventional sterilization methods. A further purpose of this study was therefore to measure the effectiveness of Atmospheric Pressure Plasma (APP) surface treatments against bacteria in biofilms. Biofilms produced by three different L. monocytogenes strains on stainless steel surfaces were subjected to three different conditions (power, exposure time) of APP. Our results showed that most of the culturable cells are inactivated after plasma exposure, but RNA analysis by qPCR highlighted the entrance of cells into the viable but non-culturable (VBNC) state, confirming the hypothesis that cells are damaged after plasma treatment but, at first, remain alive. Understanding the effects of APP on L. monocytogenes biofilms can improve the development of sanitation programs that use APP for effective pathogen removal.
Harris, R. L., Sr.
The scanning behavior of pilots must be understood so that cockpit displays can be assembled which will provide the most information accurately and quickly to the pilot. The results of seven years of collecting and analyzing pilot scanning data are summarized. The data indicate that pilot scanning behavior is: (1) subconscious; (2) situation dependent; and (3) can be disrupted if pilots are forced to make conscious decisions. Testing techniques and scanning analysis techniques have been developed that are sensitive to pilot workload.
The retina translates light into neuronal activity. Thus, it renders visual information of the external environment. The retina can only send a limited amount of information to the brain within a given period. To use this amount optimally, light stimuli are strongly processed in the retina. This
…Activities: Application To Use the Automated Commercial Environment (ACE). AGENCY: U.S. Customs and Border… Application to Use the Automated Commercial Environment (ACE). This is a proposed extension of an information… The Automated Commercial Environment (ACE) is a trade processing system that will eventually replace the Automated Commercial System…
…13.6%; looking for contracts amounted to 9.0%; advertising services came to… of women in small-scale business in Botswana, and the identified information… from ward and councilors, and 224 (95.31%) said no; (4.26%) use social networks.
Farooq, Omar; Taouss, Mohammed
Can the information environment of a firm explain home bias in analysts' recommendations? Can the extent of agency problems explain the optimism difference between foreign and local analysts? This paper answers these questions by documenting the effect of the information environment on home bias in analysts'…
In this presentation the author presents the structure, legislative basis and conception of the Information System of Authorities of Environment (ISAE) in the Slovak Republic. The ISAE is a component part of the information system of the Slovak Environmental Agency and the Ministry of Environment of the Slovak Republic. The use of new technologies is discussed.
Елена Вадимовна Журавлёва
The efficiency of pedagogical software is considered in terms of how correctly the communication and information environment is organized. A set of pedagogical conditions is adduced that the communication and information environment should satisfy. The main directions of analysis (didactic, psychological, ergonomic) are determined, and the choice of methods for their diagnostics is grounded.
Blume-Kohout, Robin; Zurek, Wojciech H.
As quantum information science approaches the goal of constructing quantum computers, understanding loss of information through decoherence becomes increasingly important. The information about a system that can be obtained from its environment can facilitate quantum control and error correction. Moreover, observers gain most of their information indirectly, by monitoring (primarily photon) environments of the "objects of interest." Exactly how this information is inscribed in the environment is essential for the emergence of "the classical" from the quantum substrate. In this paper, we examine how many-qubit (or many-spin) environments can store information about a single system. The information lost to the environment can be stored redundantly, or it can be encoded in entangled modes of the environment. We go on to show that randomly chosen states of the environment almost always encode the information so that an observer must capture a majority of the environment to deduce the system's state. Conversely, in the states produced by a typical decoherence process, information about a particular observable of the system is stored redundantly. This selective proliferation of "the fittest information" (known as Quantum Darwinism) plays a key role in choosing the preferred, effectively classical observables of macroscopic systems. The developing appreciation that the environment functions not just as a garbage dump, but as a communication channel, is extending our understanding of the environment's role in the quantum-classical transition beyond the traditional paradigm of decoherence.
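The redundancy described above can be illustrated with a toy numerical model (my own sketch, not taken from the paper): a single system qubit decoheres into an N-qubit environment through idealized CNOT-like interactions, producing a branching state, and the mutual information I(S:F) between the system S and environment fragments F of increasing size is computed. The plateau at H(p) for partial fragments, jumping to 2H(p) only when the whole environment is captured, is the redundancy signature of Quantum Darwinism. The branching model and all parameter values are illustrative assumptions.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def reduced(state, keep, n):
    """Reduced density matrix of an n-qubit pure state over the qubits in `keep`."""
    keep = sorted(keep)
    traced = [q for q in range(n) if q not in keep]
    psi = state.reshape([2] * n)
    psi = np.transpose(psi, keep + traced)       # kept qubits first
    psi = psi.reshape(2 ** len(keep), 2 ** len(traced))
    return psi @ psi.conj().T                    # trace out the rest

# System qubit 0 in a superposition; N environment qubits.
N = 4
a, b = np.sqrt(0.3), np.sqrt(0.7)
# Idealized decoherence (a CNOT from the system onto each environment
# qubit) produces the branching state a|0>|0...0> + b|1>|1...1>.
state = np.zeros(2 ** (N + 1), dtype=complex)
state[0] = a
state[-1] = b

rho_S = reduced(state, [0], N + 1)
for f in range(1, N + 1):
    frag = list(range(1, 1 + f))                 # first f environment qubits
    I = (entropy(rho_S) + entropy(reduced(state, frag, N + 1))
         - entropy(reduced(state, [0] + frag, N + 1)))
    print(f"fragment size {f}: I(S:F) = {I:.3f} bits")
```

For any partial fragment the mutual information sits at the classical plateau H(0.3) ≈ 0.881 bits; only the full environment yields 2H(0.3), the extra quantum information.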
Calfee, Robert C.
"This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)
Hart, Eric W.
The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…
S.S. Ficco (Stefano)
Economic agents generally operate in uncertain environments and, prior to making decisions, invest time and resources to collect useful information. Consumers compare the prices charged by different firms before purchasing a product. Politicians gather information from different
This paper presents how a Geographical Information System (GIS) can be incorporated in an intelligent learning software system for environmental matters. The system is called ALGIS and incorporates the GIS in order to present effectively information about the physical and anthropogenic environment of Greece in a more interactive way. The system…
In the dynamic and interactive academic learning environment, students are required to have qualified information literacy competencies while critically reviewing print and electronic information. However, many undergraduates encounter difficulties in searching peer-reviewed information resources. Scholarly Information Discovery in the Networked Academic Learning Environment is a practical guide for students determined to improve their academic performance and career development in the digital age. Also written with academic instructors and librarians in mind who need to show their students how to access and search academic information resources and services, the book serves as a reference to promote information literacy instructions. This title consists of four parts, with chapters on the search for online and printed information via current academic information resources and services: part one examines understanding information and information literacy; part two looks at academic information delivery in the...
Horton, Rebecca; Carroll, Malcolm S.; Tarman, Thomas David
Qubits have been demonstrated using GaAs double quantum dots (DQD), with the singlet and triplet stationary states as the qubit basis states. Long spin decoherence times in silicon spur translation of the GaAs qubit into silicon. In the near term the goals are: (1) develop surface-gate enhancement-mode double quantum dots (MOS and strained-Si/SiGe) to demonstrate few electrons and spin read-out, and examine impurity-doped quantum dots as an alternative architecture; (2) use mobility, C-V, ESR, quantum dot performance and modeling as feedback to improve processing, including development of atomic-precision fabrication at SNL; (3) examine integrated-electronics approaches to the RF-SET; (4) use combinations of numerical packages for multi-scale simulation of quantum dot systems (NEMO3D, EMT, TCAD, SPICE); and (5) continue micro-architecture evaluation for different device and transport architectures.
Rice is the staple food for nearly two-thirds of the world's population. The food components and environmental load of rice depend on the rice form resulting from different processing conditions. Brown rice (BR), germinated brown rice (GBR) and partially-milled rice (PMR) contain more health-beneficial food components than well-milled rice (WMR). Although the arsenic concentration in cooked rice depends on the cooking method, parboiled rice (PBR) seems relatively prone to arsenic contamination compared with untreated rice if contaminated water is used for parboiling and cooking. A change in consumption patterns from PBR to untreated (non-parboiled) rice, and from WMR to PMR or BR, may conserve about 43–54 million tons of rice and reduce the risk from arsenic contamination in arsenic-prone areas. This study also reveals that a change in rice consumption patterns not only supplies more food components but also reduces environmental loads. A switch in production and consumption patterns would improve food security where food grains are scarce, provide more health-beneficial food components, may prevent some diseases, and would ease the burden on the Earth. However, motivation and awareness of the environment and health, and even a nominal incentive, may be required for such a method switch, which may help in building a sustainable society.
The essays in this dissertation study information disclosure and environmental policy. The first chapter challenges the longstanding result that firms will, in general, voluntarily disclose information about product quality, in light of the unrealism of the assumption, common to much of the literature, that consumers are identical. When this assumption is relaxed, an efficiency-enhancing role may emerge for disclosure regulation, insofar as it can improve information provision and thus help protect consumers with "moderately atypical" preferences. The paper also endogenizes firms' choice of quality and suggests that disclosure regulation may also raise welfare indirectly, by inducing firms to improve product quality. The second chapter explores the significance of policy-induced technological change (ITC) for the design of carbon-abatement policies. The paper considers both R&D-based and learning-by-doing-based knowledge accumulation, examining each specification under both a cost-effectiveness and a benefit-cost policy criterion. We show analytically that the presence of ITC generally implies a lower profile of optimal carbon taxes, a shifting of abatement effort into the future (in the R&D scenarios), and an increase in the scale of abatement (in the benefit-cost scenarios). Numerical simulations indicate that the impact of ITC on abatement timing is very slight, but the effects on costs, optimal carbon taxes, and cumulative abatement can be large. The third chapter uses a World Bank dataset on Chinese state-owned enterprises to estimate price elasticities of industrial coal demand. A simple coal-demand equation is estimated in many forms, and significant price sensitivity is almost always found: the own-price elasticity is estimated to be roughly -0.5. A cost-function/share-equation system is also estimated, and although the function is frequently ill-behaved, indicating that firms may not be minimizing costs, the elasticity estimates again are large and
Jaeschke, A.; Keller, H.; Orth, H.
On the production management level, a process information system in a nuclear reprocessing plant (NRP) has to fulfill conventional operating functions and functions for nuclear material surveillance (safeguards). Based on today's state of the art in on-line process control, progress in hardware and software technology makes it possible to introduce more process-specific intelligence into process information systems. Using the example of an expert-system-aided laboratory management system as a component of an NRP process information system, the paper demonstrates that these technologies can already be applied. (DG)
Nielsen, Jørgen Lerche; Meyer, Kirsten
Do the knowledge sharing and creation processes in collaborating groups benefit from the use of new information environments, or are the environments rather inhibitive to the development of these processes? A number of different studies have shown quite varied results when it comes to appraising the importance and value of using new information technology in knowledge sharing and creation processes. In this paper we will try to unveil the patterns appearing in the use of new information environments and the users' understanding of the significance of using information technology in knowledge sharing and creation processes. The aim is to obtain a deeper comprehension of which factors determine whether the use of information technology becomes a success or a failure in relation to knowledge sharing and creation. The paper is based on three previous studies investigating the use of information technology…
I. M. Krepkov
The article is devoted to the unified information environment of the National Research University "Moscow Power Engineering Institute" (MPEI), and its most important component, the HR information system. The article describes the architecture of the unified information space of MPEI. The main objective of the development of the HR information system is to provide users, including other information systems, with access to up-to-date information about employees of MPEI. The HR information system is based on many years of operating experience with the previous system and on an assessment of the comparable personnel solutions available on the market today. The earlier HR information system was developed in 1995–1997 and used until mid-2015. Over its lifetime it accumulated a large number of "patches" and revision requests that could not be met because of limitations of the platform and solution architecture. A comparative analysis of 1C and SAP products showed that the cost of implementing, configuring and maintaining these products is higher than that of developing a new solution. The Microsoft technology stack was chosen as the platform. These technologies have proven themselves in the development of similar projects, and the vendor's solutions have long supported all key processes of information systems. Also important is that the selected Microsoft software holds certificates of FSTEC (Federal Service for Technical and Export Control), which permit the use of these products for storing and processing information in accordance with the laws of the Russian Federation. MPEI has already implemented a number of systems on the Microsoft platform: a postgraduate register, an Internet portal, etc. Using technologies from a single supplier facilitates the integration of processes and products into a unified information environment. The article details the technical and hardware specifications of the HR information system. The result of the work on
…and improve business processes. As a consequence, there is a growing need to address managerial aspects of the relationships between information technologies and business processes. The aim of this PhD study is to investigate how the practice of conjoint management of business processes and information technologies can be supported and improved. The study is organized into five research papers and this summary. Each paper addresses a different aspect of conjoint management of business processes and information technologies, i.e. problem development and managerial practices on software… and information technologies in a project environment. It states that both elements are intrinsically related and should be designed and considered together. The second case examines the relationships between information technology management and business process management. It discusses the multi-faceted role…
Coenen, A.M.L.; Drinkenburg, W.H.I.M.
Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced during sleep compared to wakefulness. The process controlling this reduction is called `sensory' gating and evidence exists that the underlying neurophysiological processes take
Shi, Jia-Dong; Wang, Dong; Ye, Liu
In this paper, the dynamics of entanglement is investigated in the presence of a noisy environment. We reveal its revival behavior and probe the mechanisms of this behavior via an information-theoretic approach. By analyzing the correlation distribution and the information flow within the composite system including the qubit subsystem and a noisy environment, it has been found that the subsystem-environment coupling can induce the quasi-periodic entanglement revival. Furthermore, the dynamical relationship among tripartite correlations, bipartite entanglement and local state information is explored, which provides a new insight into the non-Markovian mechanisms during the evolution.
Wampler, Jason A.; Hsieh, Chien; Toth, Andrew; Sheatsley, Ryan
The inherent nature of unattended sensors makes these devices most vulnerable to detection, exploitation, and denial in contested environments. Physical access is often cited as the easiest way to compromise any device or network. A new mechanism for mitigating these types of attacks, developed under the Assistant Secretary of Defense for Research and Engineering, ASD(R&E), project "Smoke Screen in Cyberspace", was demonstrated in a live, over-the-air experiment. Smoke Screen encrypts, slices up, and disperses redundant fragments of files throughout the network. Recovery is only possible after recovering all fragments, and attacking or denying one or more nodes does not limit the availability of other fragment copies in the network. This experiment proved the feasibility of redundant file fragmentation and is the foundation for developing sophisticated methods to blacklist compromised nodes, move data fragments away from risks of compromise, and forward stored data fragments closer to the anticipated retrieval point. This paper outlines initial results on scalability of node members, fragment size, file size, and performance in a heterogeneous network consisting of the Wireless Network after Next (WNaN) radio and Common Sensor Radio (CSR).
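The slice-and-replicate idea can be sketched as follows (a simplified model I constructed from the description above, not the actual Smoke Screen implementation; the real system also encrypts the data first, and the node names, fragment counts and round-robin placement here are illustrative assumptions). Each fragment is replicated on several nodes, so denying one node does not prevent reassembly, while an attacker must still recover at least one copy of every fragment.

```python
import itertools

def fragment(data: bytes, n_frags: int, copies: int, nodes: list):
    """Slice `data` into n_frags pieces and place `copies` replicas of each
    fragment on nodes, round-robin. Returns {node: {frag_index: bytes}}."""
    size = -(-len(data) // n_frags)          # ceiling division
    frags = [data[i * size:(i + 1) * size] for i in range(n_frags)]
    placement = {node: {} for node in nodes}
    rr = itertools.cycle(nodes)
    for idx, frag in enumerate(frags):
        for _ in range(copies):
            placement[next(rr)][idx] = frag  # place a replica on the next node
    return placement

def recover(placement, n_frags):
    """Reassembly succeeds iff at least one copy of every fragment survives."""
    found = {}
    for store in placement.values():
        found.update(store)
    if len(found) < n_frags:
        return None                          # some fragment was denied entirely
    return b"".join(found[i] for i in range(n_frags))

nodes = ["n1", "n2", "n3", "n4"]
placement = fragment(b"attack at dawn!!", n_frags=4, copies=2, nodes=nodes)
del placement["n1"]                          # adversary denies one node
print(recover(placement, 4))                 # b'attack at dawn!!'
```

With two copies of every fragment, any single node loss is survivable; losing both nodes that hold a given fragment makes recovery fail, which is exactly the availability/denial trade-off the experiment measured.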
Introduction. This paper examines business external environment scanning theory for information in the context of Greece. Method. A questionnaire was developed to explore the relationships between the general and task business environment, perceived uncertainty, scanning strategy, and sources of information with respect to type of environment, size and industry. The research was based on a sample of 144 private organizations operating in northern Greece. Analysis. Data collected were analysed using SPSS. The statistical procedures of the chi-squared homogeneity test, ANOVA, Duncan's test of homogeneity of means, and the related-samples t-test were followed for testing the hypotheses developed. Results. The results show that perceived uncertainty of the general and task business external environment factors depends on the type of environment, size of organization, and industry in which the organizations operate; organizations adapt their scanning strategy to the complexity of the environment; personal sources of information seem to be more important than impersonal sources; external sources of information are as important as internal sources; and higher levels of environmental uncertainty are associated with higher levels of scanning of the various sources. Conclusion. Business external environment scanning for information is influenced by the characteristics of the organizations themselves and by the characteristics of the external environment within which the organizations operate. The study contributes to environmental scanning theory and has important messages for practitioners.
This study examined the strategies commonly adopted by Osun state secondary school students in processing career information. It specifically examined the sources of career information available to the students, the uses to which the students put the information collected and how their career decision making skills can be ...
Aberer, Karl; Wombacher, Andreas
Automating information commerce requires languages to represent typical information commerce processes. Existing languages and standards either cover only very specific types of business models or are too general to capture concisely the specific properties of information commerce
Kim, Tae Whan; Choi, Kwang; Oh, Jeong Hoon; Jeong, Hyun Sook; Keum, Jong Yong
The goal of this project is to establish an integrated environment focused on enhanced information services to researchers through the provision of acquisition information, a key-phrase retrieval function, and journal content information linked with various subsystems already developed. The results of the project are as follows. 1. It is possible to serve information on unreceivable materials among required materials throughout the system. 2. Retrieval efficiency is increased by the addition of a key-phrase retrieval function. 3. Rapidity of information service is enhanced by the provision of the journal contents of each issue received, and the work performance of the contents service has improved. 4. It is possible to acquire, store and serve the technical information needed in R and D synthetically and systematically through the development of a total system linked with the various subsystems required for technical information management and service. 21 refs. (Author)
Benchmarking has traditionally been viewed as a way to compare data only; however, its utilisation as a more investigative, research-informed process to add rigor to decision-making processes at the institutional level is gaining momentum in the higher education sector. Indeed, with recent changes in the Australian quality environment from the…
Orozco Munoz, Jose Miguel
The paper deals with the prospects for peace and the environment in the negotiation agenda with the armed groups, and with their views on whether sustainable development is a common objective of the government and these groups.
TKACH L. M.
Formulation of the problem. If public relations is examined as a phenomenon of information management, we deal with questions of the content and nature of the relationship of PR with its environment; the ability to manage people's perception of and attitude towards events in the environment; and ensuring the priority of information over other resources. Goal. To investigate the concept of "public relations" as treated by foreign and domestic experts; consider the typology of publics and the "laws" of public opinion; define the basic principles according to which relations with the public should be built; and identify PR activity as a kind of social communication. Conclusions. Public relations, on the basis of advanced information and communication technologies, creates fundamentally new opportunities for information control and influence on public consciousness.
Cardoso de Moraes, J.L.; Lopes de Souza, Wanderley; Ferreira Pires, Luis; Francisco do Prado, Antonio; Hammoudi, S.; Cordeiro, J.; Maciaszek, L.A.; Filipe, J.
This paper presents an architecture for health information exchange in pervasive healthcare environments meant to be generally applicable to different applications in the healthcare domain. Our architecture has been designed for message exchange by integrating ubiquitous computing technologies,
Informal sector, business environment and economic growth: A comparative analysis of West and Central Africa ... taxes, which undermines fair competition and puts formal enterprises at a disadvantage. ... Start Date. December 1, 2012 ...
Gallucci, Raymond H.V.
This paper examines a specific nuclear power plant modification performed in a risk-informed regulatory environment. It quantifies both the permanent and temporary effects of the modification, and performs a cost-benefit evaluation. (authors)
Related Information to Protect the Homeland (GAO-15-290) (Washington, DC: U.S. Government Accountability Office, 2015), http://www.gao.gov/assets… Government Accountability Office [GAO], Information Sharing Environment: Better Road Map Needed to Guide Implementation and Investments (GAO-11-455)… and its ISE PM would have clearer accountability for information-sharing lapses and a faster ability to reform or develop domestic information-sharing
Leemans, S.J.J.; Fahland, D.; Van Der Aalst, W.M.P.; Reichert, M.; Reijers, H.A.
Understanding the performance of business processes is an important part of any business process intelligence project. From historical information recorded in event logs, performance can be measured and visualized on a discovered process model. Thereby the accuracy of the measured performance, e.g.,
Antonoaie Victor; Irimeş Adrian; Chicoş Lucia-Antoneta
Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP proj...
Fredslund, Hanne; Strandgaard Pedersen, Jesper
In recent years, intervention studies have become increasingly popular within occupational health psychology. The vast majority of such studies have focused on interventions themselves and their effects on the working environment and employee health and well-being. Few studies have focused on how the context and processes surrounding the intervention may have influenced the outcomes (Hurrell and Murphy, 1996), such as '...management perceptions and actions in implementing any intervention and their influence on the overall result of the intervention' (Nytrø, Saksvik, Mikkelsen, Bohle, and Quinlan, 2000). Thus, there is still relatively little published research that provides us with information on how to evaluate such strategies and processes (Saksvik, Nytrø, Dahl-Jørgensen, and Mikkelsen, 2002...). Process evaluation can be used to a) provide feedback for improving interventions, b) interpret the outcomes of effect...
Guo Huifang; Wang Jingjing
Information resources construction is the primary task of, and a critical measure for, libraries. In the 21st century, the era of the knowledge economy, with the continuous development of computer network technology, information resources have become an important part of libraries and a significant indicator of their capacity building. The development of socialized, digitalized and internationalized information has put forward new requirements for library information resources construction. This paper describes the impact of the network environment on the construction of library information resources and proposes measures for building them. (authors)
Acklin, M W
The psychoanalytic theory of religion has been seriously limited in its development, largely owing to Freud's emphasis on religion's neurotic elements and an overemphasis on the infantile origins of religious development. This paper offers a conceptual framework and advances the thesis, based on contemporary psychoanalytic developmental theory, that 1) Erikson's concept of epigenesis has applicability across the life span; 2) beyond-the-self identity is constituent to human maturation and self-completion; 3) successful adult maturation requires a mirroring-facilitating environment; and 4) religious values, meanings, images, and communities play an essential role as elements of the facilitating environment of later life.
In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. In this article we summarize that work and describe the software development process
Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and meaningful impact was obtained.
Rieffel, Eleanor G.
This survey, aimed at information processing researchers, highlights intriguing but lesser known results, corrects misconceptions, and suggests research areas. Themes include: certainty in quantum algorithms; the "fewer worlds" theory of quantum mechanics; quantum learning; probability theory versus quantum mechanics.
qubits, the 2^n energy levels of the spin system can be treated as an n-qubit system. ... Quantum information processing; qubit; nuclear magnetic resonance quantum computing. ... The equilibrium spectrum has theoretical intensities in the ra...
Crowe, Sarah; Tully, Mary P; Cantrill, Judith A
The need for effective communication and handling of secondary care information in general practices is paramount. To explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of practice staff both clinical and administrative. Qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using N-Vivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.
Reichenbach, Alexandra; Diedrichsen, Jörn
A recent study suggests that reafferent hand-related visual information utilizes a privileged, attention-independent processing channel for motor control. This process was termed visuomotor binding to reflect its proposed function: linking visual reafferences to the corresponding motor control centers. Here, we ask whether the advantage of processing reafferent over exafferent visual information is a specific feature of the motor processing stream or whether the improved processing also benefits the perceptual processing stream. Human participants performed a bimanual reaching task in a cluttered visual display, and one of the visual hand cursors could be displaced laterally during the movement. We measured the rapid feedback responses of the motor system as well as matched perceptual judgments of which cursor was displaced. Perceptual judgments were either made by watching the visual scene without moving or made simultaneously to the reaching tasks, such that the perceptual processing stream could also profit from the specialized processing of reafferent information in the latter case. Our results demonstrate that perceptual judgments in the heavily cluttered visual environment were improved when performed based on reafferent information. Even in this case, however, the filtering capability of the perceptual processing stream suffered more from the increasing complexity of the visual scene than the motor processing stream. These findings suggest partly shared and partly segregated processing of reafferent information for vision for motor control versus vision for perception.
The coastal zone is a dynamic environment, so that in developing practical and effective oil spill response strategies it is necessary to understand the forces that contribute to shore-zone processes. The coasts of Canada encompass a wide range of environments and are characterized by a variety of shoreline types that include the exposed, resistant cliffs of eastern Newfoundland and the sheltered marshes of the Beaufort Sea. A report is presented to provide an understanding of the dynamics and physical processes as they vary on the different coasts of Canada, including the Great Lakes. An outline of the general character and processes on a regional basis describes the coastal environments and introduces the literature that can be consulted for more specific information. The likely fate and persistence of oil that reaches the shoreline is discussed to provide the framework for development of spill response strategies and for the selection of appropriate shoreline cleanup or treatment countermeasures. Lessons learned from recent experience with major oil spills and field experiments are integrated into the discussion. Separate abstracts have been prepared for each of the four sections of this report. 502 refs., 5 figs
The chapter discusses the potential of personal learning environments (PLE) based on Web 2.0 applications for language courses in higher education (HE). This novel approach to the use of information and communication technologies (ICT) in education involves learners in the design of learning environments, tools and processes. The chapter begins…
Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach to establishing, implementing, operating, monitoring, reviewing, maintaining and improving an organization's information security. Key elements of the operation of an ISMS are ISMS processes. However, in spite of its importance, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analysis of information security investments, regarding both single measures protecting information and ISMS processes, is not the focus of current research, which mostly concentrates on economics. This article aims to fill this research gap by proposing such an ISMS process framework as its main contribution, based on a set of agreed-upon ISMS processes in existing standards like the ISO 27000 series, COBIT and ITIL. Within the framework, the identified processes are described and their interactions and interfaces are specified. This framework helps to focus on the operation of the ISMS instead of on measures and controls. By this, as a main finding, the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS are strengthened.
The purpose of this study was to evaluate the information processing of 43 business managers with superior professional performance. The theoretical framework considers three models: Henry Mintzberg's theory of managerial roles, the theory of information processing, and John Exner's Rorschach response process model. The participants were evaluated with the Rorschach method. The results show that these managers are able to collect data, evaluate them and establish rankings properly. At the same time, they are capable of being objective and accurate in assessing problems. This information processing style permits an interpretation of the surrounding world on the basis of a very personal and characteristic processing manner, or cognitive style.
V. A. Matyushenko
Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction in the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of activities of an institution. The purpose of this paper is the development of a system that automates the preparation of accounting documents. The article describes the problem of preparing educational process documents. We decided to design and create the information system in the Microsoft Access environment. The result is four types of reports obtained by using the developed system. The use of this system now allows automating the process and reducing the effort required to prepare accounting documents. All reports were implemented in the Microsoft Excel software product and can be used for further analysis and processing.
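As an illustrative analogue only: the system described above is built in Microsoft Access with Excel output, but the report-automation idea it implements reduces to aggregating records and emitting a summary. The field names and data below are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch of an automated accounting report: sum teaching
# hours per lecturer from raw records (names and layout are invented).
from collections import defaultdict

def build_report(rows):
    # Aggregate hours by lecturer, the core step behind a summary report
    totals = defaultdict(int)
    for row in rows:
        totals[row["lecturer"]] += int(row["hours"])
    return dict(totals)

data = [{"lecturer": "Ivanov", "hours": "4"},
        {"lecturer": "Petrov", "hours": "2"},
        {"lecturer": "Ivanov", "hours": "3"}]
print(build_report(data))   # → {'Ivanov': 7, 'Petrov': 2}
```

In the described system this aggregation is done by Access queries and the result is written to an Excel workbook; the sketch only shows the shape of the computation.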
enable ISE participants to share information and data (see ISE Implementation Plan, p. 51, ISE Enterprise Architecture Framework, pp. 67, 73–74 and...of frontiers. This article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises. 2. The exercise...5 U.S.C. § 552a, as amended. Program Manager-Information Sharing Environment. (2008). Information Sharing Enterprise Architecture Framework
Horton, Joanne; Serafeim, Georgios; Serafeim, Ioanna
More than 120 countries require or permit the use of International Financial Reporting Standards (‘IFRS’) by publicly listed companies, on the basis of higher information quality and accounting comparability resulting from IFRS application. However, the empirical evidence on these presumed benefits is often conflicting and fails to distinguish between information quality and comparability. In this paper we examine the effect of mandatory IFRS adoption on firms’ information environment. We find that afte...
Reading, H. G; Reading, Harold G
... and chemical systems, 6 2.1.2 Climate, 7 2.1.3 Tectonic movements and subsidence, 11 2.1.4 Sea-level changes, 11 2.1.5 Milankovitch processes and orbital forcing, 14 2.1.6 Intrinsic sedimentary processes,...
Tan, Wee-Kek; Tan, Chuan-Hoo
Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…
Song, X.M.; Zang, F.; Bij, van der J.D.; Weggeman, M.C.D.P.
Despite the obvious linkage between information technologies (IT) and knowledge processes, and the apparent strategic importance of both, little research has been done to explicitly examine how, if at all, IT and knowledge processes affect firm outcomes. The purpose of this study is to bridge this
The main laws of engineering thermodynamics are universal and can be applied to processes other than thermodynamic ones. The article presents the results of a comparison of the peculiarities of irreversible informational and thermodynamic processes, and a new term, "infopy", is introduced. A more precise definition of "infopy" as an energetic charge is given in the article.
The concept of organisational knowledge as a valuable strategic asset has become quite popular recently. Increased competition, globalisation and the emergence of new organisational models built on process-based organisational structures require organisations to create, capture, share and apply
Quantum information processors exploit the quantum features of superposition and entanglement for applications not possible in classical devices, offering the potential for significant improvements in the communication and processing of information. Experimental realization of large-scale quantum information processors remains a long term vision, as the required nearly pure quantum behaviour is observed only in exotic hardware such as individual laser-cooled atoms and isolated photons. But recent theoretical and experimental advances suggest that cold atoms and individual photons may lead the way towards bigger and better quantum information processors, effectively building mesoscopic versions of "Schroedinger's cat" from the bottom up. (author)
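As a purely illustrative aside (not from the abstract itself), the n-qubit "cat" state alluded to above, (|00...0> + |11...1>)/sqrt(2), is easy to write down numerically as a state vector with 2^n amplitudes:

```python
# Minimal numerical sketch of an n-qubit GHZ / "Schroedinger cat" state.
import numpy as np

def cat_state(n):
    # Equal superposition of the all-zeros and all-ones basis states
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = state[-1] = 1 / np.sqrt(2)   # |00...0> and |11...1>
    return state

# Measurement probabilities: only the two "macroscopic" outcomes occur,
# each with probability 1/2 (up to floating-point rounding).
probs = np.abs(cat_state(3)) ** 2
print(probs[0], probs[-1])
```

The point of the sketch is only the bookkeeping: a superposition over n qubits lives in a 2^n-dimensional space, which is why classical simulation becomes infeasible as n grows.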
Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.
Mikhail O. Kolbanev
Full Text Available Subject of study. The paper describes basic information technologies for automating of information processes of data storage, distribution and processing in terms of required physical resources. It is shown that the study of these processes with such traditional objectives of modern computer science, as the ability to transfer knowledge, degree of automation, information security, coding, reliability, and others, is not enough. The reasons are: on the one hand, the increase in the volume and intensity of information exchange in the subject of human activity and, on the other hand, drawing near to the limit of information systems efficiency based on semiconductor technologies. Creation of such technologies, which not only provide support for information interaction, but also consume a rational amount of physical resources, has become an actual problem of modern engineering development. Thus, basic information technologies for storage, distribution and processing of information to support the interaction between people are the object of study, and physical temporal, spatial and energy resources required for implementation of these technologies are the subject of study. Approaches. An attempt is made to enlarge the possibilities of traditional cybernetics methodology, which replaces the consideration of material information component by states search for information objects. It is done by taking explicitly into account the amount of physical resources required for changes in the states of information media. Purpose of study. The paper deals with working out of a common approach to the comparison and subsequent selection of basic information technologies for storage, distribution and processing of data, taking into account not only the requirements for the quality of information exchange in particular subject area and the degree of technology application, but also the amounts of consumed physical resources. Main findings. Classification of resources
Spee, J.B.R.M.; Bijwaard, D.; Laan, D.J.
Companies are recognising that innovative processes are determining factors in competitiveness. Two examples from projects in aircraft development describe the introduction of collaborative engineering environments as a way to improve engineering processes. A multi-disciplinary simulation
In the past, most of the regulations regarding egg processing were concerned with quality rather than safety. Hazard Analysis and Critical Control Point (HACCP) will be required by retailers or by the federal government. GMPs (Good Manufacturing Practices) and SSOPs (Sanitation Standard Operating P...
Frey, Seth; Albino, Dominic K; Williams, Paul L
There is a tendency in decision-making research to treat uncertainty only as a problem to be overcome. But it is also a feature that can be leveraged, particularly in social interaction. Comparing the behavior of profitable and unprofitable poker players, we reveal a strategic use of information processing that keeps decision makers unpredictable. To win at poker, a player must exploit public signals from others. But using public inputs makes it easier for an observer to reconstruct that player's strategy and predict his or her behavior. How should players trade off between exploiting profitable opportunities and remaining unexploitable themselves? Using a recent multivariate approach to information theoretic data analysis and 1.75 million hands of online two-player No-Limit Texas Hold'em, we find that the important difference between winning and losing players is not in the amount of information they process, but how they process it. In particular, winning players are better at integrative information processing: creating new information from the interaction between their cards and their opponents' signals. We argue that integrative information processing does not just produce better decisions, it makes decision-making harder for others to reverse engineer, as an expert poker player's cards act like the private key in public-key cryptography. Poker players encrypt their reasoning with the way they process information. The encryption function of integrative information processing makes it possible for players to exploit others while remaining unexploitable. By recognizing the act of information processing as a strategic behavior in its own right, we offer a detailed account of how experts use endemic uncertainty to conceal their intentions in high-stakes competitive environments, and we highlight new opportunities between cognitive science, information theory, and game theory. Copyright © 2018 Cognitive Science Society, Inc.
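The "integrative" processing described above corresponds to what information theory calls synergy: information available only from the interaction of variables, measurable as interaction information I(C;A|S) − I(C;A). The following is an illustrative sketch with invented toy data (the variable names cards C, action A, signal S are this sketch's assumptions, not the study's actual encoding):

```python
# Estimate synergy between cards (C) and a public signal (S) in
# predicting an action (A) from observed (C, A, S) triples.
from collections import Counter
from math import log2

def entropy(counts):
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c)

def mutual_info(pairs):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from observed pairs
    x = Counter(p[0] for p in pairs)
    y = Counter(p[1] for p in pairs)
    xy = Counter(pairs)
    return entropy(x.values()) + entropy(y.values()) - entropy(xy.values())

def conditional_mi(triples):
    # I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
    xz = Counter((x, z) for x, y, z in triples)
    yz = Counter((y, z) for x, y, z in triples)
    xyz = Counter(triples)
    z = Counter(z for x, y, z in triples)
    return (entropy(xz.values()) + entropy(yz.values())
            - entropy(xyz.values()) - entropy(z.values()))

# Toy data: action = cards XOR signal, so neither input alone predicts
# the action, but together they determine it completely (pure synergy).
triples = [(c, (c + s) % 2, s) for c in (0, 1) for s in (0, 1)] * 25
print(mutual_info([(c, a) for c, a, s in triples]))   # → 0.0
print(conditional_mi(triples))                        # → 1.0
```

In the toy data the action carries zero information about the cards on its own, yet one full bit once the signal is known: an observer cannot "decrypt" the cards from the action without the same inputs, which is the abstract's public-key analogy in miniature.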
Personal learning environments (PLEs) and critical information literacies (CILs) are two concepts that have been presented as responses to the challenges of the rich and complex information landscape. While both approaches support learners' critical engagement with new information environments, each was developed within a different field. This paper connects and contrasts PLEs and CILs in order to explore the design of pedagogical responses to the information environment. Through a careful examination of the PLE and CIL literature, the paper demonstrates that information literacy education intersects with the concepts and goals of PLEs. As such, the authors suggest that PLE scholarship informed by CIL scholarship, and vice versa, will yield a deeper understanding of modern learning contexts as well as a more holistic and responsive learner framework. The example of the research assignment is used to demonstrate the viability of this approach. With these propositions, the authors invite educators, librarians and information technologists to engage in a dialogue about these concepts and the potential for pedagogical change.
Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato
Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
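Schematically, and only in the many-copy (i.i.d.) limit where the paper's single-shot, smoothed-entropy statement reduces to a Shannon/von Neumann conditional entropy, the result reads:

```latex
W_{\min} \;=\; k_B T \ln 2 \;\cdot\; H\!\left(X_{\text{discarded}} \mid Y_{\text{output}}\right)
```

Here $k_B$ is Boltzmann's constant, $T$ the temperature of the heat bath, and $H(\cdot \mid \cdot)$ the conditional entropy (in bits) of the discarded information given the computation's output. This is a schematic rendering for orientation; the paper's precise statement uses smoothed single-shot entropies rather than the asymptotic quantity shown here.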
Rock fracture is traditionally viewed as a brittle process involving damage nucleation and growth in a zone ahead of a larger fracture, resulting in fracture propagation once a threshold loading stress is exceeded. It is now increasingly recognized that coupled chemical-mechanical processes influence fracture growth in a wide range of subsurface conditions that include igneous, metamorphic, and geothermal systems, and diagenetically reactive sedimentary systems with possible applications to hydrocarbon extraction and CO2 sequestration. Fracture processes aided or driven by chemical change can affect the onset of fracture, fracture shape and branching characteristics, and fracture network geometry, thus influencing mechanical strength and flow properties of rock systems. We are investigating two fundamental modes of chemical-mechanical interactions associated with fracture growth: 1. Fracture propagation may be aided by chemical dissolution or hydration reactions at the fracture tip allowing fracture propagation under subcritical stress loading conditions. We are evaluating effects of environmental conditions on critical (fracture toughness KIc) and subcritical (subcritical index) fracture properties using double torsion fracture mechanics tests on shale and sandstone. Depending on rock composition, the presence of reactive aqueous fluids can increase or decrease KIc and/or subcritical index. 2. Fracture may be concurrent with distributed dissolution-precipitation reactions in the host rock beyond the immediate vicinity of the fracture tip. Reconstructing the fracture opening history recorded in crack-seal fracture cement of deeply buried sandstone we find that fracture length growth and fracture opening can be decoupled, with a phase of initial length growth followed by a phase of dominant fracture opening. This suggests that mechanical crack-tip failure processes, possibly aided by chemical crack-tip weakening, and distributed solution-precipitation creep in the
Xiaohua Wei; Ge Sun; James Vose; Kyoichi Otsuki; Zhiqiang Zhang; Keith Smetterm
The papers in this issue are a selection of the presentations made at the second International Conference on Forests and Water in a Changing Environment. This special issue "Forest Ecohydrological Processes in a Changing Environment" covers the topics regarding the effects of forest, land use and climate changes on ecohydrological processes across forest stand,...
Since the terms Data Warehouse and On-Line Analytical Processing were proposed by Inmon and by Codd, Codd and Salley, respectively, the traditional ideas of creating information systems in support of management's decisions have become interesting again in theory and practice. Today information warehousing is a strategic market for any database systems vendor. Nevertheless, the theoretical discussions of this topic go back to the early years of the 20th century as far as management science and accounting the...
Jillian R. Griffiths
The Joint Information Systems Committee (JISC) Information Environment (IE), a development from the DNER - Distributed National Electronic Resource - is intended to help users in the UK academic sector maximise the value of published information resources by developing a coherent environment out of the confusing array of systems and services currently available. The EDNER Project (Formative Evaluation of the DNER) is funded to undertake ongoing evaluation of the developing IE over the full three years of the JISC 5/99 Learning & Teaching and Infrastructure Programme, i.e. from 2000 to 2003. The EDNER Project is led by the Centre for Research in Library & Information Management (CERLIM) at the Manchester Metropolitan University; the Centre for Studies in Advanced Learning Technologies (CSALT) at Lancaster University is a partner. This paper reports on work in progress and some of the initial findings of the evaluation team.
Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.
Structural health monitoring (SHM) technology provides a means to significantly reduce life cycle costs of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information which indicates both the diagnostics of the current structural integrity as well as the prognostics necessary for planning and managing the future health of the structure in a cost effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
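The "raw sensor data to meaningful information" step mentioned above can be illustrated with a generic sketch (not the Boeing implementation): Fourier-based feature extraction pulls a dominant vibration frequency and spectral energy out of raw accelerometer samples, the kind of features a diagnostic classifier would consume.

```python
# Generic spectral feature extraction from a raw vibration signal.
import numpy as np

def spectral_features(signal, sample_rate):
    """Return (dominant frequency in Hz, total spectral energy)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    energy = float(np.sum(spectrum ** 2))
    return dominant, energy

# Synthetic test: a 50 Hz vibration sampled at 1 kHz for one second
t = np.arange(0, 1.0, 1.0 / 1000)
sig = np.sin(2 * np.pi * 50 * t)
freq, energy = spectral_features(sig, 1000)
print(freq)   # → 50.0
```

A shift in the dominant frequency or a redistribution of spectral energy between inspections is the sort of signature the paper's pattern-recognition and fusion stages would then interpret.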
Quantification of information leakage is a successful approach for evaluating the security of a system. It models the system to be analyzed as a channel, with the secret as the input and an output observable by the attacker as the output, and applies information theory to quantify the amount of information transmitted through such a channel, thus effectively quantifying how many bits of the secret can be inferred by the attacker by analyzing the system's output. Channels are usually encoded as matrices of conditional probabilities, known as channel matrices. Such matrices grow exponentially... ...and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some...
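The channel-matrix view of leakage described above can be sketched numerically. This is an illustrative stand-in, not the QUAIL tool itself: it computes Shannon leakage I(S;O) = H(O) − H(O|S) from a channel matrix under a uniform prior on the secret.

```python
# Shannon leakage of a channel matrix C[s][o] = P(output o | secret s),
# assuming a uniform prior over the secrets.
from math import log2

def entropy(dist):
    return -sum(p * log2(p) for p in dist if p > 0)

def leakage(channel):
    n = len(channel)                       # number of secrets
    # Marginal output distribution under the uniform prior 1/n
    output = [sum(row[o] for row in channel) / n
              for o in range(len(channel[0]))]
    h_o_given_s = sum(entropy(row) for row in channel) / n
    return entropy(output) - h_o_given_s

# A channel that reveals the secret completely leaks log2(n) bits...
print(leakage([[1, 0], [0, 1]]))   # → 1.0
# ...while a constant channel leaks nothing.
print(leakage([[1, 0], [1, 0]]))   # → 0.0
```

The exponential growth the abstract mentions is visible here: with a k-bit secret the matrix has 2^k rows, which is exactly what motivates the Markovian-model approach.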
A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrating cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
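A toy version of the calculation described above, with all numbers hypothetical: a two-state regulator drives a product whose copy number is Poisson distributed (a simple stand-in for intrinsic noise), and the information transmitted through the readout follows directly from the joint distribution.

```python
# Information a Poisson-noisy copy-number readout carries about a
# two-state regulator (toy model; means and prior are invented).
from math import exp, log2, factorial

def poisson(mean, n):
    # P(N = n) for a Poisson-distributed copy number with given mean
    return exp(-mean) * mean ** n / factorial(n)

def transmitted_info(means, prior, n_max=100):
    # I(state; copy number) = H(copy number) - H(copy number | state)
    marginal = [sum(p * poisson(m, n) for p, m in zip(prior, means))
                for n in range(n_max)]
    h_marginal = -sum(q * log2(q) for q in marginal if q > 0)
    h_conditional = -sum(p * poisson(m, n) * log2(poisson(m, n))
                         for p, m in zip(prior, means)
                         for n in range(n_max) if poisson(m, n) > 0)
    return h_marginal - h_conditional

# Two well-separated expression levels are almost perfectly
# distinguishable from a single copy-number readout: close to 1 bit.
print(transmitted_info(means=(1.0, 20.0), prior=(0.5, 0.5)))
```

Optimizing information flow, as in the abstract, then amounts to tuning parameters such as these means (subject to resource constraints) to maximize the transmitted information.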
Miculescu Marius Nicolae
This article addresses businesses' need, and therefore managers' need, to obtain accounting information that is relevant, reliable, clear and accurate at the lowest cost, in order to optimize decision making. This need derives from the current economic environment. The survival of organizations in a competitive environment, to which they must adapt, is conditioned by obtaining accounting information that is qualitative, timely and vital, in a short time. This information relates to the patrimony, analytical results, the market (dynamics, dimensions and structure) and relationships with business partners, competitors and suppliers. We therefore focus more intensely on the quality of accounting information. Defining the quality of accounting information, however, starts from the boundaries and features of the accounting communication process and aims to determine "quality criteria" or "qualitative characteristics" in order to develop a measurement tool. Note that in reviewing the literature it was found that, across accounting normalization and doctrine, the criteria for defining the quality of accounting information are not identical; their selection and ranking differ. Theory and practice also show that information in itself is worthless; instead, it becomes valuable once it is used in a decision-making process. Thus, the economic value of accounting information depends on the earnings obtained after making a decision, diminished by the cost of the information. To be more specific, it depends on the decision table or the implemented decision tree, on the informational cost and on the optimal condition established by the decision maker (due to the fact that producing accounting information implies costs which are often considerable, while profits arise only from actions). The problem of convergence between the content and the interpretation of information sent to users also arises, as does the requirement that information be intelligible. In this case, those who use it, say users, should have sufficient
Chen Yingxi; Huang Daifu; Yang Lifeng
With the development of the internet and information technology, scientific and technological information work faces great challenges. This article describes the changes in enterprise scientific and technological information work under the network environment, detailing the situation the work faces and its characteristics. It not only discusses the problems present in the company's scientific and technological information work but also puts forward proposals and specific measures. A service approach is offered as well, adjusting and reforming resource construction, service methods, and the content provided. Vigorous action should be taken in researching scientific and technological information work, changing information-directed service into knowledge-providing service. (authors)
Introduction. When children are adjudicated by a court of law as being maltreated, they are summarily removed from their homes, resulting in a disruption of their daily lives. This pilot study examines the context in which maltreated children seek and use information to cope with this stressful period of their lives. Method. This study applies Taylor's four components of information use environments to look at the user and the uses of information and the contexts within which those users make choices about what information is useful to them at particular times. Analysis. The characteristics of foster children as a population are examined and the settings in which such children seek information are described. The problems experienced by children, which are linked to information seeking, are articulated, as are problem resolutions. Results. The most important finding of this study is the determination that there are three clearly differentiated phases of information needs and seeking corresponding to the three phases of adjustment the children experience. Conclusion. Understanding the problem phases underpinning everyday life contexts in foster care environments affords support personnel who provide information to these children better insights into what helps and what results in increasing anxiety or causes more trauma.
Casado-Lumbreras, Cristina; Soto-Acosta, Pedro; Colomo-Palacios, Ricardo; de Pablos, Patricia Ordonez
Purpose: The aim of this paper is to present a tool which uses semantic technologies for personnel performance and workplace learning assessment in outsourced information technology environments. Design/methodology/approach: The paper presents the tool from a technical perspective and introduces a use case that depicts the main features related to…
Navratil, R.; Kmet, M.
In this presentation the author deals with the history, conception, and exploitation of an information portal about the environment in the Slovak Republic. This portal, Enviroportal.sk, was put into test operation in April 2005. Perspectives of Enviroportal.sk are discussed.
Oei, J.L.H.; Proper, H.A.; Falkenberg, E.D.
To meet the demands of organizations and their ever-changing environment, information systems are required which are able to evolve to the same extent as organizations do. Such a system has to support changes in all time-and application-dependent aspects. In this paper, requirements and a conceptual
The Wigner-Ville distribution offers a visual display of quantitative information about the way a signal's energy is distributed in both time and frequency. Through that, this distribution embodies the fundamental concepts of Fourier and time-domain analysis. The energy of the signal is distributed so that specific frequencies are localized in time by the group delay time, and at specific instants in time the frequency is given by the instantaneous frequency. The net positive volume of the Wigner distribution is numerically equal to the signal's total energy. The paper shows the application of the Wigner-Ville distribution in the field of signal processing, using the Scilab environment.
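The two properties named in the abstract above, localization of a tone's frequency and the volume-equals-energy identity, can be checked numerically. The paper works in Scilab; the sketch below is an illustrative Python/NumPy reimplementation of the discrete pseudo Wigner-Ville distribution, not the paper's code.

```python
import numpy as np

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution of a complex (analytic) signal x.

    Returns an N x N real array W[f, n]: frequency bin f, time index n.
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    K = np.zeros((N, N), dtype=complex)      # instantaneous autocorrelation
    for n in range(N):
        tmax = min(n, N - 1 - n)             # largest lag that fits in the record
        tau = np.arange(-tmax, tmax + 1)
        K[tau % N, n] = x[n + tau] * np.conj(x[n - tau])
    return np.fft.fft(K, axis=0).real        # FFT over the lag variable

# "Net volume equals total energy": summing W over both axes gives N times
# the signal energy, since each column's FFT bins sum to N * |x[n]|^2.
x = np.exp(2j * np.pi * 40 * np.arange(256) / 256)   # pure tone at bin 40
W = wigner_ville(x)
energy = np.sum(np.abs(x) ** 2)
assert np.isclose(W.sum() / len(x), energy)
```

Note the well-known frequency doubling of the lag variable: the tone at bin 40 concentrates its energy at WVD row 80, which is one reason the analytic signal is used to avoid aliasing.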
We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection and once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out that which is the case from among alternatives which could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. This new approach allows a mathematical treatment of some poignant biomathematical problems. The concepts presented in this treatise may also have relevance and applications within the fields of information processing and the theory of language.
Leach, Janice; Torres, Teresa M.
The Springfield Processing Plant is a hypothetical facility. It has been constructed for use in training workshops. Information is provided about the facility and its surroundings, particularly security-related aspects such as target identification, threat data, entry control, and response force data.
Nijstad, Bernard A.; Oltmanns, Jan
Group decision making has attracted much scientific interest, but few studies have investigated group decisions that do not get made. Based on the Motivated Information Processing in Groups model, this study analysed the effect of epistemic motivation (low vs. high) and social motivation (proself
Cells receive signaling molecules by receptors and relay information via sensory networks so that they can respond properly depending on the type of signal. Recent studies have shown that cells can extract multidimensional information from dynamical concentration patterns of signaling molecules. We herein study how biochemical systems can process multidimensional information embedded in dynamical patterns. We model the decoding networks by linear response functions, and optimize the functions with the calculus of variations to maximize the mutual information between patterns and output. We find that, when the noise intensity is lower, decoders with different linear response functions, i.e., distinct decoders, can extract much information. However, when the noise intensity is higher, distinct decoders do not provide the maximum amount of information. This indicates that, when transmitting information by dynamical patterns, embedding information in multiple patterns is not optimal when the noise intensity is very large. Furthermore, we explore the biochemical implementations of these decoders using control theory and demonstrate that these decoders can be implemented biochemically through the modification of cascade-type networks, which are prevalent in actual signaling pathways.
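The abstract's central claim, that less information about the input pattern survives as noise grows, can be illustrated with a toy channel. The sketch below is not the paper's variational decoder model; it is a minimal stand-in in which a binary "pattern identity" is read out through a scalar Gaussian channel, and the mutual information is computed by numerical integration (the function name and parameters are our own).

```python
import numpy as np

def mi_binary_gaussian(mu0, mu1, sigma):
    """I(S;Y) in bits for equiprobable S in {0,1} and Y | S=s ~ N(mu_s, sigma^2)."""
    y = np.linspace(min(mu0, mu1) - 8 * sigma, max(mu0, mu1) + 8 * sigma, 4001)
    dy = y[1] - y[0]
    norm = 1.0 / (sigma * np.sqrt(2 * np.pi))
    p0 = norm * np.exp(-(y - mu0) ** 2 / (2 * sigma ** 2))
    p1 = norm * np.exp(-(y - mu1) ** 2 / (2 * sigma ** 2))
    py = 0.5 * (p0 + p1)

    def h(p):  # differential entropy in bits, with 0 log 0 := 0
        q = np.where(p > 1e-300, p, 1.0)
        return -np.sum(p * np.log2(q)) * dy

    return h(py) - 0.5 * (h(p0) + h(p1))

# Low readout noise: nearly the full 1 bit of pattern identity survives.
# High readout noise: most of the bit is lost.
low = mi_binary_gaussian(0.0, 1.0, 0.05)
high = mi_binary_gaussian(0.0, 1.0, 2.0)
assert low > 0.99 and high < 0.2
```

The same qualitative trade-off, high-noise channels flattening the advantage of distinct decoders, is what the paper establishes for dynamical concentration patterns.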
Fore, Stephanie; Palumbo, Fabrizio; Pelgrims, Robbrecht; Yaksi, Emre
The habenula is a brain region that has gained increasing popularity over recent years due to its role in processing value-related and experience-dependent information, with strong links to depression, addiction, sleep and social interactions. This small diencephalic nucleus is proposed to act as a multimodal hub or switchboard, where inputs from different brain regions converge. These diverse inputs to the habenula carry information about the sensory world and the animal's internal state, such as reward expectation or mood. However, it is not clear how these diverse habenular inputs interact with each other and how such interactions contribute to the function of habenular circuits in regulating behavioral responses in various tasks and contexts. In this review, we aim to discuss how information processing in habenular circuits can contribute to specific behavioral programs that are attributed to the habenula. Copyright © 2017 Elsevier Ltd. All rights reserved.
Zwolak, Michael; Quan, H. T.; Zurek, Wojciech H.
Quantum Darwinism provides an information-theoretic framework for the emergence of the objective, classical world from the quantum substrate. The key to this emergence is the proliferation of redundant information throughout the environment where observers can then intercept it. We study this process for a purely decohering interaction when the environment, E, is in a nonideal (e.g., mixed) initial state. In the case of good decoherence, that is, after the pointer states have been unambiguously selected, the mutual information between the system, S, and an environment fragment, F, is given solely by F's entropy increase. This demonstrates that the environment's capacity for recording the state of S is directly related to its ability to increase its entropy. Environments that remain nearly invariant under the interaction with S, either because they have a large initial entropy or a misaligned initial state, therefore have a diminished ability to acquire information. To elucidate the concept of good decoherence, we show that, when decoherence is not complete, the deviation of the mutual information from F's entropy change is quantified by the quantum discord, i.e., the excess mutual information between S and F is information regarding the initial coherence between pointer states of S. In addition to illustrating these results with a single-qubit system interacting with a multiqubit environment, we find scaling relations for the redundancy of information acquired by the environment that display a universal behavior independent of the initial state of S. Our results demonstrate that Quantum Darwinism is robust with respect to nonideal initial states of the environment: the environment almost always acquires redundant information about the system but its rate of acquisition can be reduced.
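The simplest instance of the setting described above, a single system qubit purely decohered by a multiqubit environment in an ideal (pure) initial state, can be simulated directly, exhibiting the classical plateau where every environment fragment holds the same information about the system. This is an illustrative sketch of the standard textbook case, not the paper's nonideal-state calculation; all function and variable names are our own.

```python
import numpy as np

def reduced_density(psi, n_qubits, keep):
    """Partial trace of the pure state psi (length 2**n_qubits),
    keeping the qubits listed in `keep`."""
    t = psi.reshape([2] * n_qubits)
    rest = [q for q in range(n_qubits) if q not in keep]
    M = np.transpose(t, list(keep) + rest).reshape(2 ** len(keep), -1)
    return M @ M.conj().T

def entropy(rho):
    """von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

# System qubit a|0> + b|1>, environment of n pure qubits. A CNOT from the
# system onto every environment qubit (perfect decoherence) leaves the
# branch state a|0>|00..0> + b|1>|11..1>.
n = 6
a, b = np.cos(0.7), np.sin(0.7)
psi = np.zeros(2 ** (n + 1), dtype=complex)
psi[0], psi[-1] = a, b

H_S = entropy(reduced_density(psi, n + 1, [0]))

def mutual_info(k):  # I(S:F) for a fragment F of the first k environment qubits
    H_F = entropy(reduced_density(psi, n + 1, list(range(1, k + 1))))
    H_SF = entropy(reduced_density(psi, n + 1, list(range(0, k + 1))))
    return H_S + H_F - H_SF

# Classical plateau: every proper fragment already supplies H_S about the
# system; only the full environment adds the quantum part, up to 2 * H_S.
assert all(np.isclose(mutual_info(k), H_S) for k in range(1, n))
assert np.isclose(mutual_info(n), 2 * H_S)
```

In the paper's language, the fragment's information here is exactly its entropy increase; degrading the environment's initial state (e.g., mixing it) is what erodes this plateau.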
Future conflicts will necessitate the ability to conduct effective military operations in a contested information environment. Building and maintaining robust situational awareness, protecting the decision-making effectiveness of individuals and teams, and fighting through information attacks launched both in and through the cyberspace domain will be essential. Increasing knowledge of the mechanisms that degrade task performance and decision-making during cyber attacks will enable the development of advanced human-centered defensive techniques that aid fight-through capability. In this position paper, the development and evaluation of software that simulates real-time and persistent manipulation of the information environment is discussed. Results of the evaluation indicated that the task performance of a team of decision-makers performing collaborative tasks could be degraded through real-time manipulation of cyberspace content and operation. The paper concludes with a discussion of focus and direction for future research and development. It is suggested that a deeper understanding is required of the perceptual and cognitive factors significant in the relationship between information-environment manipulation and reduced task performance. This understanding will aid in the defence against cyberspace attacks, in fight-through and mission assurance, and in the work of the Information Operations community.
Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra
One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.
Lucarella, D; Zanzi, A [ENEL s.p.a., Centro Ricerca di Automatica, Cologno Monzese, Milan (Italy)
The authors present a graph-based object model that may be used as a uniform framework for direct manipulation of multimedia information. After an introduction motivating the need for abstraction and structuring mechanisms in hypermedia systems, the authors introduce the data model and the notion of perspective, a form of data abstraction that acts as a user interface to the system, providing control over the visibility of the objects and their properties. A perspective is defined to include an intention and an extension. The authors present a visual retrieval environment that effectively combines filtering, browsing, and navigation to provide an integrated view of the retrieval problem. Design and implementation issues are outlined for MORE (Multimedia Object Retrieval Environment), a prototype system relying on the proposed model. The focus is on the main user interface functionalities, and actual interaction sessions are presented, including schema creation, information loading, and information retrieval.
Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge
Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
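One concrete member of the capacity family discussed above is the linear memory capacity of a driven dynamical system: the summed squared correlation between delayed inputs and their best linear reconstructions from the state, which the theory bounds by the number of state variables. The sketch below estimates it for a small echo state network; it is an illustrative reimplementation under our own parameter choices, not the paper's numerical setup.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 20, 20000, 200

# Random reservoir, rescaled to spectral radius 0.9 (fading memory regime)
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, N)

u = rng.uniform(-1, 1, T)                  # i.i.d. input signal
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])
X = x[washout:]                            # discard the transient

def capacity(delay):
    """Squared correlation between the best linear readout and u(t - delay)."""
    y = u[washout - delay: T - delay]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.corrcoef(X @ w, y)[0, 1] ** 2

caps = [capacity(d) for d in range(0, 2 * N)]
total = sum(caps)
# The zero-delay input is recovered almost perfectly, while the summed
# capacity stays below the number of state variables N, as the bound requires.
assert caps[0] > 0.9 and total < N
```

The shortfall of `total` below N reflects the paper's trade-off: the tanh nonlinearity spends part of the finite capacity budget on nonlinear functions of the input at the expense of linear memory.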
Oprea, I.; Oprea, M.; Stoica, M.; Badea, E.; Guta, V.
The real-time information and processing system has as its main task to record, collect, process, and transmit radiation levels and weather data; it is proposed for radiation protection, environmental monitoring around nuclear facilities, and civil defence. Such a system can provide mapping, database, modelling, and communication capabilities and can assess the consequences of nuclear accidents. The system incorporates a number of stationary or mobile radiation monitoring instruments, a weather parameter measuring station, a GIS-based information processing center, and the communication network, all running on a real-time operating system. It provides automatic on-line and off-line data collection, remote diagnostics, and advanced presentation techniques, including graphically oriented executive support that can respond to an emergency by geographical representation of the hazard zones on the map. The system can be integrated into national or international environmental monitoring systems, being based on local intelligent measuring and transmission units, simultaneous processing, and data presentation using a real-time PC operating system and a geographical information system (GIS). Such an integrated system is composed of independent applications operating on the same computer, which can improve the protection of the population and support decision makers' efforts by updating the remote GIS database. All information can be managed directly from the map through multilevel data retrieval and presentation, using on-line dynamic evolution of events, environment information, evacuation optimization, and image and voice processing.
The Information Service is a fundamental component in a grid environment. It has to meet many requirements, such as access to static and dynamic information related to grid resources, efficient and secure access to dynamic data, decentralized maintenance, and fault tolerance, in order to achieve better performance, scalability, security, and extensibility. Currently there are two different major approaches: one based on a directory infrastructure and a novel one that exploits a relational DBMS. In this paper we present a performance comparison between the Grid Resource Information Service (GRIS) and the Local Dynamic Grid Catalog relational information service (LDGC), also providing information about two projects (iGrid and Grid Relational Catalog) in the grid data management area.
Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth
In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing-whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.
Hamada, Mohamed; Hassan, Mohammed
Interactive learning tools are emerging as effective educational materials in the area of computer science and engineering. It is a research domain that is rapidly expanding because of its positive impacts on motivating and improving students' performance during the learning process. This paper introduces an interactive learning environment for…
Yakov Mikhajlovich Dalinger
The organization of service production in airport activity is analyzed. The importance and topicality of solving the problem of information interaction between production processes, as a problem of organizing modern production, are shown. Possibilities and features of constructing an information interaction system in the form of a multi-level hierarchical structure are presented. The airport is considered as an enterprise engaged in service production, where much information must be analyzed in a limited time frame. The production schedule often changes under the influence of many factors. This increases the role of computerization and informatization of production processes, which calls for automation of production, creation of an information environment, and organization of the information interaction needed to realize production processes. An integrated organizational form is proposed because it is oriented to the integration of different processes into a universal production system and allows the local goals of particular processes to be coordinated with the global goal of improving the effectiveness of airport activity. The main conditions needed for organizing information interaction between production processes and technological operations are considered, and a list of related problems is determined. Attention is paid to the need for compatibility of the structure and organization of the interaction system with the conditions of the airline, and for its reflection in the information space of the airline. The usefulness of an integrated organizational form of information interaction, based on information exchange between processes and service customers according to a network structure, is explained. The multi-level character of this structure confirms its advantage over other schemes; however, it also has a series of features presented…
Li, Ximeng; Nielson, Flemming; Nielson, Hanne Riis
The security validation of practical computer systems calls for the ability to specify and verify information flow policies that are dependent on data content. Such policies play an important role in concurrent, communicating systems: consider a scenario where messages are sent to different processes according to their tagging. We devise a security type system that enforces content-dependent information flow policies in the presence of communication and concurrency. The type system soundly guarantees a compositional noninterference property. All theoretical results have been formally proved.
Zobrist, A. L.; Bryant, N. A.
Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.
Bhattacherjee, Anol; Sanford, Clive Carlton
This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects persist over time. Drawing on the elaboration-likelihood model (ELM), we compared two alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were respectively operationalized using the argument quality and source credibility constructs, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We further examined how these influence processes were moderated by users' IT expertise and perceived job relevance, and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern…
Huck, Friedrich O. (Editor); Park, Stephen K. (Editor)
This publication is a compilation of the papers presented at the NASA conference on Visual Information Processing for Television and Telerobotics. The conference was held at the Williamsburg Hilton, Williamsburg, Virginia on May 10 to 12, 1989. The conference was sponsored jointly by NASA Offices of Aeronautics and Space Technology (OAST) and Space Science and Applications (OSSA) and the NASA Langley Research Center. The presentations were grouped into three sessions: Image Gathering, Coding, and Advanced Concepts; Systems; and Technologies. The program was organized to provide a forum in which researchers from industry, universities, and government could be brought together to discuss the state of knowledge in image gathering, coding, and processing methods.
This report documents the impact analysis of a proposed Defense Waste Processing Facility (DWPF) for immobilizing high-level waste currently being stored on an interim basis at the Savannah River Plant (SRP). The DWPF will process the waste into a form suitable for shipment to and disposal in a federal repository. The DWPF will convert the high-level waste into: a leach-resistant form containing above 99.9% of all the radioactivity, and a residue of slightly contaminated salt. The document describes the SRP site and environs, including population, land and water uses; surface and subsurface soils and waters; meteorology; and ecology. A conceptual integrated facility for concurrently producing glass waste and saltcrete is described, and the environmental effects of constructing and operating the facility are presented. Alternative sites and waste disposal options are addressed. Also environmental consultations and permits are discussed
Barrett, M.D.; Schaetz, T.; Chiaverini, J.; Leibfried, D.; Britton, J.; Itano, W.M.; Jost, J.D.; Langer, C.; Ozeri, R.; Wineland, D.J.; Knill, E.
We summarize two experiments on the creation and manipulation of multi-particle entangled states of trapped atomic ions - quantum dense coding and quantum teleportation. The techniques used in these experiments constitute an important step toward performing large-scale quantum information processing. The techniques also have application in other areas of physics, providing improvement in quantum-limited measurement and fundamental tests of quantum mechanical principles, for example.
errors (that is, of the output of the human operator). There is growing evidence (Senders, personal communication; Norman, personal communication) … relates to the relative tendency to depend on sensory information or to be more analytic and independent. Norman (personal communication) has referred … decision process model. Ergonomics, 12, 543-557. Senders, J., Elkind, J., Grignetti, M., & Smallwood, R. 1966. An investigation of the visual sampling…
The Fifth Generation Computer Project in Japan intends to develop a new generation of computers by extensive research in many areas. This paper discusses many research topics which the Japanese are hoping will lead to a radical new knowledge information processing system. Topics discussed include new computer architecture, programming styles, semantics of programming languages, relational databases, linguistics theory, artificial intelligence, functional images, and inference systems.
Yukalov, V. I.; Sornette, D.
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has recently been advanced as a novel variant of decision making based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention interference.
I will describe how cold atoms can be manipulated to realize arrays of addressable qubits as prototype quantum registers, focussing on how atom chips can be used in combination with cavity QED techniques to form such an array. I will discuss how the array can be generated and steered using optical lattices and the Mott transition, and describe the sources of noise and how these place limits on the use of such chips in quantum information processing. (author)
Quantum computers are information processing devices which operate by and exploit the laws of quantum mechanics, potentially allowing them to solve problems which are intractable using classical computers. This dissertation considers the practical issues involved in one of the more successful implementations to date, nuclear magnetic resonance (NMR). Techniques for dealing with systematic errors are presented, and a quantum protocol is implemented. Chapter 1 is a brief introduction to quantum computation. The physical basis of its efficiency and issues involved in its implementation are discussed. NMR quantum information processing is reviewed in more detail in Chapter 2. Chapter 3 considers some of the errors that may be introduced in the process of implementing an algorithm, and high-level ways of reducing the impact of these errors by using composite rotations. Novel general expressions for stabilising composite rotations are presented in Chapter 4 and a new class of composite rotations, tailored composite rotations, presented in Chapter 5. Chapter 6 describes some of the advantages and pitfalls of combining composite rotations. Experimental evaluations of the composite rotations are given in each case. An actual implementation of a quantum information protocol, approximate quantum cloning, is presented in Chapter 7. The dissertation ends with appendices which contain expansions of some equations and detailed calculations of certain composite rotation results, as well as spectrometer pulse sequence programs. (author)
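The composite-rotation idea discussed in Chapters 3 to 6 can be illustrated with the classic Wimperis BB1 sequence for pulse-length errors (a standard example, not necessarily the dissertation's own tailored rotations). The sketch below builds single-qubit rotations as 2x2 unitaries and checks that the composite pulse beats the bare pulse under a systematic 5% over-rotation; names and error model are our own.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def R(theta, phi):
    """Single-qubit rotation by theta about the equatorial axis at angle phi."""
    G = np.cos(phi) * X + np.sin(phi) * Y
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * G

def bb1(theta, eps):
    """Wimperis BB1 composite rotation under fractional pulse-length error eps:
    every nominal flip angle is executed as (1 + eps) times itself."""
    phi = np.arccos(-theta / (4 * np.pi))
    s = 1 + eps
    return R(s * np.pi, phi) @ R(s * 2 * np.pi, 3 * phi) @ R(s * np.pi, phi) @ R(s * theta, 0)

def infidelity(U, theta):
    """Deviation of U from the ideal target rotation R(theta, 0)."""
    return 1 - abs(np.trace(R(theta, 0).conj().T @ U)) / 2

theta, eps = np.pi / 2, 0.05
plain = infidelity(R((1 + eps) * theta, 0), theta)
robust = infidelity(bb1(theta, eps), theta)
# BB1 suppresses pulse-length error to high order, so the composite pulse is
# orders of magnitude closer to the target than the bare over-rotated pulse.
assert robust < plain / 100
```

This is exactly the kind of high-level, hardware-independent error suppression the dissertation evaluates experimentally on an NMR spectrometer.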
A five-year design process of a continuous process wok has been studied with the aim of elucidating the conditions for integrating working environment aspects. The design process is seen as a network-building activity and as a social shaping process of the artefact. A working environment log is suggested as a tool designers can use to integrate considerations of future operators' working environment.
Y. K. Khenner
Full Text Available The paper regards the development of the information and education environment of Russian universities as an important condition for successful reorganization of higher education. Taking as an example one of the US universities, the author demonstrates the capacity of an information education environment and its impact on the multilevel educational process. The comparative analysis of the existing information education environments of Russian and American universities reveals that such effective improvements as an increasing number of students working on individual curricula, implementation of learning outcome monitoring, inclusive education, etc. require immediate and substantial development of the information education environment of Russian universities. Both the development level and informational content of the environment in question remain unsatisfactory for financial, economic and staff-related reasons. Consequently, higher school reorganization is lagging behind, held back by the lack of synchronization between the attempts to improve education quality and competitiveness on the one hand, and the insufficient level of the information education environment characteristic of Russian universities on the other.
To achieve radiationless dd fusion and/or other LENR reactions via chemistry: some focus on environment of interior or altered near-surface volume of bulk metal; some on environment inside metal nanocrystals or on their surface; some on the interface between nanometal crystals and ionic crystals; some on a momentum shock-stimulation reaction process. Experiment says there is also a spontaneous reaction process.
Robertson, R. John; Barton, Jane
The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...
Ros Izquierdo, María
Ambient Intelligence is a new research line in the field of Artificial Intelligence. Under this paradigm, users interact with an environment that is equipped with different kinds of sensors and actuators. Thanks to those devices, AmI applications collect users' activities and exploit that information in order to learn users' activities and to be able to anticipate their needs. In this thesis, we present a method to understand user daily activities in order to help people to improve their ...
Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from the first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can be identified using coarse graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse graining approaches identify features that are essential for certain processes performed by underlying biological networks. We find that long-range connections in the brain allow for global scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with temporal scales. Our observations indicate that the rules in multivariate signal processing are quite different from traditional single unit signal processing.
Globalization has created important changes and transformations across the world. These political, social, economic and cultural changes have considerably affected communication. The number of mass media instruments has increased, informatics has improved, and reaching information has become easier with the globalization of communication. New communication instruments and environments have been created. Globalised communication has also affected people, reaching the information ...
Bochulya Tetyana V.
The goal of the article lies in the study of the logical and methodological justification of forming an integrated system of accounting, based on the realities of the co-ordinated transformation of society and the economy and on the development of new knowledge about the formation and adjustment of the accounting system in its a priori new information competence, with expanded functionality for the justified idea of the existence and development of business. Taking the developments of the best representatives of the leading scientific society as a basis, the article offers a new vision of the organisation of the accounting system, based on a modern projection of information competence and the harmonisation of the main processes of information service, adapting the system to the multi-vector inquiries of consumers of information. Pursuant to the results of the conducted study, the article makes an effort to change the established opinion about the information and professional competences of the accounting system and to attach a new qualitative significance to them. The article proposes calculating the quality of the information system on the basis of key indicators of its information service. It lays the foundation for prospective study of the problems of building the accounting system in such a projection that the realities of internal and external processes are maximally co-ordinated, based on the idea of their information development.
Aalst, van der W.M.P.; Wah, B.W.
Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process-aware".
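As a minimal illustrative sketch (not from the paper) of the control-flow enforcement such workflow management systems provide, consider a process definition as a table of allowed task transitions; the task names and transitions below are invented for the example.

```python
# Hypothetical order-handling process: each task maps to the set of
# tasks that may legally follow it in the workflow definition.
WORKFLOW = {
    "receive_order": {"check_stock"},
    "check_stock": {"ship", "reject"},
    "ship": set(),
    "reject": set(),
}

def run(tasks):
    """Accept a task sequence only if every step follows the definition."""
    for prev, nxt in zip(tasks, tasks[1:]):
        if nxt not in WORKFLOW[prev]:
            raise ValueError(f"transition {prev} -> {nxt} not allowed")
    return tasks[-1]

print(run(["receive_order", "check_stock", "ship"]))  # ship
```

The same check is what distinguishes a process-aware system from a plain database: the order of operations is itself part of the model.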
Hutchison, Catherine; McCreaddie, May
The aim of this project was to produce audiovisual patient information, which was user friendly and fit for purpose. The purpose of the audiovisual patient information is to inform patients about randomized controlled trials, as a supplement to their trial-specific written information sheet. Audiovisual patient information is known to be an effective way of informing patients about treatment. User involvement is also recognized as being important in the development of service provision. The aim of this paper is (i) to describe and discuss the process of developing the audiovisual patient information and (ii) to highlight the challenges and opportunities, thereby identifying implications for practice. A future study will test the effectiveness of the audiovisual patient information in the cancer clinical trial setting. An advisory group was set up to oversee the project and provide guidance in relation to information content, level and delivery. An expert panel of two patients provided additional guidance and a dedicated operational team dealt with the logistics of the project including: ethics; finance; scriptwriting; filming; editing and intellectual property rights. Challenges included the limitations of filming in a busy clinical environment, restricted technical and financial resources, ethical needs and issues around copyright. There were, however, substantial opportunities that included utilizing creative skills, meaningfully involving patients, teamworking and mutual appreciation of clinical, multidisciplinary and technical expertise. Developing audiovisual patient information is an important area for nurses to be involved with. However, this must be performed within the context of the multiprofessional team. Teamworking, including patient involvement, is crucial as a wide variety of expertise is required. Many aspects of the process are transferable and will provide information and guidance for nurses, regardless of specialty, considering developing this
Denton, T A; Matloff, J M
The rapid change occurring in American healthcare is a direct response to rising costs. Managed care is the fastest growing model that attempts to control escalating costs through limitations in patient choice, the active use of guidelines, and placing providers at risk. Managed care is an information intensive system, and those providers who use information effectively will be at an advantage in the competitive healthcare marketplace. There are five classes of information that providers must collect to be competitive in a managed care environment: patient satisfaction, medical outcomes, continuous quality improvement, quality of the decision, and financial data. Each of these should be actively used in marketing, assuring the quality of patient care, and maintaining financial stability. Although changes in our healthcare system are occurring rapidly, we need to respond to the marketplace to maintain our viability, but as physicians, we have the singular obligation to maintain the supremacy of the individual patient and the physician-patient relationship.
Gallistel, C R.
The framework provided by Claude Shannon's [Bell Syst. Technol. J. 27 (1948) 623] theory of information leads to a quantitatively oriented reconceptualization of the processes that mediate conditioning. The focus shifts from processes set in motion by individual events to processes sensitive to the information carried by the flow of events. The conception of what properties of the conditioned and unconditioned stimuli are important shifts from the tangible properties to the intangible properties of number, duration, frequency and contingency. In this view, a stimulus becomes a CS if its onset substantially reduces the subject's uncertainty about the time of occurrence of the next US. One way to represent the subject's knowledge of that time of occurrence is by the cumulative probability function, which has two limiting forms: (1) The state of maximal uncertainty (minimal knowledge) is represented by the inverse exponential function for the random rate condition, in which the US is equally likely at any moment. (2) The limit to the subject's attainable certainty is represented by the cumulative normal function, whose momentary expectation is the CS-US latency minus the time elapsed since CS onset. Its standard deviation is the Weber fraction times the CS-US latency.
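The two limiting forms of the cumulative probability function described above can be sketched numerically; the rate, latency, and Weber fraction values below are illustrative, not taken from the paper.

```python
import math

def cdf_random_rate(t, rate):
    """Maximal uncertainty: the US is equally likely at any moment,
    so the waiting time is exponential (inverse exponential CDF)."""
    return 1.0 - math.exp(-rate * t)

def cdf_timed(t, cs_us_latency, elapsed, weber=0.15):
    """Limit of attainable certainty: cumulative normal whose momentary
    expectation is the CS-US latency minus the time elapsed since CS
    onset, with standard deviation = Weber fraction * latency."""
    mu = cs_us_latency - elapsed
    sigma = weber * cs_us_latency
    return 0.5 * (1.0 + math.erf((t - mu) / (sigma * math.sqrt(2.0))))

# With a 10 s CS-US latency and 4 s elapsed since CS onset, the timed
# estimate is centred on 6 s; the random-rate estimate rises smoothly.
print(round(cdf_timed(6.0, 10.0, 4.0), 2))   # 0.5 at the expected time
print(round(cdf_random_rate(6.0, 0.1), 2))
```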
van der Meer, Matthijs; Kurth-Nelson, Zeb; Redish, A David
Decisions result from an interaction between multiple functional systems acting in parallel to process information in very different ways, each with strengths and weaknesses. In this review, the authors address three action-selection components of decision-making: The Pavlovian system releases an action from a limited repertoire of potential actions, such as approaching learned stimuli. Like the Pavlovian system, the habit system is computationally fast but, unlike the Pavlovian system permits arbitrary stimulus-action pairings. These associations are a "forward'' mechanism; when a situation is recognized, the action is released. In contrast, the deliberative system is flexible but takes time to process. The deliberative system uses knowledge of the causal structure of the world to search into the future, planning actions to maximize expected rewards. Deliberation depends on the ability to imagine future possibilities, including novel situations, and it allows decisions to be taken without having previously experienced the options. Various anatomical structures have been identified that carry out the information processing of each of these systems: hippocampus constitutes a map of the world that can be used for searching/imagining the future; dorsal striatal neurons represent situation-action associations; and ventral striatum maintains value representations for all three systems. Each system presents vulnerabilities to pathologies that can manifest as psychiatric disorders. Understanding these systems and their relation to neuroanatomy opens up a deeper way to treat the structural problems underlying various disorders.
The recognition heuristic (RH; Goldstein and Gigerenzer, 2002) suggests that, when applicable, probabilistic inferences are based on a noncompensatory examination of whether an object is recognized or not. The overall findings on the processes that underlie this fast and frugal heuristic are somewhat mixed, and many studies have expressed the need to consider a more compensatory integration of recognition information. Regardless of the mechanism involved, it is clear that recognition has a strong influence on choices, and this finding might be explained by the fact that recognition cues arouse affect and thus receive more attention than cognitive cues. To test this assumption, we investigated whether recognition results in a direct affective signal by measuring physiological arousal (i.e., peripheral arterial tone) in the established city-size task. We found that recognition of cities does not directly result in increased physiological arousal. Moreover, the results show that physiological arousal increased with increasing inconsistency between recognition information and additional cue information. These findings support predictions derived from a compensatory Parallel Constraint Satisfaction model rather than predictions of noncompensatory models. Additional results concerning confidence ratings, response times, and choice proportions further demonstrated that recognition information and other cognitive cues are integrated in a compensatory manner.
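The contrast between noncompensatory and compensatory processing can be illustrated with a toy city-size choice; the cue values and weights below are invented for the example and do not model the authors' Parallel Constraint Satisfaction account in detail.

```python
def recognition_heuristic(a, b):
    """Noncompensatory: if exactly one object is recognized, infer it
    is larger and ignore all other cues."""
    if a["recognized"] != b["recognized"]:
        return a if a["recognized"] else b
    return None  # heuristic not applicable

def weighted_additive(a, b, weights):
    """Compensatory: integrate all cues, so additional cognitive cues
    can overturn the recognition cue."""
    score = lambda o: sum(w * o[c] for c, w in weights.items())
    return a if score(a) >= score(b) else b

city_a = {"recognized": 1, "has_airport": 0, "is_capital": 0}
city_b = {"recognized": 0, "has_airport": 1, "is_capital": 1}
weights = {"recognized": 0.5, "has_airport": 0.4, "is_capital": 0.4}

print(recognition_heuristic(city_a, city_b) is city_a)       # True
print(weighted_additive(city_a, city_b, weights) is city_b)  # True
```

The two rules diverge exactly when recognition conflicts with the other cues, which is the inconsistency condition where the study found increased physiological arousal.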
This paper describes an approach to the reusability of software engineering technology in the area of ground space system design. System engineers have many needs similar to those of software developers: sharing of a common database, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their systems dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life cycle activity. In late '92, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e. a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the provision of Commercial Off-The-Shelf tools. ELISA was delivered to CNES in January '94 and is currently used for demonstrations and evaluations on real projects (e.g. the SPOT4 Satellite Control Center). It is on the way to new evolutions.
Doering, Aaron; Henrickson, Jeni
Self-directed, inquiry-based learning opportunities focused on transdisciplinary real-world problem solving have been shown to foster creativity in learners. What tools might we provide classroom teachers to scaffold them and their students through this creative process? This study examines an online informal learning environment and the role the…
The article considers problems of theoretical significance concerning the terminology used in distance education, practical aspects of designing methods of didactic implementation, and practical solutions for organizing an information education environment and the personality-oriented implementation of its educational process in secondary general education schools.
Andersen, Lasse Mejling
This PhD thesis treats applications of nonlinear optical effects for quantum information processing. The two main applications are four-wave mixing in the form of Bragg scattering (BS) for quantum-state-preserving frequency conversion, and sum-frequency generation (SFG) in second-order nonlinear … -chirping the pumps. In the high-conversion regime without the effects of NPM, exact Green functions for BS are derived. In this limit, separability is possible for conversion efficiencies up to 60%. However, the system still allows for selective frequency conversion as well as re-shaping of the output. One way…
Quantum wells, alternate thin layers of two different semiconductor materials, show an exceptional electric field dependence of the optical absorption, called the quantum-confined Stark effect (QCSE), for electric fields perpendicular to the layers. This enables electrically controlled optical modulators and optically controlled self-electro-optic-effect devices that can operate at high speed and low energy density. Recent developments in these QCSE devices are summarized, including new device materials and novel device structures. The variety of sophisticated devices now demonstrated is promising for applications to information processing
Burgess, Ann W; Clements, Paul T
Sexual abuse is considered to be a pandemic contemporary public health issue, with significant physical and psychosocial consequences for its victims. However, the incidence of elder sexual assault is difficult to estimate with any degree of confidence. A convenience sample of 284 case records were reviewed for Post-Traumatic Stress Disorder (PTSD) symptoms. The purpose of this paper is to present the limited data noted on record review on four PTSD symptoms of startle, physiological upset, anger, and numbness. A treatment model for information processing of intrapsychic trauma is presented to describe domain disruption within a nursing diagnosis of rape trauma syndrome and provide guidance for sensitive assessment and intervention.
Natural resources, especially energy resources, have continuously influenced the evolution of human society, including economic development, and so the problem of their scarcity and limited character is of major interest for humankind in its quest to find the balance between the need for economic expansion and environmental protection. The purpose of this paper is to show the importance of energy efficiency by assuming two main directions of action: increasing the quantity of renewable energy and improving energy efficiency. Following our research, we bring to attention the main mechanisms used to ensure the sustainability, security and competitiveness of the energy sector. These practices serve the objectives of the sustainable development principle, exemplified from an accounting point of view through a new instrument in economic theory: environmental accounting, which provides the background for the recognition, evaluation and presentation of environmental information.
The Internet provides a convenient environment to share geographic information. Web GIS (Geographic Information System) even provides users a direct access environment to geographic databases through the Internet. However, the complexity of geographic data makes it difficult for users to understand the real content and the limitations of geographic information. In some cases, users may misuse the geographic data and make wrong decisions. Meanwhile, geographic data are distributed across various government agencies, academic institutes, and private organizations, which makes it even more difficult for users to fully understand the content of these complex data. To overcome these difficulties, this research uses metadata as a guiding mechanism for users to fully understand the content and the limitations of geographic data. We introduce three metadata standards commonly used for geographic data and metadata authoring tools available in the US. We also review the current development of the geographic metadata standard in Taiwan. Two metadata authoring tools are developed in this research, which will enable users to build their own geographic metadata easily. [Article content in Chinese]
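The guiding role of metadata described above can be sketched as a completeness check; the required-element list below is a hypothetical simplification, not the actual FGDC or ISO element set, and the sample record is invented.

```python
# Hypothetical core elements a geographic metadata record must carry
# before users can judge the content and limitations of the data.
REQUIRED = {"title", "abstract", "spatial_extent", "date", "contact"}

def validate(record):
    """Return the set of required elements missing from the record."""
    return REQUIRED - record.keys()

record = {
    "title": "Taiwan road network",
    "abstract": "Vector road centerlines",
    "spatial_extent": "119.3E-122.0E, 21.9N-25.3N",
    "date": "1999",
}
print(sorted(validate(record)))  # ['contact']
```

An authoring tool of the kind the paper develops would prompt the user for each missing element before the record is published.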
Palau, Sandra; Pardo, Juan Carlos
We consider continuous state branching processes that are perturbed by a Brownian motion. These processes are constructed as the unique strong solution of a stochastic differential equation. The long-term extinction and explosion behaviours are studied. In the stable case, the extinction and explosion probabilities are given explicitly. We find three regimes for the asymptotic behaviour of the explosion probability and, as in the case of branching processes in random environment, we find five...
Gulli, Michael A.
Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.
Scholten, Lotte; van Knippenberg, Daan; Nijstad, Bernard A.; De Dreu, Carsten K. W.
Integrating dual-process models [Chaiken, S., & Trope, Y. (Eds.). (1999). Dual-process theories in social psychology. New York: Guilford Press] with work on information sharing and group decision-making [Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: biased
This presentation deals with external electronic information sources (e-sources), i.e. databases that are not created by users or their institutes. Databases are compiled by data producers, who publish them in different forms and offer them to users in different forms. In the first part of the contribution, e-sources are described in general. In the second part, some of the most significant databases about the environment available online on the Internet are described in detail.
Routray, Bijayalaxmi; Satpathy, Sunil
Stress is the change our bodies experience as we adjust to our continually changing environment. It has been an integral part of daily life since prehistoric times, and Library & Information Science personnel are no exception. Thus we cannot avoid stress in our lives; rather, the best policy is to manage it properly to increase our efficiency. This article attempts to define stress in the light of the LIS profession. It describes the types of stress in libraries and its rea...
Du, Xiangyun; Kolmos, Anette
with the expected professional competencies. Based on the educational practice of the PBL Aalborg Model, which is characterized by problem-orientation, project-organization and teamwork, this paper examines the process of developing process competencies through studying engineering in a PBL environment from … process competencies through doing problem- and project-based work in teams? 2) How do students perceive their achievement of these process competencies? … Future engineers are not only required to master technological competencies concerning solving problems, producing and innovating technology, they are also expected to have capabilities of cooperation, communication, and project management in diverse social contexts, which are referred to as process…
Furusawa, Akira; Takei, Nobuyuki
Quantum teleportation is one of the most important subjects in quantum information science. This is because quantum teleportation can be regarded as not only quantum information transfer but also a building block for universal quantum information processing. Furthermore, deterministic quantum information processing is very important for efficient processing and it can be realized with continuous-variable quantum information processing. In this review, quantum teleportation for continuous variables and related quantum information processing are reviewed from these points of view
This manual draws together information on the environmental consequences of energy technologies that will be in use in the United States during the next 20 years. We hope it will prove useful to planners, policymakers, legislators, researchers, and environmentalists. The information on environmental issues, control technologies, and energy production and conservation processes should also be a convenient starting point for deeper exploration. Published references are given for the statements, data, and conclusions so that the interested reader can obtain more detailed information where necessary. Environmental aspects of energy technologies are presented in a form suitable for government and public use and are intended to assist decisionmakers, researchers, and the public with basic information and references that can be relied upon through changing policies and changing world energy prices
Ollivier, Harold; Poulin, David; Zurek, Wojciech H.
We study the role of the information deposited in the environment of an open quantum system in the course of the decoherence process. Redundant spreading of information--the fact that some observables of the system can be independently read off from many distinct fragments of the environment--is investigated as the key to effective objectivity, the essential ingredient of classical reality. This focus on the environment as a communication channel through which observers learn about physical systems underscores the importance of quantum Darwinism--selective proliferation of information about 'the fittest states' chosen by the dynamics of decoherence at the expense of their superpositions--as redundancy imposes the existence of preferred observables. We demonstrate that the only observables that can leave multiple imprints in the environment are the familiar pointer observables singled out by environment-induced superselection (einselection) for their predictability. Many independent observers monitoring the environment will therefore agree on properties of the system as they can only learn about preferred observables. In this operational sense, the selective spreading of information leads to appearance of an objective classical reality from within the quantum substrate
Shirley Guimarães Pimenta
The interaction amongst the 'user', 'information', and 'text' is of interest to Information Science, although it has received insufficient attention in the literature. This issue is addressed by this paper, whose main purpose is to contribute to the discussion of the theoretical affinity between the cognitive viewpoint in Information Science and the information processing approach in Cognitive Psychology. Firstly, the interdisciplinary nature of Information Science is discussed and justified as a means to deepen and strengthen its theoretical framework. Such interdisciplinarity helps to avoid stagnation and keep pace with other disciplines. Secondly, the discussion takes into consideration the cognitive paradigm, which gives rise to the cognitive viewpoint approach in Information Science. It is highlighted that the cognitive paradigm represented a change in the Social Sciences due to the shift of focus from the object and the signal to the individual. Besides that, it sheds light on the notion of models of worlds, i.e., the systems of categories and concepts that guide the interaction between the individual and his/her environment. Thirdly, the theoretical assumptions of the cognitive viewpoint approach are discussed, with emphasis on the concept of 'information' as resulting from cognitive processes and as related to the notion of 'text'. This approach points out the relevance of understanding the interaction amongst users, information, and text. However, it lacks further development. Using notions which are common to both approaches, some of the gaps can be filled. Finally, the concept of 'text', its constituents and structures are presented from the perspective of text comprehension models and according to the information processing approach. As a concluding remark, it is suggested that bringing together the cognitive viewpoint and the information processing approach can be enriching and fruitful to both Information
A Virtual Building Environment (VBE) is a "place" where building industry project staff can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It reports on the VBE Initiative and the benefits from a couple of early VBE projects.
Wang, Xianwen; Liu, Zhiguo; Zhang, Wenchang; Wu, Qingfu; Tan, Shulin
We have designed a mobile operating room information management system. The system is composed of a client and a server. The client, consisting of a PC, medical equipment, a PLC and sensors, provides the acquisition and processing of anesthesia and micro-environment data. The server is a powerful computer that stores the system's data. The client gathers medical device data using the C/S mode and analyzes the obtained HL7 messages through class library calls. The client collects micro-environment information with the PLC and completes the data reading with OPC technology. Experimental results showed that the designed system could manage patient anesthesia and micro-environment information well, improving the efficiency of the doctors' work and the digital level of the mobile operating room.
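HL7 v2 messages of the kind such a client analyzes are pipe-delimited segments; the following is a simplified sketch of splitting one into fields, with invented monitor values, not a full HL7 parser and not the system's actual class library.

```python
def parse_hl7(message):
    """Split an HL7 v2-style message into {segment_id: [field lists]}.
    Segments are separated by carriage returns, fields by '|'."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

# Toy OBX (observation) segments carrying anesthesia-monitor values.
msg = "MSH|^~\\&|MONITOR|OR1\rOBX|1|NM|HR||72\rOBX|2|NM|SpO2||98"
parsed = parse_hl7(msg)
heart_rate = int(parsed["OBX"][0][4])
print(heart_rate)  # 72
```

In a real deployment a library such as an HL7 engine would also handle the encoding characters declared in the MSH segment, which this sketch ignores.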
Bapat, Shweta S; Patel, Harshali K; Sansgiry, Sujit S
In this study, we evaluate the role of information anxiety and information load on the intention to read information from prescription drug information leaflets (PILs). These PILs were developed based on the principles of information load and consumer information processing. This was an experimental, prospective, repeated-measures study conducted in the United States in which 360 university students (>18 years old; 62% response rate) participated. Participants were presented with a scenario followed by exposure to the three drug product information sources used to operationalize information load. The three sources were: (i) current practice; (ii) pre-existing one-page text-only leaflets; and (iii) interventional one-page prototype PILs designed for the study. Information anxiety was measured as the anxiety experienced by the individual when encountering information. The outcome variable, intention to read PILs, was defined as the likelihood that the patient will read the information provided in the leaflets. A survey questionnaire was used to capture the data, and the objectives were analyzed by performing a repeated-measures MANOVA using SAS version 9.3. When compared to current practice and one-page text-only leaflets, one-page PILs had significantly lower scores on both information anxiety and information load. Information anxiety and information load significantly impacted intention to read (p < 0.001). Newly developed PILs increased patients' intention to read and can help improve the counseling services provided by pharmacists.
Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner
Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is
Shimizu, H.; Yamaguchi, Y.
Animals, including human beings, have the ability to understand the meaning of indefinite information from their environments, and thanks to this ability they behave flexibly under environmental changes. Starting from the hypothesis that understanding of input (Shannonian) information is based on the self-organization of a neuronal representation, that is, a spatio-temporal pattern composed of coherent activities of neurons encoding a "figure", separated from the "background" encoded by incoherent activities, the conditions necessary for the understanding of indefinite information are discussed. The crucial conditions revealed are that the neuronal system is incomplete or indefinite, in the sense that its rules for the self-organization of neuronal activities are completed only after the input of environmental information, and that it has an additional system, the "self", to relevantly self-organize dynamical "constraints" or "boundary conditions" for the self-organization of the representation. For the simultaneous self-organization of the relevant constraints and the representation, a global circulation of activities must be self-organized between these two kinds of neuronal systems. Moreover, for the performance of these functions, a specific kind of synergetic elements, "holon elements", is also necessary. By means of a neuronal model, the visual perception of indefinite input signals is demonstrated. The results obtained are consistent with those recently observed in the visual cortex of cats.
It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics, as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, yet the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the
Khoury, Antonio Z. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil)
Full text: In this work we discuss several proposals for quantum information processing using the transverse structure of paraxial beams. Different techniques for production and manipulation of optical vortices have been employed and combined with polarization transformations in order to investigate fundamental properties of quantum entanglement as well as to propose new tools for quantum information processing. As an example, we have recently proposed and demonstrated a controlled NOT (CNOT) gate based on a Michelson interferometer in which the photon polarization is the control bit and the first order transverse mode is the target. The device is based on a single lens design for an astigmatic mode converter that transforms the transverse mode of paraxial optical beams. In analogy with Bell's inequality for two-qubit quantum states, we propose an inequality criterion for the non-separability of the spin-orbit degrees of freedom of a laser beam. A definition of separable and non-separable spin-orbit modes is used in consonance with the one presented in Phys. Rev. Lett. 99, 2007. As the usual Bell's inequality can be violated for entangled two-qubit quantum states, we show both theoretically and experimentally that the proposed spin-orbit inequality criterion can be violated for non-separable modes. The inequality is discussed both in the classical and quantum domains. We propose a polarization to orbital angular momentum teleportation scheme using entangled photon pairs generated by spontaneous parametric down conversion. By making a joint detection of the polarization and angular momentum parity of a single photon, we are able to detect all the Bell-states and perform, in principle, perfect teleportation from a discrete to a continuous system using minimal resources. The proposed protocol implementation demands experimental resources that are currently available in quantum optics laboratories. (author)
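The polarization-controlled CNOT described above can be written down concretely. The sketch below is illustrative only: it shows the gate's 4x4 matrix in the computational basis |control, target> (here, polarization H/V as control, first-order transverse mode as target), and verifies that acting on a superposed control produces a non-separable (Bell-like) state.

```python
import numpy as np

# CNOT in the basis |control, target>: flips the target iff the control is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket = lambda i: np.eye(4, dtype=complex)[i]  # basis vectors |00>..|11>

# |H, mode0> = |00> is unchanged; |V, mode0> = |10> maps to |V, mode1> = |11>.
assert np.allclose(CNOT @ ket(0), ket(0))
assert np.allclose(CNOT @ ket(2), ket(3))

# A superposed control, (|0> + |1>)/sqrt(2), tensored with target |0>,
# is mapped to the non-separable state (|00> + |11>)/sqrt(2).
bell = CNOT @ (np.kron([1, 1], [1, 0]) / np.sqrt(2))
```

In the spin-orbit setting of the abstract, non-separability of such a mode is exactly what the proposed Bell-type inequality criterion tests.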
In conditions of globalization and worldwide economic relations, the role of information support for business processes is increasing across various branches and fields of activity, and warehouse activity is no exception. Such information support is realized in warehouse logistic systems. In relation to a territorial administrative entity, the warehouse logistic system takes the format of a complex social and economic structure that controls the economic flows covering intermediary, trade and transport organizations and enterprises of other branches and spheres. The spatial movement of inventory items makes new demands on the participants of merchandising. Warehousing (in the meaning of storage) is one of the operations of logistic activity, concerned with the organization of a material flow. Treating warehousing as "management of the spatial movement of stocks" is therefore justified. Warehousing, in this understanding, sheds its perception as the mere holding of stocks, a business expense. This aspiration is reflected in logistic systems working by the principles of "just in time", "lean production" and others. Hence, the role of warehouses as places of storage is transformed into an understanding of warehousing as an innovative logistic system.
Hoard, James E.
Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides a natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object-oriented database (OODB). OODBs apparently have an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system, and to use them effectively.
Zhao, X.; Liu, Chengfei; Lin, T.; Ranasinghe, D.C.; Sheng, Q.Z.
As a tracking technology, Radio Frequency Identification (RFID) is now widely applied to enhance the context awareness of enterprise information systems. Such awareness provides great opportunities to facilitate business process automation and thereby improve operation efficiency and accuracy. With
The external environment of information work has undergone profound changes. Worldwide competition in military fields, the development of the national economy and the application of high technologies place higher demands on intelligence work. Under the environment of global information integration, the 'eyes and ears' role of nuclear scientific and technical information will be further highlighted. In this context, we believe that nuclear sci-tech information development should focus on comprehensive research and be based on the building of information resources and advanced information technology methods, forming an integrated service system with capabilities of rapid response, decision support, information assurance and sustainable development. The goal of nuclear sci-tech information development should be to serve the development of nuclear power and the popularization and application of nuclear technology in the national economy, to implement the strategy of 'integrated intelligence support', to ensure the formation of a rapid response capability, to enhance capabilities for decision support and guidance of science and technology development, to build digital information resources, and to constantly promote research on network collaboration. This paper analyzes the characteristics of nuclear sci-tech information under the new competitive environment, describes the ideas and goals of its development, and proposes ways to achieve these goals. In order to provide information support and services characterized by rapid response, high quality and high efficiency, the paper also proposes that, under the new environment, we should optimize the service system, extend service functions, transform the service mode, speed up the transformation of intelligence work, adjust and optimize business content and structure, promote business process re-engineering, and pay equal attention to 'ensuring demands' and 'guiding the future
We propose an intelligent and efficient query processing approach for the semantic mediation of information systems, together with a generic multi-agent architecture that supports it. Our approach focuses on the exploitation of intelligent agents for query reformulation and the use of a new technology for semantic representation. The algorithm adapts itself to changes in the environment, offers broad applicability and resolves various data conflicts dynamically; it also reformulates the query using the schema mediation method for discovered systems and context mediation for the other systems.
The Slovak Hydrometeorological Institute (SHMI) has operated radiation monitoring since 1963. At present SHMI operates 23 GammaTracer detectors (Genitron) in its monitoring network, one mobile detector and one standby detector. Radiation data (dose rate in nSv/h) from detectors at the automated meteorological stations are transmitted by data-logger and the private institute network to the National Telecommunication Centre in Bratislava. The data from the MSS (message switch system) are inserted into the database. The 1-hour and 24-hour averages are computed on the server automatically. The delay between the time of measurement and the time of inserting data into the database is only 10 min. Radiation files from the SHMI network are transmitted on-line to the information system of the Nuclear Regulatory Authority of the Slovak Republic and to the information system of the Slovak Army. Transmission to the Crisis Centre of Civil Protection is under reconstruction at present. The database contains one table for radiation data and several tables for configurations, catalogues of stations and additional data; it works in a client-server environment. On the client PC runs the user front-end application, which can display the data using many filters, display tables with configurations of the technical equipment, and display maps, graphs, etc. There is the possibility to store data into the archives, to make reports and to analyse data in the environment of professional statistical software. Precipitation values from meteorological stations were integrated into the information system of radiation monitoring for better interpretation of gamma dose rate values. SHMI cooperates in radiation data exchange with the European Commission Joint Research Centre in Ispra, the Radiation Warning Centre in Vienna and Meteoservice Budapest. (author)
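The server-side averaging step can be sketched as follows. This is a hedged illustration only: the station names, timestamps and dose-rate values are invented, and the real system computes its averages inside the database rather than in application code.

```python
# Sketch: reduce 10-minute dose-rate readings (nSv/h) to 1-hour means per station.
from collections import defaultdict

readings = [  # (station, ISO timestamp, dose rate in nSv/h) -- invented values
    ("Bratislava", "2024-01-01T10:05", 98.0),
    ("Bratislava", "2024-01-01T10:25", 102.0),
    ("Bratislava", "2024-01-01T10:55", 100.0),
    ("Kosice",     "2024-01-01T10:15", 110.0),
]

hourly = defaultdict(list)
for station, ts, nsv_h in readings:
    hourly[(station, ts[:13])].append(nsv_h)  # ts[:13] truncates to the hour

averages = {key: sum(vals) / len(vals) for key, vals in hourly.items()}
print(averages[("Bratislava", "2024-01-01T10")])  # 100.0
```

A 24-hour average works the same way with the key truncated to the date.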
Hohn, Michael; Tang, Grant; Goodyear, Grant; Baldwin, P R; Huang, Zhong; Penczek, Pawel A; Yang, Chao; Glaeser, Robert M; Adams, Paul D; Ludtke, Steven J
SPARX (single particle analysis for resolution extension) is a new image processing environment with a particular emphasis on transmission electron microscopy (TEM) structure determination. It includes a graphical user interface that provides a complete graphical programming environment with a novel data/process-flow infrastructure, an extensive library of Python scripts that perform specific TEM-related computational tasks, and a core library of fundamental C++ image processing functions. In addition, SPARX relies on the EMAN2 library and cctbx, the open-source computational crystallography library from PHENIX. The design of the system is such that future inclusion of other image processing libraries is a straightforward task. The SPARX infrastructure intelligently handles retention of intermediate values, even those inside programming structures such as loops and function calls. SPARX and all dependencies are free for academic use and available with complete source.
Buonfiglio, Marzia; Di Sabato, Francesco; Mandillo, Silvia; Albini, Mariarita; Di Bonaventura, Carlo; Giallonardo, Annateresa; Avanzini, Giuliano
Relevant to the study of epileptogenesis is learning processing, given the pivotal role that neuroplasticity assumes in both mechanisms. Recently, evoked potential analyses showed a link between analytic cognitive style and altered neural excitability in both migraine and healthy subjects, regardless of cognitive impairment or psychological disorders. In this study we evaluated the analytic/global and visual/auditory perceptual dimensions of cognitive style in patients with epilepsy. Twenty-five cryptogenic temporal lobe epilepsy (TLE) patients, matched with 25 idiopathic generalized epilepsy (IGE) sufferers and 25 healthy volunteers, were recruited and participated in three cognitive style tests: the "Sternberg-Wagner Self-Assessment Inventory", the C. Cornoldi test series called AMOS, and the Mariani Learning Style Questionnaire. Our results demonstrate a significant association between analytic cognitive style and both IGE and TLE, with a predominant auditory and visual analytic style, respectively (ANOVA: p < 0.0001). These findings should encourage further research to investigate information processing style and its neurophysiological correlates in epilepsy.
A. N. Privalov
Introduction. One of the tendencies of modern higher education is the ubiquitous use of information and communication technologies. At the same time, the functioning of the electronic information and educational environment (IEE) of a university should be based on the means of the IEE and the condition of its information security. The aim of the research is the conceptualization of the problem of the rational organization of a safe information and educational environment of a higher education institution, wherein reliable protection is provided for its infrastructure, for the personal and unique information of pupil and teacher, and for the virtual space of their educational interaction. Methodology and research methods. A system-based approach is the key approach to the organization of a safe educational environment of the university. From the authors' point of view, personal-activity and functional approaches are expedient in the design and development of a safe IEE. Socio-historical and theoretical-methodological analysis, modeling, and the study and synthesis of experience of effective application of the systems approach in educational professional organizations are used. Results and scientific novelty. The concept of a «safe information educational environment of the university» is specified, wherein the first word expresses the predominant quality of the system. Creating a safe information environment in educational professional organizations provides a convenient and safe educational environment in the process of professional training of university students. The components of and directions for the organization of a safe IEE are highlighted, and practical recommendations for its design and successful functioning are given. Practical significance. The materials of the present research may be of use to managers and administrative employees of educational organizations.
Bahjat Rashad Shahin
The research addresses the smart city concept, one of the latest trends in urban design, which invests the capabilities of human and artificial intelligence for the advancement of the city. The smart city is described as one of the most important manifestations of the information revolution at the end of the twentieth and the beginning of the twenty-first century. The research attributes the emergence of the concept to the deficiencies of traditional means and methods in building and developing cities, as well as to the significant increase in the number of dwellers in cities and global metropolises. The smart city approach has therefore been adopted, along with innovative principles and methods which consolidate the performance and efficiency of the city at the service, health, economic, social and environmental levels. Global studies indicate a scarcity of urban contributions in the area of the smart city, so the need for studies of vocabularies, elements and innovative solutions has emerged, and with it the role of information in achieving the aims of smart city initiatives. The research problem is the knowledge gap about the impact of the informational environment on establishing a smart city initiative. The research adopts the hypothesis that multi-disciplinary informational thought plays an essential role in achieving a smart city initiative. To address the research problem, the research starts with the definition of the concept of the smart city, to provide a knowledge platform, then addresses smart city approaches, as well as the smart urban environment, smart city structure, key elements and smart networks, in order to conclude the key vocabulary, indicators and constituents of smart city establishment. These are then applied to case studies with an analytical descriptive approach, to conclude the key constituents for establishing a smart city in Iraq. The research concluded by confirming the role of informational thought, represented by global research
Queensland Library Board, Brisbane (Australia).
Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…
Патимат Магомедовна Исаева
The article describes aspects of the implementation of an information and educational environment in order to realize the priorities for the development and implementation of innovative distance learning technologies, supporting the transition from a traditional institution to the model of an innovation institute that connects the professional, cultural and scientific competences of bachelors. The essence of the problem under research is the need to achieve a new quality of education, i.e. the transition to a higher level of preparation of bachelors. In accordance with the requirements of the Federal state educational standard of higher education, in conditions of modernization of education, a teacher's professional work is increasingly associated with innovative research into the educational process. The introduction of an electronic information-educational environment is one of the main directions for the preparation of future bachelors, and it requires reconsidering views on the content of the educational system. The article describes the experience of creating the information-educational environment of Pyatigorsk State Linguistic University on the basis of Moodle. Thanks to the creation of an information-educational environment in higher schools, teachers, parents and students are aware of all the events of the learning process and of developments in the world of science; moreover, this system provides the opportunity to maintain close communication with teachers and between students.
The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures. For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent, and organizational communications are highly affected by the environment. This entry examines the origin and development of organization-environment interdependence, the nature of the concept of environment and its relevance for communication scholarship and activities.
Montello, Daniel R.
I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.
Szymanski, Jacek; Wilson, David L; Zhang, Guo-Qiang
The rapid expansion of biomedical research has brought substantial scientific and administrative data management challenges to modern core facilities. Scientifically, a core facility must be able to manage experimental workflow and the corresponding set of large and complex scientific data. It must also disseminate experimental data to relevant researchers in a secure and expedient manner that facilitates collaboration and provides support for data interpretation and analysis. Administratively, a core facility must be able to manage the scheduling of its equipment and to maintain a flexible and effective billing system to track material, resource, and personnel costs and charge for services to sustain its operation. It must also have the ability to regularly monitor the usage and performance of its equipment and to provide summary statistics on resources spent on different categories of research. To address these informatics challenges, we introduce a comprehensive system called MIMI (multimodality, multiresource, information integration environment) that integrates the administrative and scientific support of a core facility into a single web-based environment. We report the design, development, and deployment experience of a baseline MIMI system at an imaging core facility and discuss the general applicability of such a system in other types of core facilities. These initial results suggest that MIMI will be a unique, cost-effective approach to addressing the informatics infrastructure needs of core facilities and similar research laboratories.
Perry, J. L.; Humphries, W. R.
The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.
Business processes and information systems mutually affect each other in non-trivial ways. Frequently, the business process design and the information system design are not well aligned. This means that business processes are designed without taking the information system impact into account, and vice versa. Missing alignment at design time often results in quality problems at runtime, such as long response times of information systems, long process execution times, and overloaded information systems.
The Secretary of Defense announced the Corporate Information Management initiative on November 16, 1990, to establish a DoD-wide concept for managing computer, communications, and information management functions...
A. A. Malyuk
The features of solving the information protection task in its modern statement, as a complex problem that encompasses all aspects of information technology development, are discussed. Such an interpretation inevitably increases the role of systemic problems, the solution of which relies on an advanced scientific and methodological basis: the so-called intensification of information protection processes.
Ahmad R. Abu-El-Quran
We introduce a multiengine speech processing system that can detect the location and the type of an audio signal in variable noisy environments. The system detects the location of the audio source using a microphone array; it examines the audio first, determines whether it is speech or nonspeech, then estimates the signal-to-noise ratio (SNR) using a Discrete-Valued SNR Estimator. Using this SNR value, instead of trying to adapt the speech signal to the speech processing system, we adapt the speech processing system to the surrounding environment of the captured speech signal. In this paper, we introduce the Discrete-Valued SNR Estimator and a multiengine classifier, using Multiengine Selection or Multiengine Weighted Fusion, and we use SI as an example of the speech processing. The Discrete-Valued SNR Estimator achieves an accuracy of 98.4% in characterizing the environment's SNR. Compared to a conventional single-engine SI system, the improvement in accuracy was as high as 9.0% and 10.0% for Multiengine Selection and Multiengine Weighted Fusion, respectively.
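The idea of a discrete-valued SNR estimate driving engine selection can be sketched as follows. This is not the paper's estimator: the toy below measures SNR from known clean and noise segments (which a real system does not have) and the discrete SNR classes are invented; it only illustrates snapping a continuous estimate to a level used to pick a matched recognition engine.

```python
import numpy as np

def snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    """SNR in dB from mean signal power over mean noise power."""
    return 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))

LEVELS = [0, 5, 10, 15, 20, 30]  # discrete SNR classes in dB (invented)

def discrete_snr(signal, noise):
    """Snap the continuous SNR estimate to the nearest discrete class."""
    est = snr_db(signal, noise)
    return min(LEVELS, key=lambda lv: abs(lv - est))

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 100, 8000))        # toy "speech" waveform
noise = 0.1 * rng.standard_normal(8000)          # additive background noise
level = discrete_snr(clean + noise, noise)       # ~17 dB snaps to class 15
# A multiengine system would now route the utterance to the engine
# trained for this SNR class (Multiengine Selection).
```

Multiengine Weighted Fusion would instead combine engine outputs with weights derived from the same discrete level.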
Atick, Joseph J
The sensory pathways of animals are well adapted to processing a special class of signals, namely stimuli from the animal's environment. An important fact about natural stimuli is that they are typically very redundant and hence the sampled representation of these signals formed by the array of sensory cells is inefficient. One could argue for some animals and pathways, as we do in this review, that efficiency of information representation in the nervous system has several evolutionary advantages. Consequently, one might expect that much of the processing in the early levels of these sensory pathways could be dedicated towards recoding incoming signals into a more efficient form. In this review, we explore the principle of efficiency of information representation as a design principle for sensory processing. We give a preliminary discussion on how this principle could be applied in general to predict neural processing and then discuss concretely some neural systems where it recently has been shown to be successful. In particular, we examine the fly's LMC coding strategy and the mammalian retinal coding in the spatial, temporal and chromatic domains.
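The redundancy-reduction idea can be illustrated with a toy example (our construction, not from the review): a correlated random-walk signal, quantized with a fixed 16-level code, needs more bits per sample than its difference-coded version, because differencing removes the correlation between neighbouring samples, much as early sensory recoding is argued to do.

```python
import numpy as np

def entropy_bits(x: np.ndarray, edges: np.ndarray) -> float:
    """Empirical entropy (bits/sample) of x under a fixed quantizer."""
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
smooth = np.cumsum(rng.standard_normal(10_000))  # correlated "natural" signal
recoded = np.diff(smooth)                        # difference coding decorrelates

# Quantize both with the same 16-level code spanning the raw signal's range.
edges = np.linspace(smooth.min(), smooth.max(), 17)
h_raw = entropy_bits(smooth, edges)
h_diff = entropy_bits(recoded, edges)
assert h_diff < h_raw  # the decorrelated code uses fewer bits per sample
```

The differences concentrate in a narrow range of the fixed code, so their entropy per sample is far lower; this is the sense in which recoding into a less redundant form is more efficient.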
NEA has for many years now been collating and analysing information on laws and regulations on the peaceful uses of nuclear energy, and this work has resulted in a series of publications. However, as shown by the multiplication of computer-based legal information centres at both national and international level, conventional information systems are no longer adequate to deal with the increasing volume of information and with users' needs. In view of the particular aspects of nuclear law and of its own resources, NEA has endeavoured to make the best possible use of existing structures by opting for participation in the IAEA International Nuclear Information System rather than by creating a specialised centre. Before becoming operational, the arrangements concluded between NEA and IAEA required that the INIS rules be altered somewhat to take account of the specific problems raised by the treatment of legal literature and also to improve the quality of information provided to users. (auth.)
Kobayashi, Tetsuya J.; Sughiyama, Yuki
Adaptation in a fluctuating environment is a process of fueling environmental information to gain fitness. Living systems have gradually developed strategies for adaptation from random and passive diversification of the phenotype to more proactive decision making, in which environmental information is sensed and exploited more actively and effectively. Understanding the fundamental relation between fitness and information is therefore crucial to clarify the limits and universal properties of adaptation. In this work, we elucidate the underlying stochastic and information-thermodynamic structure in this process, by deriving causal fluctuation relations (FRs) of fitness and information. Combined with a duality between phenotypic and environmental dynamics, the FRs reveal the limit of fitness gain, the relation of time reversibility with the achievability of the limit, and the possibility and condition for gaining excess fitness due to environmental fluctuation. The loss of fitness due to causal constraints and the limited capacity of real organisms is shown to be the difference between time-forward and time-backward path probabilities of phenotypic and environmental dynamics. Furthermore, the FRs generalize the concept of the evolutionarily stable state (ESS) for a fluctuating environment by giving the probability that the on-average optimal strategy can be invaded by a suboptimal one owing to rare environmental fluctuation. These results clarify the information-thermodynamic structures in adaptation and evolution.
Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space; joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.
Cowan, J.J.; Cameron, A.G.W.; Truran, J.W.
The results of an extended examination of r-process nucleosynthesis in helium-burning environments are presented. Using newly calculated nuclear rates, dynamical r-process calculations have been made of thermal runaways in helium cores typical of low-mass stars and in the helium zones of stars undergoing supernova explosions. These calculations show that, for a sufficient flux of neutrons produced by the ¹³C neutron source, r-process nuclei in solar proportions can be produced. The conditions required for r-process production are found to be: 10²⁰-10²¹ neutrons cm⁻³ for times of 0.01-0.1 s and neutron number densities in excess of 10¹⁹ cm⁻³ for times of approximately 1 s. The amount of ¹³C required is found to be exceedingly high: larger than is found to occur in any current stellar evolutionary model. It is thus unlikely that these helium-burning environments are responsible for producing the bulk of the r-process elements seen in the solar system.
The process of designing high-quality software systems is one of the major issues in software engineering research. Over the years, this has resulted in numerous design methods, each with specific qualities and drawbacks. For example, the Rational Unified Process is a comprehensive design process,
Sanchez, Yerly; Pinzon, David; Zheng, Bin
To examine the reaction time of human subjects processing information presented in the visual channel, under both a direct vision and a virtual rehabilitation environment, while walking. The visual stimulus comprised eight math problems displayed in the peripheral vision of seven healthy human subjects, in a virtual rehabilitation training system (computer-assisted rehabilitation environment, CAREN) and in a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment, and participants' reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program with a virtual training environment. Implications for rehabilitation: eye tracking is a reliable tool that can be employed in rehabilitation virtual environments; reaction time changes between direct vision and virtual environments.
Environmental information is presented relating to a staged version of the proposed Defense Waste Processing Facility (DWPF) at the Savannah River Plant. The information is intended to provide the basis for an Environmental Impact Statement. In either the integral or the staged design, the DWPF will convert the high-level waste currently stored in tanks into: a leach-resistant form containing about 99.9% of all the radioactivity, and a residual, slightly contaminated salt, which is disposed of as saltcrete. In the first stage of the staged version, the insoluble sludge portion of the waste and the long lived radionuclides contained therein will be vitrified. The waste glass will be sealed in canisters and stored onsite until shipped to a Federal repository. In the second stage, the supernate portion of the waste will be decontaminated by ion exchange. The recovered radionuclides will be transferred to the Stage 1 facility, and mixed with the sludge feed before vitrification. The residual, slightly contaminated salt solution will be mixed with Portland cement to form a concrete product (saltcrete) which will be buried onsite in an engineered landfill. This document describes the conceptual facilities and processes for producing glass waste and decontaminated salt. The environmental effects of facility construction, normal operations, and accidents are then presented. Descriptions of site and environs, alternative sites and waste disposal options, and environmental consultations and permits are given in the base Environmental Information Document
This article describes the development of a reaction database whose objective is to collect data for multiphase reactions involved in small-molecule pharmaceutical processes, with a search engine to retrieve the data needed for investigations of reaction-separation schemes, such as the role of organic solvents in reaction performance improvement. The focus of this reaction database is to provide a data-rich environment with process information available to assist during the early-stage synthesis of pharmaceutical products. The database is structured in terms of classification of reaction types; compounds participating in the reaction; use of organic solvents and their function; information for single-step and multistep reactions; target products; reaction conditions and reaction data. Information for reactor scale-up, together with information for the separation and other relevant information for each reaction and reference, is also available in the database. Additionally, the retrieved information obtained from the database can be evaluated in terms of sustainability using well-known “green” metrics published in the scientific literature. The application of the database is illustrated through the synthesis of ibuprofen, for which data on different reaction pathways have been retrieved from the database and compared using “green” chemistry metrics.
Information is an important resource for the new product development (NPD) process in a subsidiary. However, research analyzing the NPD process from an information perspective in the subsidiary context is still lacking. This exploratory research examines 8 cases of NPD processes in consumer goods subsidiaries operating in the Indonesian market. Three types of information have been identified and analyzed in the NPD process: global, regional and local information. The result of this research ...
Meiryani; Muhammad Syaifullah
The purpose of this study was to determine the influence of business processes on the quality of the accounting information system. The study was theoretical research which considered the role of business processes in the quality of accounting information systems, using secondary data collection. The results showed that business processes have a signifi...
Modern digital automation techniques allow the application of demanding types of process control, characterized by their belonging to higher levels in a multilevel model. Functional and technical aspects of the performance of digital automation plants are presented and explained. A modern automation system is described, considering special procedures of process control (e.g. real-time diagnosis).
Nakada, Toyohisa; Itoh, Hideo; Kunifuji, Susumu; Nakashima, Hideyuki
The purpose of our study is to establish robust communication, while preserving security and privacy, between a handheld communicator and the surrounding information environment. From the viewpoint of low power consumption, we have been developing a reflectivity-modulating communication module composed of a liquid crystal light modulator and a corner-reflecting mirror sheet. We installed a corner-reflecting sheet, instead of a light-scattering sheet, in a handheld videogame machine with a reflection-type liquid crystal display screen. An infrared (IR) LED illuminator attached next to the IR camera of a base station illuminates the whole room, and the terminals send their data to the base station by switching the reflected IR beam ON and OFF. The intensity of the reflected light differs with the position and direction of the terminal, and sometimes the intensity of the OFF signal under one condition is brighter than that of the ON signal under another. To improve the communication quality, machine learning is a possible solution. In this paper, we compare various machine learning techniques for free-space optical communication and propose a new algorithm that improves the robustness of the data link. Evaluation using an actual free-space communication system is also described.
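The decoding problem described above (an OFF level at one pose can outshine an ON level at another) can be sketched with a simple per-pose calibration. This is purely an illustration of why adaptive, learned thresholds help, not the authors' algorithm; the preamble scheme and numbers are assumptions:

```python
# Illustrative sketch: decode ON/OFF bits from reflected-IR intensities.
# A short known preamble (alternating 1/0 bits) gives per-pose centroids
# for the ON and OFF levels; payload bits are then decoded by nearest
# centroid, so the threshold adapts to terminal position and orientation.

def calibrate(preamble_intensities, preamble_bits):
    """Return (on_centroid, off_centroid) from a known preamble."""
    on = [x for x, b in zip(preamble_intensities, preamble_bits) if b == 1]
    off = [x for x, b in zip(preamble_intensities, preamble_bits) if b == 0]
    return sum(on) / len(on), sum(off) / len(off)

def decode(intensities, on_c, off_c):
    """Nearest-centroid decision per sample."""
    return [1 if abs(x - on_c) < abs(x - off_c) else 0 for x in intensities]
```

A learned classifier generalizes this by replacing the two centroids with a model over pose-dependent features.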
Chaffee, Ellen Earle
When a rational decision process is desired, information specialists can contribute information and also contribute to the process in which that information is used, thereby promoting rational decision-making. The contribution of Stanford's information specialists to rational decision-making is described. (MLW)
Nevill, Dorothy D.; And Others
Tested the assumptions that the structural features of vocational schemas affect vocational information processing and career self-efficacy. Results indicated that effective vocational information processing was facilitated by well-integrated systems that processed information along fewer dimensions. The importance of schematic organization on the…
Yeomans, Joanne; Baudic, Romain; Picchioli, Ingrid; International Conference on Nuclear Knowledge Management : Strategies, Information Management and Human Resource Development. Special Session : The Role of INIS in Knowledge Preservation
Information searchers from the high energy physics community expect an integrated information environment. The CERN Library offers its print and electronic collections through a combined Web interface and maintains the database by semi-automated processes to upload bibliographic and full-text records. Suggestions are offered by which INIS could develop its own Web interface and better match HEP users’ expectations. These include implementing full-text linking, increasing currency, expanding search and display functions and developing the richness of the data. Links with the National Nuclear Data Center and Crossref could also increase its visibility.
Vladlena Sergeevna Atkina
The article is devoted to the problem of business continuity as a necessary element of an information security strategy. An analysis was conducted of the requirements of federal legislation, standards, recommendations and guidelines for ensuring the availability, disaster recovery and recovery of data and information structures of organizations in their operation. The proposed approach to assessing possible destabilizing factors and emergency situations includes a model of the environment in which the information systems processing the organization's key business processes operate, and a method for assessing the risk of each destabilizing effect. The risk assessment technique uses quantitative and qualitative approaches to draw three-zone risk maps.
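The combined quantitative/qualitative scoring behind a three-zone risk map can be sketched as follows; the 1-5 scales and the zone boundaries are illustrative assumptions, not values from the article:

```python
# Minimal sketch of a three-zone risk map: each destabilizing factor is
# scored for likelihood and impact (1-5, the qualitative part); their
# product (the quantitative part) is mapped to one of three zones.

def risk_zone(likelihood, impact):
    score = likelihood * impact
    if score <= 6:
        return "green"      # acceptable residual risk
    if score <= 14:
        return "yellow"     # mitigation planning required
    return "red"            # business continuity threatened

def risk_map(factors):
    """factors: {name: (likelihood, impact)} -> {name: zone}"""
    return {name: risk_zone(l, i) for name, (l, i) in factors.items()}
```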
Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh
As the imaging modalities used in medicine transition to increasingly three-dimensional data the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods in which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.
Dragnev Y. V.
The role and value of the informative educational space in the professional development of future teachers of physical culture is considered. It is shown that such an environment is characterized by the volume of educational services, their power and intensity, and the set of conditions. It is shown that higher professional education requires improvement in the use of information technologies and of the software and informational support of the educational process. It is established that modern information technologies are a means of increasing the efficiency of management in all spheres of public activity. It is noted that the process of forming an information culture needs a personally oriented and differentiated approach to the choice of teaching programs. Directions for the use of information technologies in distance teaching are identified. Ways of intensifying the educational process are recommended: increasing students' interest in the study of a specific discipline, increasing the volume of independent work, and increasing the density of educational material.
Having access to SAR data can be highly important and critical, especially for disaster mapping. Updating a GIS with contemporary information from SAR data makes it possible to deliver a reliable set of geospatial information to support civilian operations, e.g. search and rescue missions. Therefore, we present in this paper the operational processing of SAR data within a GIS environment for rapid disaster mapping. This is exemplified by the November 2010 flash flood in the Veneto region, Italy. A series of COSMO-SkyMed acquisitions was processed in ArcGIS® using a single-sensor, multi-mode, multi-temporal approach. The relevant processing steps were combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, which can be accessed both via a desktop and a server environment.
Aa, van der J.H.; Leopold, H.; Mannhardt, F.; Reijers, H.A.; Gaaloul, K.; Schmidt, R.; Nurcan, S.; Guerreiro, S.; Ma, Q.
An organization’s knowledge on its business processes represents valuable corporate knowledge because it can be used to enhance the performance of these processes. In many organizations, documentation of process knowledge is scattered around various process information sources. Such information
Luis Eduardo Pérez Peregrino
The research project TEACH-ME (Technology, Engineering Calculus and Hewlett-Packard Mobile Environment) presents an educational proposal that seeks to innovate the teaching and learning processes of Mathematics, Logic Basic Programming and Management of Information through the introduction of collaborative working environments, in order to support the integrated development of learning methodologies and enhance students' cognitive abilities. As a case study, it presents the results obtained when applying this project to first-semester students at the Faculty of Engineering at “Corporación Universitaria Minuto de Dios” University, where tablet PCs from Hewlett-Packard were introduced to support the teaching process. This article presents the process of implementing the TEACH-ME project, developed as an academic environment that has enabled research on the impact of applying information and communication technologies to higher education teaching. We present the project background, what the implementation process has done so far, the impact on the learning and teaching processes, the integration of technologies at the academic institution that has helped carry out the project and, finally, the contributions of the tablet PC to the teaching-learning process at the university.
Lee, Ji Ho; Kim, Tae Whan; Kim, Sun Ja; Kim, Young Min; Choi, Kwang; Oh, Joung Hun; Choung, Hyun Suk; Keum, Jong Yong; Yoo, An Na; Harn, Deuck Haing; Choun, Young Chun
The major goal of this project is to develop a more efficient information management system by connecting the KAERI serials database, which enables users to access it from their own laboratory facilities through KAERI-NET. The importance of this project is to make the serials information of KAERI easily accessible to users as a valuable resource for R and D activities. The results of the project are as follows. 1) Development of the serials database and retrieval system enabled users to access the serials holding information through KAERI-NET. 2) The database construction establishes a foundation for the management of 1,600 serials held in KAERI. 3) The system can be applied not only to KAERI but also to similar medium-level libraries. (Author)
Zhang, X.-S.; Xie, Hua
This paper presents a learning-by-doing method in the Internet environment to enhance the results of information technology education through experimental work in college classrooms. We present a practical approach to applying the "learning by doing" paradigm in Internet-based learning, both for higher education environments and life-long training systems, taking into account available computer and network resources such as blogging, podcasting, social networks and wikis. We first introduce the different phases of the learning process, to show readers the importance of the learning-by-doing paradigm, which is not implemented in many Internet-based educational environments. Secondly, we present the concept of learning by doing from different perspectives. Then we identify the most important trends in this field and give a real practical case applying this approach. The results show that the proposed methods are much better than traditional teaching methods.
Tonfoni, G; Ichalkaranje, N S
The Internet/WWW has made it possible to easily access quantities of information never available before. However, both the amount of information and the variation in quality pose obstacles to the efficient use of the medium. Artificial intelligence techniques can be useful tools in this context. Intelligent systems can be applied to searching the Internet and data-mining, interpreting Internet-derived material, the human-Web interface, remote condition monitoring and many other areas. This volume presents the latest research on the interaction between intelligent systems (neural networks, adap
Kim, Jong Hyun; Seong, Poong Hyun
This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e. their single-channel nature, we first develop an information processing model with multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, a form of information theory.
Mental images are mental representations of people, objects and situations that are not present, formed by using the imagination. Many studies have addressed this psychological ability, its typology and its involvement in the academic environment. Along these lines, the aim of our study was to assess the information processing style (verbal, object and spatial scales) and mental rotation commonly used by students from different specialties of Compulsory Secondary Education. To that end, two tests, the Mental Rotation Test (MRT) and the Object-Spatial Imagery and Verbal Questionnaire (OSIVQ), were administered to a sample of 126 Compulsory Secondary Education students. The MRT assessed differences in the ability to mentally rotate images depending on gender and specialty. Significant differences were found by specialty, showing that science students had a better ability to mentally rotate images than humanities students. Significant differences were found by gender and specialty in the OSIVQ. Men showed better spatial and verbal processing styles than women, and humanities students excelled in object processing (in comparison to science students) and in verbal processing (in comparison to science and art students).
This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.
Gross, Kenneth C.; Morreale, Patricia
A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can include processing data from either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
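The final validation step above applies a sequential probability ratio test to the residual signal. A minimal sketch of Wald's SPRT for detecting a mean shift in a residual stream follows; the fault magnitude, noise level and error rates are illustrative assumptions, not the patent's parameters:

```python
# Sketch of Wald's sequential probability ratio test (SPRT) on a residual:
# decide between H0 (residual ~ N(0, sigma^2), "normal") and
# H1 (residual ~ N(m, sigma^2), "degraded") by accumulating the
# log-likelihood ratio until it crosses a decision boundary.
import math

def sprt(residuals, m=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)   # accept H1 ("degraded")
    lower = math.log(beta / (1 - alpha))   # accept H0 ("normal")
    llr = 0.0
    for i, r in enumerate(residuals):
        # LLR increment for N(m, sigma^2) vs N(0, sigma^2)
        llr += (m / sigma**2) * (r - m / 2)
        if llr >= upper:
            return "degraded", i
        if llr <= lower:
            return "normal", i
    return "undecided", len(residuals) - 1
```

The boundaries guarantee (approximately) false-alarm rate alpha and missed-alarm rate beta regardless of how long the test runs.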
Craik, F.I.M., & Lockhart, R.S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11. ...task at both levels of performance, then one would, in both cases, postulate systems that had the ability to process symbols at the microscopic level... late 1960s and early 70s (cf. Atkinson & Shiffrin, 1968; Craik & Lockhart, 1972; Norman, Rumelhart, & LNR, 1975). This architecture is comprised of several
de Dreu, C.K.W.; de Vries, N.K.
In two experiments we studied the prediction that majority support induces stronger convergent processing of a persuasive message than minority support, the more so when recipients are explicitly forced to pay attention to the source's point of view; this in turn affects the amount of attitude change on related issues. Convergent processing is the systematic elaboration on the source's position, but with a stronger focus on verification and justification rather than falsification. In Exp 1 wi...
Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H
Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design. Copyright © 2014 Elsevier Ltd. All rights reserved.
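As a small illustration of the basic information-theoretic tools the review introduces, the mutual information I(E;R) between a discrete environmental state E and a discrete response R can be estimated from joint observation counts. This is a pedagogical plug-in estimator, not code from the paper:

```python
# Plug-in estimate of mutual information (in bits) from paired samples
# (environment_state, response): I(E;R) = sum p(e,r) log2 p(e,r)/(p(e)p(r)).
import math
from collections import Counter

def mutual_information(pairs):
    n = len(pairs)
    joint = Counter(pairs)
    pe = Counter(e for e, _ in pairs)
    pr = Counter(r for _, r in pairs)
    return sum(
        (c / n) * math.log2((c / n) / ((pe[e] / n) * (pr[r] / n)))
        for (e, r), c in joint.items()
    )
```

A perfectly informative response yields the full entropy of the environment; an independent response yields zero, which is the sense in which signalling systems can be ranked by information processing efficiency.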
Dr. Filiz Gürder
Nowadays, organizations are required to develop quick and accurate responses to internal and external changes that are gaining momentum. In this context, knowledge management activities become more important to all organizations. On the other hand, Geographic Information Systems (GIS) are becoming more and more common. GIS, which address a broad spectrum of users such as public agencies, local communities, civil society organizations, the private sector, academia and personal users, aim to solve problems that occur in location-based areas. GIS are important for acquiring, combining, analyzing and transferring spatial data. The common use of PCs for personal needs, digital geography and improvements in software technologies, as well as the need to make socially acceptable business decisions, have facilitated the development and widespread use of GIS applications. The main purpose of this paper is to discuss the application areas and contribution potential of GIS in enterprise-wide knowledge management processes.
Brookhuis, Karel Anton
We set out to test the hypotheses generated by Shiffrin & Schneider’s model of information processing with our new tool, the ERP. The experiments were devised to test hypotheses that were originally based on performance data alone, i.e. reaction time and errors. Although the overt behaviour was
traditionally called the "span of apprehension" (Külpe, 1904; Wundt, 1899). However, a partial-report procedure demonstrates... Gehrig, P. (1992). On the time course of perceptual information that results from a... Wundt, W. (1899). Zur Kritik tachistoskopischer Versuche [A critique of tachistoscopic experiments]...
Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice
In a distributed patient record environment, we analyze the processes needed to ensure exchange and access to EHR data. We propose an adapted method and the tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, of local network clients, of workstations running users' interfaces and of data exchange and synchronization tools. PMID:16779049
Clare Cooper Marcus
Having defined the topic and its related management effects in the healthcare environment, this paper reports considerations of specific design processes, including evidence-based design, Integrated Healthcare Strategies, participatory practices and post-occupancy evaluation. A landscape of Italian examples follows, before a case study of three Californian healing gardens dedicated to cancer patients, linked to a survey of this category of users' needs in such spaces. The conclusions reflect on the practical implications of studying North American examples, underlining the opportunity for audit and certification of therapeutic gardens, as well as the chance to export them outside health infrastructures for social needs.
Mobile agents are programs that can move from one site to another in a network, carrying their data and state. Mobile agents are expected to be an essential tool in pervasive computing. In a multi-platform environment, it is important to communicate with mobile agents using only their universal or logical names, not their physical locations. Moreover, in an ad-hoc network environment, an agent can migrate autonomously and communicate with other agents on demand. It is difficult for mobile agents to correctly track each other's locations, because each agent processes its tasks while successively moving through the network. In order to realize on-demand mutual communication among mobile agents without any centralized servers, we propose a new information sharing mechanism within mobile agents. The method is completely peer-based and requires no agent servers to manage mobile agent locations. Therefore, a mobile agent can find another mobile agent, communicate with it and share the information stored in that agent without any knowledge of the target agent's location. The basic idea of the mechanism is the introduction of an Agent Ring, Agent Chain and Shadow Agent. With this mechanism, each agent can communicate with other agents in a server-less environment, which is suitable for ad-hoc agent networks, and the agent system can manage agent search and communication efficiently.
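The server-less, name-based lookup suggested by the Agent Ring concept can be sketched as a peer-to-peer traversal of a logical ring; class and function names here are illustrative, not the paper's API:

```python
# Hedged sketch of ring-based agent lookup: agents keep a successor
# pointer forming a logical ring, and a lookup by logical name walks
# the ring peer-to-peer until the target is found or one lap completes.

class Agent:
    def __init__(self, name):
        self.name = name
        self.successor = self  # a ring of one until linked

def link_ring(agents):
    """Close the agents into a logical ring via successor pointers."""
    for a, b in zip(agents, agents[1:] + agents[:1]):
        a.successor = b

def lookup(start, name):
    """Walk the ring from `start`; return the agent, or None after one lap."""
    node = start
    while True:
        if node.name == name:
            return node
        node = node.successor
        if node is start:
            return None
```

No central directory is consulted: the lookup cost is bounded by the ring length, and a migrating agent only needs to splice its successor pointers.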
Kim, Jong Hyun; Seong, Poong Hyun
This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to estimate the amount of information processed during a certain control task under input information overload. We first develop a multi-stage information processing model that describes the information flow. The uncertainty of the information is then quantified using Conant's model, which is based on information theory. We also investigate the applicability of this approach to quantifying the reduction of information by operators under input information overload.
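Conant's model rests on information-theoretic quantities. As a minimal, hedged illustration of quantifying information in bits, the Shannon entropy of an empirical signal distribution can be computed as follows; the alarm stream is invented for illustration and is not from the paper:

```python
import math
from collections import Counter

def entropy_bits(samples):
    """Shannon entropy H = -sum(p * log2(p)) of an empirical distribution,
    in bits per observed signal."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy alarm stream an operator must process: four equiprobable signal types.
signals = ["A", "B", "C", "D"] * 25
print(entropy_bits(signals))  # 2.0
```

Under overload, the rate at which such bits arrive exceeds what the operator's stages can pass on, which is the reduction the paper sets out to quantify.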
Cowan, J. J.; Cameron, A. G. W.; Truran, J. W.
The results of an extended examination of r-process nucleosynthesis in helium-burning environments are presented. Using newly calculated nuclear rates, dynamical r-process calculations have been made of thermal runaways in helium cores typical of low-mass stars and in the helium zones of stars undergoing supernova explosions. These calculations show that, for a sufficient flux of neutrons produced by the C-13 neutron source, r-process nuclei in solar proportions can be produced. The conditions required for r-process production are found to be neutron densities of 10^20-10^21 per cubic centimeter for times of 0.01-0.1 s and neutron number densities in excess of 10^19 per cubic centimeter for times of about 1 s. The amount of C-13 required is found to be exceedingly high, larger than is found to occur in any current stellar evolutionary model. It is thus unlikely that these helium-burning environments are responsible for producing the bulk of the r-process elements seen in the solar system.
Full Text Available This article considers institutional aspects of the formation of an organized agricultural market. A theoretical basis for distinguishing an institute from institutions is given. In order to identify the institutes most influential on the phenomenon of "organization", the author analyses the Ukrainian institutional environment, which is still under construction. The author considers the main processes that run during the formation of the organized market and reviews theoretical approaches to its institutional structure. To organize the most common approaches and theoretical knowledge of this problem, the author proposes several schemes, together with the author's own views on many questions of the organized market formation process. The effectiveness of the institutes and of governmental regulation of the agricultural market is analyzed, and a strategically new approach to agricultural market formation policy from the governmental point of view is presented. The essence of the socioeconomic formation of the agricultural market is considered, the main factors of agricultural market formation are outlined, and a systematic approach to considering the structural parts of the agricultural market is proposed. The ineffectiveness of agricultural market relations without regulation is demonstrated, and the most unfavorable factors in the formation of the agricultural market are determined.
Alexandria, Joao Carlos Soares de
The increase of connectivity in the business environment, combined with growing dependency on information systems, has made information security management an important governance tool. The main goal of information security is to protect business transactions so that they can proceed normally, thereby safeguarding business continuity. Threats to information come from hackers' attacks, electronic fraud and spying, as well as fire, electrical energy interruption and human fault. Information security is achieved by implementing a set of controls, including policies, processes, procedures, organizational structures, software and hardware, which require continuous management and a well-established structure to face such challenges. This work investigates the reasons why organizations have difficulty putting information security management into practice. Many of them merely adopt point measures, sometimes inconsistent with their realities. The market has a sufficient number of standards and regulations related to information security issues, for example, ISO/IEC 27002, the American Sarbanes-Oxley Act, the Basel capital accord, and regulations from regulatory agencies (such as the Brazilian ones ANATEL, ANVISA and CVM). Market research has shown that information security implementation is concentrated in a well-defined group of organizations, mainly formed by large companies from specific sectors of the economy, for example, finance and telecommunications. However, information security must be practiced by all organizations that use information systems to carry out their activities, independently of their size or economic sector. The situation of information security in the governmental sector of Brazil, and inside its research institutions, is considered worrying by the Brazilian Court of Accounts (TCU). This research work presents an assessment and diagnostic proposal of
The paper describes a systematic approach to the design of information interfaces for operator support in diagnosing complex systems faults. The need for interpreting primary measured plant variables within the framework of different system representations, organized into an abstraction hierarchy, is identified from an analysis of the problem of diagnosing complex systems. A formalized approach to the modelling of production systems, called Multilevel Flow Modelling (MFM), is described. An MFM model specifies plant control requirements and the associated need for plant information, and provides a consistent context for the interpretation of real-time plant signals in the diagnosis of malfunctions. The use of MFM models as a basis for functional design of the plant instrumentation system is outlined, and the use of knowledge-based (expert) systems for the design of man-machine interfaces is mentioned. Such systems would allow active user participation in diagnosis and thus provide the basis for cooperative problem solving. 14 refs. (author)
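MFM's actual notation distinguishes mass, energy and information flow structures, which are not reproduced here. As a heavily simplified sketch of how means-end links can support diagnosis, a failed goal can be traced down to the deepest failed supporting function; the miniature model below is hypothetical:

```python
# Hypothetical miniature flow model (not the paper's MFM notation):
# each function is supported by lower-level functions via means-end links.
SUPPORTS = {
    "maintain_level": ["supply_flow"],
    "supply_flow": ["pump_running"],
    "pump_running": [],
}

def root_causes(model, function, failed):
    """Walk means-end links downward from a failed function and return
    the deepest failed supporting functions (candidate root causes)."""
    causes = []
    for dep in model[function]:
        if dep in failed:
            causes.extend(root_causes(model, dep, failed) or [dep])
    return causes

# All three functions report failure; diagnosis points at the lowest one.
failed = {"maintain_level", "supply_flow", "pump_running"}
print(root_causes(SUPPORTS, "maintain_level", failed))  # ['pump_running']
```

This captures only the interpretation context idea: a malfunction signal is read against the functions it supports, not in isolation.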
Mental Hygiene. Bourdieu, Pierre 1977 Outline of a Theory of Practice. Richard Nice, trans. Cambridge: Cambridge University Press. Cain, Leo F. and Samuel...hospital posed a unique evacuation problem. When a fire occurs in a hospital, information is typically communicated to doctors, nurses, and other...bore no relation whatsoever to the emergencies they announced, and they differed from institution to institution. Thus doctors, nurses and other staff
Full Text Available The article presents the analysis of the notion of “informational and educational environment”. The difference between the “informational environment” and the “informational and educational environment” has been shown. The main functions of the informational and educational environment and its facilities to enhance the quality of education have been revealed. The major components of the informational and educational environment have been defined. A connection has been detected between the informational and educational environment and the formation of the foundations of pedagogical skills. The research also presents a description of the electronic system “Socrates” of Vinnytsia State Agrarian University, and shows possible ways of its use in the educational process with the aim of forming and improving the professional skills of higher educational institution teachers.
Pincus, J. David; Acharya, Lalit
Based on multidisciplinary research findings, this report proposes an information processing model of employees' response to highly stressful information environments arising during organizational crises. The introduction stresses the importance of management's handling crisis communication with employees skillfully. The second section points out…
Full Text Available Abstract The purpose of this study was to determine the influence of business processes on the quality of accounting information systems. The study was theoretical research that considered the role of business processes in the quality of accounting information systems, using secondary data collection. The results showed that business processes have a significant effect on the quality of accounting information systems.
Kim, Du Gyu; Lee, JaeMu
This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…
Khalid, Md. Saifuddin; Hossain, Mohammad Shahadat; Rongbutsri, Nikorn
administration and evaluation and assessment. Educational environments are flexible and not governed by standard operating procedures, making technology use lithe. The theory of diffusion of innovations is recommended to be integrated to reason about and measure acceptance or rejection of EPR-selected technology......In technology mediated learning, while the relative advantages of technologies are proven, lack of contextualization and process-centric change, and lack of user-driven change, have kept intervention and adoption of educational technologies among individuals and organizations as challenges. Reviewing...... the formal, informal and non-formal learning environments, this study focuses on the formal part. This paper coins the term 'Educational Process Reengineering' (EPR) based on the established concept of 'Business Process Reengineering' (BPR) for process improvement of teaching learning activities, academic...
Sadjadi, Firooz; Sadjadi, Farzad
Polarimetric sensing is an area of active research in a variety of applications. In particular, the use of polarization diversity has been shown to improve performance in automatic target detection and recognition. Within the diverse scope of polarimetric sensing, the field of passive polarimetric sensing is of particular interest. This chapter presents several new methods for gathering information using such passive techniques. One method extracts three-dimensional (3D) information and surface properties using one or more sensors. Another method extracts scene-specific algebraic expressions that remain unchanged under polarization transformations (such as along the transmission path to the sensor).
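The chapter's specific invariant expressions are not given here, but a standard instance of a polarization-transformation invariant illustrates the idea: rotating the sensor's reference frame by an angle theta rotates the Stokes components (Q, U) by 2*theta, leaving the intensity I and the degree of linear polarization unchanged. A minimal sketch, with illustrative function names:

```python
import math

def rotate_stokes(s, theta):
    """Rotate the polarization reference frame by theta (radians).
    (Q, U) rotate by 2*theta; I and V are unchanged."""
    i, q, u, v = s
    c, sn = math.cos(2 * theta), math.sin(2 * theta)
    return (i, c * q + sn * u, -sn * q + c * u, v)

def dolp(s):
    """Degree of linear polarization: sqrt(Q^2 + U^2) / I."""
    i, q, u, _ = s
    return math.hypot(q, u) / i

s = (1.0, 0.3, 0.4, 0.1)  # example Stokes vector (I, Q, U, V)
for th in (0.2, 1.0, 2.5):
    r = rotate_stokes(s, th)
    # DoLP is an invariant: identical before and after rotation.
    assert abs(dolp(r) - dolp(s)) < 1e-12
print(dolp(s))
```

Quantities that survive such transformations are exactly what makes a scene signature usable when the polarization state is altered along the path to the sensor.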
Jespersen, Kristina Risom
collection platform to obtain measurements from within the NPD process. 42 large, international companies participated in the data collecting simulation. Results revealed five different information paths that were not connecting all stages of the NPD process. Moreover, results show that the front-end is not driving the information acquisition through the stages of the NPD process, and that environmental turbulence disconnects stages from the information paths in the NPD process. This implies that information is at the same time a key to success and a key to entrapment in the NPD process.
Vatansever, Deniz; Menon, David K; Stamatakis, Emmanuel A
Concurrent with mental processes that require rigorous computation and control, a series of automated decisions and actions govern our daily lives, providing efficient and adaptive responses to environmental demands. Using a cognitive flexibility task, we show that a set of brain regions collectively known as the default mode network plays a crucial role in such "autopilot" behavior, i.e., when rapidly selecting appropriate responses under predictable behavioral contexts. While applying learned rules, the default mode network shows both greater activity and connectivity. Furthermore, functional interactions between this network and hippocampal and parahippocampal areas as well as primary visual cortex correlate with the speed of accurate responses. These findings indicate a memory-based "autopilot role" for the default mode network, which may have important implications for our current understanding of healthy and adaptive brain processing.
Razvan Daniel ZOTA
Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study shows how organizations may achieve more consistent business processes while operating with cloud computing technologies.
Agriculture, a study centre of the National Open University (NOUN), a State ... to impart knowledge of the various subjects in students. ..... Information-seeking behaviour of International Islamic ... Malaysian Journal of Library & Information.
US Agency for International Development — The CARPE Information Management Tool (CARPE IMT), available in both French and English, organizes information and reports from its partners for the 12 CARPE/CBFP...
Mosier, K. L.; Hart, S. G.
State-of-the-art flight technology has restructured the task of human operators, decreasing the need for physical and sensory resources, and increasing the quantity of cognitive effort required, changing it qualitatively. Recent technological advances have the most potential for impacting a pilot in two areas: performance and mental workload. In an environment in which timing is critical, additional cognitive processing can cause performance decrements, and increase a pilot's perception of the mental workload involved. The effects of stimulus processing demands on motor response performance and subjective mental workload are examined, using different combinations of response selection and target acquisition tasks. The information processing demands of the response selection were varied (e.g., Sternberg memory set tasks, math equations, pattern matching), as was the difficulty of the response execution. Response latency as well as subjective workload ratings varied in accordance with the cognitive complexity of the task. Movement times varied according to the difficulty of the response execution task. Implications in terms of real-world flight situations are discussed.
Hendrickson, David A.; Bayer, Gregory W.
The ES&H Information Systems department, motivated by the numerous isolated information technology systems under its control, undertook a significant integration effort. This effort was planned and executed over the course of several years and parts of it still continue today. The effect was to help move the ES&H Information Systems department toward integration with the corporate Information Solutions and Services center.
Based upon quantum-inspired entanglement in quantum-classical hybrids, a simple algorithm for instantaneous transmission of non-intentional messages (chosen at random) to remote distances is proposed. The idea is to implement instantaneous transmission of conditional information over remote distances via a quantum-classical hybrid that preserves superposition of random solutions, while allowing one to measure its state variables using classical methods. Such a hybrid system reinforces the advantages, and minimizes the limitations, of both quantum and classical characteristics. Consider n observers, and assume that each of them gets a copy of the system and runs it separately. Although they run identical systems, the outcomes of even synchronized runs may be different because the solutions of these systems are random. However, the global constraint must be satisfied. Therefore, if observer #1 (the sender) made a measurement of the acceleration v(sub 1) at t = T, then the receiver, by measuring the corresponding acceleration v(sub 1) at t = T, may get a wrong value because the accelerations are random, and only their ratios are deterministic. Obviously, the transmission of this knowledge is instantaneous as soon as the measurements have been performed. In addition, the distance between the observers is irrelevant because the x-coordinate does not enter the governing equations. However, the Shannon information transmitted is zero. None of the senders can control the outcomes of their measurements because they are random. The senders cannot transmit intentional messages. Nevertheless, based on the transmitted knowledge, they can coordinate their actions based on conditional information. If observer #1 knows his own measurements, the measurements of the others can be fully determined. It is important to emphasize that the origin of entanglement of all the observers is the joint probability density that couples their actions. There is no centralized source
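As a toy illustration of the key property claimed above — individual outcomes are random, but a global constraint fixes their ratios, so one observer's measurement determines the others' — consider the following sketch. The ratios and the model itself are invented stand-ins, not the paper's hybrid dynamics:

```python
import random

RATIOS = (1.0, 2.0, 5.0)  # global constraint: fixed ratios among 3 observers

def run_system(seed):
    """One synchronized run: a shared random scale multiplies the fixed
    ratios. Individual values are random; their ratios are deterministic."""
    scale = random.Random(seed).uniform(0.1, 10.0)
    return [r * scale for r in RATIOS]

v = run_system(42)
# No one controls v[0] (it is random), but once observer #1 measures it,
# the other observers' values are fully determined via the known ratios:
inferred_v2 = v[0] * RATIOS[1] / RATIOS[0]
assert abs(inferred_v2 - v[1]) < 1e-12
```

Consistent with the abstract, no Shannon information is transmitted here: the measured value cannot be chosen, so no intentional message can be encoded in it.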
Kodym, Oldřich; Unucka, Jakub
The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and information usage in the information systems of logistics firms to support selected processes at the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed, and compliance with existing standards is mentioned. Security of information across the full life cycle is an integral part of the presented system. The design of a fully equipped system based on synthesized functional nodes is presented.
Bell, Philip, Ed.; Lewenstein, Bruce, Ed.; Shouse, Andrew W., Ed.; Feder, Michael A., Ed.
Informal science is a burgeoning field that operates across a broad range of venues and envisages learning outcomes for individuals, schools, families, and society. The evidence base that describes informal science, its promise, and effects is informed by a range of disciplines and perspectives, including field-based research, visitor studies, and…
Conforti, R.; Leoni, de M.; La Rosa, M.; Aalst, van der W.M.P.; Salinesi, C.; Norrie, M.C.; Pastor, O.
This paper proposes a technique that supports process participants in making risk-informed decisions, with the aim of reducing process risks. Risk reduction involves decreasing the likelihood and severity of a process fault occurring. Given a process exposed to risks, e.g. a financial process
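The abstract defines risk reduction in terms of the likelihood and severity of process faults. A minimal sketch of comparing decision alternatives by expected risk (likelihood times severity); the fault profiles and numbers below are hypothetical, not from the paper:

```python
def expected_risk(faults):
    """Expected process risk: sum over faults of likelihood * severity."""
    return sum(p * sev for p, sev in faults)

# Hypothetical (likelihood, cost severity) fault profiles for two
# decision alternatives at some point in a process:
baseline  = [(0.10, 5000.0), (0.02, 20000.0)]
mitigated = [(0.04, 5000.0), (0.02, 8000.0)]   # after a risk-informed choice

print(round(expected_risk(baseline)), round(expected_risk(mitigated)))  # 900 360
```

A risk-informed recommendation would steer the participant toward the alternative with the lower expected risk, here the mitigated one.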
Amy Roehl, MFA
Full Text Available Currently, major shifts occur in design processes affecting business practices for industries involved with designing and delivering the built environment. These changing conditions are a direct result of industry adoption of a relatively new technology called BIM, or Building Information Modeling. This review of literature examines the implications of these changing processes for interior design education.
de Keijzer, Ander; van Keulen, Maurice
At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time-consuming job. Automating this process would assist the user in this task. Integration
Faherty, III, David E
...) the relationship between network density and group performance. The results of this exploration, though mostly inconclusive, call into question both intuition and social network analysis literature...
The goal is the construction of the Asian Environment Information Network (AEInet) in accordance with a contract signed between Indonesia's LIPI (Indonesian Institute of Science) and NEDO under NEDO's Research Cooperation Project Concerning the Development of Environment Measuring Laser Radar (LR). The network is designed to operate over a private line between Indonesia and Japan via IP (Internet Protocol) and to enable the exchange over the Internet of the data collected and analyzed by the Indonesian LR system, and of e-mail between scientists of the two countries. The AEInet will be utilized for the collection and analysis of LR-collected data; the exchange of observed data and processing results; support for environment information scientists in exchanging e-mail and information; and database searches for the implementation of the project. In this paper, the outline and functions of the system, network system design, WWW server construction, network operating status, joint research with Indonesia, etc., are described. (NEDO)
Kenneth T. Sullivan
Full Text Available Construction professionals have identified public contract law and bureaucratic procurement/contract offices as a source of problems in the construction industry. The culture within United States Federal Government acquisitions is based on the Federal Acquisition Regulations (FARs) and their interpretation, often placing organizations/agencies in a price-based environment and continuously resulting in poor performance. The United States Army Medical Command (MEDCOM), with approximately $100M in construction renovation awards per year, attempted to overcome this obstacle through a partnership with the Performance-Based Studies Research Group (PBSRG) at Arizona State University. The MEDCOM implemented the information environment portion of the Performance Information Procurement System (PIPS) into Indefinite Delivery Indefinite Quantity (IDIQ) contracts through the specifications. Without controlling the various contract/procurement processes, the developed information environment stimulated an atmosphere of accountability for all parties involved, while reducing the client's internal bureaucratic resistance. The concept has met with preliminary success, minimizing construction management issues by over 50%, raising owner satisfaction by 9%, resulting in 99% of projects ending with no contractor-generated change orders, and assisting MEDCOM leadership in measuring the performance of their infrastructure revitalization program.
Hayward, Tim; Broady, Judith E.
Presents research on the use of external information in the strategic management of retail banks in the United Kingdom. Explores the organizational role of the environmental analysis department, the character of business environment analysis, and the nature of information used in strategic management and its perceived importance. (Author/AEF)
Centre for Theoretical Studies and Supercomputer Education and Research Centre, ... the parent to the offspring, sensory information conveyed by the sense organ to the .... The task involved in genetic information processing is ASSEMBLY.
Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data and therefore are close to
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
Many companies today have to deal with business process optimization, the ongoing removal of system obstacles, and the identification of any bottlenecks lying in the way that prevent them from reaching their goals. I would like to show how we have dealt with similar problems in our case, applied in the department of web application maintenance. In the first chapter I will introduce basic information about the history, the environment the company operates in, and the maintenance activities which are part of the p...
A. S. Garov
Full Text Available We are developing a unified distributed communication environment for processing of spatial data which integrates web-, desktop- and mobile platforms and combines volunteer computing model and public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for virtual globe system.
Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.
relationship between acuity and light sensitivity. Animals have evolved a wide variety of solutions to this problem, such as folded membranes, to provide larger receptive surfaces, and lenses, to focus light onto the receptive membranes. On the neural capacity side, complex eyes demand huge processing networks … animals in a wide range of behaviours. It is intuitive that a complex eye is energetically very costly, not only in components but also in neural involvement. The increasing behavioural demand added pressure on design specifications, and eye evolution is considered an optimization of the inverse … fit their need. Visual neuroethology integrates optics, sensory equipment, neural networks and motor output to explain how animals can perform behaviour in response to a specific visual stimulus. In this doctoral thesis, I will elucidate the individual steps in a visual neuroethological pathway…
Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.
This article addresses improving the analysis of a technological process while implementing information security requirements. The aim of this research is to analyze how information technology implementation can increase the competitiveness of aircraft industry enterprises, using the tube bending technological process as an example. The article analyzes kinds of tube bending and current techniques. In addition, a potential risk analysis of the tube bending technological process is carried out in terms of information security.
Farnsworth, Keith D.; Nelson, John; Gershenson, Carlos
We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function - to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, info...
Stephens, Karen; Herman, Melody; Griffin, Brand
This slide presentation reviews NASA's process for prioritizing technology requirements in a competitive environment, using the In-Space Propulsion Technology (ISPT) project as an example. The ISPT project focuses on mid-level Technology Readiness Levels (TRLs) for development: TRLs 4 through 6 (i.e., Technology Development and Technology Demonstration). The objective of the planning activity is to identify the current most likely date each technology is needed and to create ISPT technology development schedules based on these dates. There is a minimum of 4 years between flight and pacing mission. The ISPT project needed to identify the "pacing mission" for each technology in order to provide funding for each area. Graphic representations show the development of the process. A matrix shows which missions currently receive pull from both the Solar System Exploration and the Sun-Solar System Connection Roadmaps. The timeframes of the pacing missions' technologies are shown for various types of propulsion. A pacing mission in the near future serves to increase the priority for funding. Adaptations were made when budget reductions precluded the total implementation of the plan.
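The four-year lead between a technology's readiness and its pacing mission, as described in the abstract, amounts to a simple scheduling rule. A minimal sketch (the function name and years are illustrative, not taken from the ISPT plan):

```python
def technology_need_date(pacing_mission_launch_year, lead_years=4):
    """Latest year a technology must reach flight readiness, given a
    minimum lead time of `lead_years` before its pacing mission flies."""
    return pacing_mission_launch_year - lead_years

# A hypothetical pacing mission launching in 2016 pulls the
# technology need date back to 2012.
need_date = technology_need_date(2016)
```

Sorting technologies by `need_date` then yields the funding-priority ordering the abstract describes: the nearer the pacing mission, the higher the priority.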
Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein
Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noi...
O. F. Bryksina
Full Text Available The article substantiates the advantages of building the information-educational environment of a basic professional educational program on cloud technologies. A universal tool for building such an environment is the Google Apps for Education service suite, which allows all participants of the educational process to cooperate effectively, plan collaborative activities, and allocate resources properly, and which provides the tools needed to solve various learning tasks. Examples are given of using various Google services to organize the collaborative activities of teachers of the Department of Applied Informatics and Information Technologies in Education of the Minin Nizhny Novgorod State Pedagogical University, improving implementation of the basic professional educational program in the direction of preparation "Information Systems and Technology". The core of the information-educational environment of the program is a Google site that integrates different Google services and Google Apps applications.
Sharif, Amir M
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Representing knowledge as information content alone is insufficient in providing us with an understanding of the world around us. A combination of context as well as reasoning of the information content is fundamental to representing knowledge in an information system. Knowledge Representation is typically concerned with providing structures and theories that are used as a basis for intellige...
The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built and, while it showed promise, a major disadvantage was seen in that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, it was found that its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added. These services include such things as coordinate transformations and subsetting of data. Collaborators from the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data
Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry
The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The characteristics of the resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include crystalline, thermal, and electrical properties. The mechanical properties include compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 C are attainable using this heating method.
Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current generation high burnup metallic fuel elements. These are sodium-bonded, stainless steel clad fuel pins of U-Zr or U-Pu-Zr composition, typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session
Thatcher, W.W.; Collier, R.J.; Beede, D.K.; Wilcox, C.J.
Effects of environment on reproductive processes of female cattle are described. Variation in fertility responses is divided into environmental and genetic effects. Environmental effects are defined as all effects non-genetic, and emphasis is placed on the quantitation of various climatic measurements (e.g. maximum environmental temperature) associated with reproductive performance. Sensitivities of various reproductive events to thermal stress are defined and include such responses as: reproductive behaviour; hormonal balance related to both reproductive events and metabolic adaptations; alterations in uterine blood flow; embryonic death; conceptus development; placental function; and restoration of postpartum reproductive function. Also discussed are several management strategies to improve animal productivity, including environmental modification and the potential for genetic development of less heat-sensitive animals. (author)
Alsibyani, Hassan M.
Cloud computing usage is increasing and a common concern is the privacy and security of the data and computation. Third party cloud environments are not considered fit for processing private information because the data will be revealed to the cloud provider. However, Trusted Execution Environments (TEEs), such as Intel SGX, provide a way for applications to run privately and securely on untrusted platforms. Nonetheless, using a TEE by itself for stream processing systems is not sufficient since network communication patterns may leak properties of the data under processing. This work addresses leaky topology structures and suggests mitigation techniques for each of these. We create specific metrics to evaluate leaks occurring from the network patterns; the metrics measure information leaked when the stream processing system is running. We consider routing techniques for inter-stage communication in a streaming application to mitigate this data leakage. We consider a dynamic policy to change the mitigation technique depending on how much information is currently leaking. Additionally, we consider techniques to hide irregularities resulting from a filtering stage in a topology. We also consider leakages resulting from applications containing cycles. For each of the techniques, we explore their effectiveness in terms of the advantage they provide in overcoming the network leakage. The techniques are tested partly using simulations and some were implemented in a prototype SGX-based stream processing system.
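One way to make the idea of a network-pattern leakage metric concrete is to measure the entropy of per-window message counts between stages: data-dependent rate variation yields high entropy, while constant-rate padding (one common mitigation) drives it to zero. This is a minimal illustrative sketch under assumed names, not the metric or mitigation actually used in the thesis:

```python
import math
from collections import Counter

def rate_entropy(counts):
    """Shannon entropy (bits) of the observed per-window message counts.
    Higher entropy means the traffic pattern is more distinguishable,
    i.e., it can carry more information about the data being processed."""
    total = len(counts)
    freq = Counter(counts)
    return -sum((n / total) * math.log2(n / total) for n in freq.values())

def pad_to_constant(counts, target):
    """Mitigation sketch: pad every window up to a constant rate
    with dummy messages so all windows look identical."""
    return [max(c, target) for c in counts]

observed = [3, 7, 2, 9, 3, 7, 1, 8]   # data-dependent rate variation leaks
padded = pad_to_constant(observed, target=9)

assert rate_entropy(padded) == 0.0    # constant traffic reveals nothing
```

A dynamic policy of the kind the abstract mentions could monitor `rate_entropy` at runtime and enable padding only when the measured leakage exceeds a threshold, trading bandwidth overhead against privacy.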
Mwakalinga, G Jeffy; Kowalski, Stewart; Yngström, Louise
In this paper, we describe a methodology for considering the culture of users and environments when developing information security systems. We discuss how researchers and developers of information security systems have had difficulty taking this culture into account, which has created environments where people serve technology instead of technology serving people. Users have been considered just as any other compo...
Rivoire, Olivier; Leibler, Stanislas
The notion of information pervades informal descriptions of biological systems, but formal treatments face the problem of defining a quantitative measure of information rooted in a concept of fitness, which is itself an elusive notion. Here, we present a model of population dynamics where this problem is amenable to a mathematical analysis. In the limit where any information about future environmental variations is common to the members of the population, our model is equivalent to known models of financial investment. In this case, the population can be interpreted as a portfolio of financial assets and previous analyses have shown that a key quantity of Shannon's communication theory, the mutual information, sets a fundamental limit on the value of information. We show that this bound can be violated when accounting for features that are irrelevant in finance but inherent to biological systems, such as the stochasticity present at the individual level. This leads us to generalize the measures of uncertainty and information usually encountered in information theory.
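The financial-investment equivalence referenced here is the classic Kelly "horse race" setting, where the value of perfect side information equals the mutual information between the environment and the cue (which, for perfect information, is just the entropy of the environment). A minimal numerical check with illustrative values, not taken from the paper's model:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.7, 0.3]       # probabilities of the two environment states
odds = [2.0, 2.0]    # even payout odds for betting on each state

# Optimal proportional (Kelly) betting without side information: bet b = p.
growth_no_info = sum(pi * math.log2(pi * oi) for pi, oi in zip(p, odds))

# With perfect side information, bet everything on the realized state.
growth_full_info = sum(pi * math.log2(oi) for pi, oi in zip(p, odds))

# The log-growth advantage equals H(p), the mutual information
# between the environment and a perfectly informative cue.
advantage = growth_full_info - growth_no_info
```

The abstract's point is that this bound can be violated once individual-level stochasticity is included, a feature absent from the portfolio interpretation sketched above.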
Wijnhoven, Fons; Dietz, Pim; Amrit, Chintan; Hercheui, Magda David; Whitehouse, Diane; McIver Jr., William J.; Phahlamohlaka, Jackie
Information technology is powered by electricity. Although its impact on Green House Gasses (GHG) is still rather limited, the next decade will show an explosion of its impact because technological innovations on data communication, information retrieval and datacenter operation will not compensate
Graph states are multiparticle states which are associated with graphs. Each vertex of the graph corresponds to a single system or particle. The links describe quantum correlations (entanglement) between pairs of connected particles. Graph states were initiated independently by two research groups: On the one hand, graph states were introduced by Briegel and Raussendorf as a resource for a new model of one-way quantum computing, where algorithms are implemented by a sequence of measurements at single particles. On the other hand, graph states were developed by the author of this thesis and Reinhard Werner in Braunschweig, as a tool to build quantum error correcting codes, called graph codes. The connection between the two approaches was fully realized in close cooperation of both research groups. This habilitation thesis provides a survey of the theory of graph codes, focussing mainly, but not exclusively, on the author's own research work. We present the theoretical and mathematical background for the analysis of graph codes. The concept of one-way quantum computing for general graph states is discussed. We explicitly show how to realize the encoding and decoding device of a graph code on a one-way quantum computer. This kind of implementation is to be seen as a mathematical description of a quantum memory device. In addition to that, we investigate interaction processes, which enable the creation of graph states on very large systems. Particular graph states can be created, for instance, by an Ising type interaction between next neighbor particles which sit at the points of an infinitely extended cubic lattice. Based on the theory of quantum cellular automata, we give a constructive characterization of general interactions which create a translationally invariant graph state. (orig.)
Borrelli, G; Marchetti, A [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente; Belli, M [WWF, Fondo Mondiale per la Natura, Rome (Italy)
Among the activities of ENEA (Italian National Agency for New Technologies, Energy, and the Environment), one deals with the analysis and strategies of environmental and scientific information. A questionnaire was created in collaboration with AIGA (Italian Environmental Journalist Association), UGIS (Italian Scientific Journalist Association) and WWF. The purpose of the work was to check the level of sensitivity of Italian journalists to environmental and scientific issues and to investigate the main obstacles facing their professional activity. Environmental and scientific problems are usually not correctly perceived by the public. These problems, in fact, undergo a 'closeness/distance' perception syndrome despite the fact that they are often presented and discussed in the media. The dichotomy may be explained by the following phenomenology: 1. the existence of the problem is well known, but the scientific and technological context is hard to comprehend; 2. longer and more intense debate about the problem increases its echo but simultaneously decreases real understanding of it. The public opinion response to the diffusion of news related to environmental and scientific themes of no immediate understanding is of great concern; 3. media organizations are not always suited to dealing with the advanced subject matter of environmental and scientific information.
Full Text Available Physics and information are intimately connected, and the ultimate information processing devices will be those that harness the principles of quantum mechanics. Many physical systems have been identified as candidates for quantum information processing, but none of them are immune from errors. The challenge remains to find a path from the experiments of today to a reliable and scalable quantum computer. Here, we develop an architecture based on a simple module comprising an optical cavity containing a single negatively charged nitrogen vacancy center in diamond. Modules are connected by photons propagating in a fiber-optical network and collectively used to generate a topological cluster state, a robust substrate for quantum information processing. In principle, all processes in the architecture can be deterministic, but current limitations lead to processes that are probabilistic but heralded. We find that the architecture enables large-scale quantum information processing with existing technology.
Weeks, G.E.; Daniel, W.E.; Edwards, R.E.; Jannarone, R.J.; Joshi, S.N.; Palakodety, S.S.; Qian, D.
For several decades, computerized information processing systems and human information processing models have developed with a good deal of mutual influence. Any comprehensive psychology text in this decade uses terms that originated in the computer industry, such as "cache" and "memory", to describe human information processing. Likewise, many engineers today are using "artificial intelligence" and "artificial neural network" computing tools that originated as models of human thought to solve industrial problems. This paper concerns a recently developed human information processing model, called "concurrent information processing" (CIP), and a related set of computing tools for solving industrial problems. The problem of focus is adaptive gauge monitoring; the application is pneumatic pressure repeaters (Holledge gauges) used to measure liquid level and density in the Defense Waste Processing Facility and the Integrated DWPF Melter System
Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid
Here we present an approach for processing images of neural cells to analyze their growth process in a culture environment. We have applied several image processing techniques for: (1) environmental noise reduction; (2) neural cell segmentation; (3) neural cell classification based on their dendrites' growth conditions; and (4) extraction and measurement of neuron features (e.g., cell body area, number of dendrites, axon length, and so on). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.
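Two of the pipeline stages above, segmentation and feature measurement, can be sketched in a few lines. The simple intensity threshold and the toy image here are illustrative stand-ins for the authors' actual techniques:

```python
def segment_cells(image, threshold):
    """Segmentation sketch: binary mask of pixels brighter than threshold."""
    return [[pixel > threshold for pixel in row] for row in image]

def cell_body_area(mask):
    """Feature-measurement sketch: segmented area in pixels."""
    return sum(sum(row) for row in mask)

# Toy 5x5 "micrograph": a bright 2x2 cell body on a dark background.
img = [[0.0] * 5 for _ in range(5)]
for r in (1, 2):
    for c in (1, 2):
        img[r][c] = 200.0

mask = segment_cells(img, threshold=100)
area = cell_body_area(mask)
```

In practice a plain threshold fails on noisy micrographs, which is precisely why the abstract resorts to a learned edge detector before segmentation.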
Full Text Available The article examines the development of visual learning theory, defines the concept of a "visual learning environment", and substantiates the didactic role of interactive and multimedia visualization processes, which stimulate students' cognitive activity and activate the perceptive mechanisms of learning. It specifies the functions of visual aids and the features of implementing this principle in the modern educational process, and it analyzes the problem of determining the potential of cognitive visualization in the algorithmic training of students, specifically future marine personnel, through the information and communication technologies of the educational environment.
Sandra Cristina Riascos Erazo
Full Text Available The impact of technology on administrative processes has improved business strategies (especially regarding the effect of information technology - IT), often leading to organisational success. Its effectiveness in this environment was thus modelled due to such importance; this paper describes studying a series of models aimed at assessing IT, its advantages and disadvantages. A model is proposed involving different aspects for an integral assessment of IT effectiveness and considering administrative activities' particular characteristics. This analytical study provides guidelines for identifying IT effectiveness in a business environment and current key strategies in technological innovation. This study was based on ISO 9126, ISO 9001, ISO 15939 and ISO 25000 standards as well as COBIT and CMM standards.
Grethe, Jeffrey S; Ross, Edward; Little, David; Sanders, Brian; Gupta, Amarnath; Astakhov, Vadim
This paper presents current progress in the development of semantic data integration environment which is a part of the Biomedical Informatics Research Network (BIRN; http://www.nbirn.net) project. BIRN is sponsored by the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH). A goal is the development of a cyberinfrastructure for biomedical research that supports advance data acquisition, data storage, data management, data integration, data mining, data visualization, and other computing and information processing services over the Internet. Each participating institution maintains storage of their experimental or computationally derived data. Mediator-based data integration system performs semantic integration over the databases to enable researchers to perform analyses based on larger and broader datasets than would be available from any single institution's data. This paper describes recent revision of the system architecture, implementation, and capabilities of the semantically based data integration environment for BIRN.
Aalst, van der W.M.P.; Jensen, K.; Aalst, van der W.M.P.
A Process-Aware Information System (PAIS) is a software system that manages and executes operational processes involving people, applications, and/or information sources on the basis of process models. Example PAISs are workflow management systems, case-handling systems, enterprise information
Andreadis, Konstantinos M.; Storck, Pascal; Lettenmaier, Dennis P.
The effects of forest canopies on snow accumulation and ablation processes can be very important for the hydrology of midlatitude and high-latitude areas. A mass and energy balance model for snow accumulation and ablation processes in forested environments was developed utilizing extensive measurements of snow interception and release in a maritime mountainous site in Oregon. The model was evaluated using 2 years of weighing lysimeter data and was able to reproduce the snow water equivalent (SWE) evolution throughout winters both beneath the canopy and in the nearby clearing, with correlations to observations ranging from 0.81 to 0.99. Additionally, the model was evaluated using measurements from a Boreal Ecosystem-Atmosphere Study (BOREAS) field site in Canada to test the robustness of the canopy snow interception algorithm in a much different climate. Simulated SWE was relatively close to the observations for the forested sites, with discrepancies evident in some cases. Although the model formulation appeared robust for both types of climates, sensitivity to parameters such as snow roughness length and maximum interception capacity suggested the magnitude of improvements of SWE simulations that might be achieved by calibration.
Tao, R.; Poslad, S.; Moßgraber, J.; Middleton, S.; Hammitzsch, M.
The EU FP7 TRIDEC project focuses on enabling real-time, intelligent, information management of collaborative, complex, critical decision processes for earth management. A key challenge is to promote a communication infrastructure to facilitate interoperable environment information services during environment events and crises such as tsunamis and drilling, during which increasing volumes and dimensionality of disparate information sources, including sensor-based and human-based ones, can result, and need to be managed. Such a system needs to support: scalable, distributed messaging; asynchronous messaging; open messaging to handling changing clients such as new and retired automated system and human information sources becoming online or offline; flexible data filtering, and heterogeneous access networks (e.g., GSM, WLAN and LAN). In addition, the system needs to be resilient to handle the ICT system failures, e.g. failure, degradation and overloads, during environment events. There are several system middleware choices for TRIDEC based upon a Service-oriented-architecture (SOA), Event-driven-Architecture (EDA), Cloud Computing, and Enterprise Service Bus (ESB). In an SOA, everything is a service (e.g. data access, processing and exchange); clients can request on demand or subscribe to services registered by providers; more often interaction is synchronous. In an EDA system, events that represent significant changes in state can be processed simply, or as streams or more complexly. Cloud computing is a virtualization, interoperable and elastic resource allocation model. An ESB, a fundamental component for enterprise messaging, supports synchronous and asynchronous message exchange models and has inbuilt resilience against ICT failure. Our middleware proposal is an ESB based hybrid architecture model: an SOA extension supports more synchronous workflows; EDA assists the ESB to handle more complex event processing; Cloud computing can be used to increase and
My aim here is to present a broad-brush overview of some of the most important roles that information has been found to play as a tool for promoting environmentally responsible consumer behaviour. Because this workshop is organized by a network of economists, I will start with the importance of information for getting the full potential out of economic instruments. However, my main emphasis will be on the importance of information for creating and facilitating consumers' willing participation in solving environmental problems that are in some way related to their behaviour as consumers. Information may be even more important for furthering other important types of behaviour, such as voter behaviour or activist behaviour, but I won't discuss the specific issues about promoting these types of behaviour today.
four regional commands (Africa Command, Central Command, European Command, and Pacific Command). On behalf of their powerful 4-star combatant … nations, and the U.S. DoD's Assistant Secretary of Defense (ASD) for Networks and Information Integration (NII). While this arrangement attempted to … leadership to seek a more coherent and enduring approach to information sharing. As ISAF expanded operations throughout the country, it added more
Vymětal, Dominik; Suchánek, Petr
In today's digital 21st century, almost all businesses face intense competition from competitors around the globe. There are no borders, and the business area for all companies is almost unlimited; the main drivers of this are globalization and the development of ICT. Influences such as globalization and the increased popularity of outsourcing and offshoring have recently combined to produce an environment where ICT graduates need to have up-to-date and industry-relevant knowledge and sk...
It is generally accepted that human vision is an extremely powerful information processing system that facilitates our interaction with the surrounding world. However, despite extended and extensive research efforts, which encompass many exploration fields, the underlying fundamentals and operational principles of visual information processing in human brain remain unknown. We still are unable to figure out where and how along the path from eyes to the cortex the sensory input perceived by the retina is converted into a meaningful object representation, which can be consciously manipulated by the brain. Studying the vast literature considering the various aspects of brain information processing, I was surprised to learn that the respected scholarly discussion is totally indifferent to the basic keynote question: "What is information?" in general or "What is visual information?" in particular. In the old days, it was assumed that any scientific research approach has first to define its basic departure points. Why was it overlooked in brain information processing research remains a conundrum. In this paper, I am trying to find a remedy for this bizarre situation. I propose an uncommon definition of "information", which can be derived from Kolmogorov's Complexity Theory and Chaitin's notion of Algorithmic Information. Embracing this new definition leads to an inevitable revision of traditional dogmas that shape the state of the art of brain information processing research. I hope this revision would better serve the challenging goal of human visual information processing modeling.
Nijstad, B.A.; de Dreu, C.K.W.
Much of the research into group and team functioning looks at groups that perform cognitive tasks, such as decision making, problem solving, and innovation. The Motivated Information Processing in Groups Model (MIP-G; De Dreu, Nijstad, & Van Knippenberg, 2008) conjectures that information processing
Leach, Mark M; Stoltenberg, Cal D.
The relationship between mood and information processing, particularly when reviewing the Elaboration Likelihood Model of persuasion, lacks conclusive evidence. This study was designed to investigate the hypothesis that information processing would be greater for mood-topic congruence than non mood-topic congruence. Undergraduate students (N=216)…
Zimmermann, Peter; Iwanski, Alexandra
Attachment theory suggests that internal working models of self and significant others influence adjustment during development by controlling information processing and self-regulation. We provide a conceptual overview on possible mechanisms linking attachment and information processing and review the current literature in middle childhood.…
de Dreu, C.K.W.; Beersma, B.
According to the Motivated Information Processing in Groups (MIP-G) model, groups should perform ambiguous (non-ambiguous) tasks better when they have high (low) epistemic motivation and concomitant tendencies to engage in systematic (heuristic) information processing and exchange. The authors
Makatun, Dzmitry; Lauret, Jérôme; Šumbera, Michal
Processing data in a distributed environment has found application in many fields of science (Nuclear and Particle Physics (NPP), astronomy, and biology, to name only a few). Efficiently transferring data between sites is an essential part of such processing. The implementation of caching strategies in data transfer software and tools, such as the Reasoner for Intelligent File Transfer (RIFT) being developed in the STAR collaboration, can significantly decrease network load and waiting time by reusing knowledge of data provenance, as well as data already placed in the transfer cache, to further expand the availability of sources for files and data-sets. Although a great variety of caching algorithms is known, a study is needed to evaluate which one delivers the best data-access performance under realistic demand patterns. Records of access to the complete data-sets of NPP experiments were analyzed and used as input for computer simulations. A series of simulations was run to estimate the possible cache hits and cache hits per byte for known caching algorithms. The simulations covered cache sizes in the interval 0.001–90% of the complete data-set and low-watermarks within 0–90%. Records of data access were taken from several experiments and different time intervals in order to validate the results. In this paper, we discuss different data caching strategies, from canonical algorithms to hybrid cache strategies, present the results of our simulations for the diverse algorithms, and identify the best algorithm in the context of physics data analysis in NPP. While the results of these studies have been implemented in RIFT, they can also be used when setting up a cache in any other computational workflow (Cloud processing, for example) or when managing data storage with partial replicas of the entire data-set.
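The kind of trace-driven cache simulation described above can be sketched in a few lines. This is only a hedged illustration under assumed inputs (RIFT's actual algorithms, traces, and low-watermark logic are not reproduced here): a canonical LRU cache is replayed against a synthetic, skewed access trace to estimate its hit ratio.

```python
from collections import OrderedDict

def simulate_lru(trace, cache_size):
    """Replay an access trace through an LRU cache; return the hit ratio."""
    cache = OrderedDict()
    hits = 0
    for item in trace:
        if item in cache:
            hits += 1
            cache.move_to_end(item)  # mark as most recently used
        else:
            cache[item] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(trace)

# Skewed synthetic trace: a few "hot" files dominate accesses,
# followed by a tail of one-off "cold" requests.
trace = [f"file{i % 10}" for i in range(900)] + [f"cold{i}" for i in range(100)]
assert simulate_lru(trace, cache_size=10) > 0.8
```

Evaluating other policies (LFU, hybrids) is a matter of swapping the eviction rule while keeping the same trace replay, which mirrors how such simulation studies compare algorithms on equal footing.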
Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, André; Wörgötter, Florentin; Tetzlaff, Christian
Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study how several parameters of the fly's olfactory system and of the environment influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body, and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and the firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show the expected linear relation between the antennal lobe-to-mushroom body connectivity rate and the firing threshold of the Kenyon cells required to obtain maximum mutual information for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentrations. When inhibition on the mushroom body is included, however, mutual information remains at high levels independent of the other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which the system's efficiency would be substantially reduced.
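The mutual-information measure used above can be estimated directly from paired discrete observations. The following sketch is illustrative only (the binary stimulus/response encodings are assumptions for the example, not the paper's model of odor concentrations or Kenyon-cell activity):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits for paired discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts rescaled by n
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

stim = [0, 1] * 50  # alternating "low"/"high" odor stimulus (toy encoding)

# Responses that perfectly track the stimulus recover H(X) = 1 bit.
assert abs(mutual_information(stim, stim) - 1.0) < 1e-9
# Constant responses carry no information about the stimulus.
assert mutual_information(stim, [0] * 100) == 0.0
```

In the paper's setting the same quantity is computed between simulated odor concentrations and Kenyon-cell sub-population activity; only the alphabet of the variables changes.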
Hansen, Steffen Foss; Hartmann, Nanna B.; Baun, Anders
assessment. Chemical fate modelling is one approach to fill this gap within a short time frame. To ensure the reliability of predicted environmental concentrations informed choices are needed during model formulation and development. A major knowledge gap, hampering the further development of such model...... present in the environment. Specific nanomaterials are used as case studies to illustrate these processes. Key environmental processes are identified and ranked and key knowledge gaps are identified, feeding into the longer-term goal of improving the existing models for predicted environmental...
Tempelmans Plat, H.; Deiman, E.P.; Beheshti, M.R.; Zreik, K.
Adequate decision making in the design process needs information about cost consequences over the life of the designed object. In succeeding stages the types of decisions change; as a consequence, the type of cost information will differ as well. For each stage, cost information about realized
Information is defined as a basic resource of enterprise activity. A proposal is put forward concerning the selection of information subsystems for strategic, tactical, and operational management. A list of indicators for assessing the information support of an enterprise's functional processes is offered.
de Dreu, C.K.W.; Nijstad, B.A.; van Knippenberg, D.
This article expands the view of groups as information processors into a motivated information processing in groups (MIP-G) model by emphasizing, first, the mixed-motive structure of many group tasks and, second, the idea that individuals engage in more or less deliberate information search and
Hu, Dewen; Liu, Huaping
"Foundations and Practical Applications of Cognitive Systems and Information Processing" presents selected papers from the First International Conference on Cognitive Systems and Information Processing, held in Beijing, China on December 15-17, 2012 (CSIP2012). The aim of this conference is to bring together experts from different fields of expertise to discuss the state-of-the-art in artificial cognitive systems and advanced information processing, and to present new findings and perspectives on future development. This book introduces multidisciplinary perspectives on the subject areas of Cognitive Systems and Information Processing, including cognitive sciences and technology, autonomous vehicles, cognitive psychology, cognitive metrics, information fusion, image/video understanding, brain-computer interfaces, visual cognitive processing, neural computation, bioinformatics, etc. The book will be beneficial for both researchers and practitioners in the fields of Cognitive Science, Computer Science and Cogni...
logistics assessment generates some of this information, its relevance for the decision makers, and relationship to their unpredictability from foreign national logistics systems remains indefinite. This paper identifies and categorises the relevant, available information on country logistics environments...... by using a content analysis approach. We demonstrate the immensity and nature of this information, are able to confirm the changing spatial transaction cost structures, and to reflect upon the overall conditions of information-related complexity and globalisation in the environment. Besides making...
This paper analyzes the effect of the network environment on uranium mining and metallurgy information services. It introduces measures such as strengthening the construction of specialized literature resources, changing the service mode, building information navigation, deepening services to meet the individual needs of users, raising librarians' qualifications, and promoting the co-construction and sharing of library information resources, and it puts forward ideas for developing uranium mining and metallurgy library information services in the network environment. (author)
increments: 1.0 by the end of the first quarter of fiscal year 2017, 1.5 in the third quarter of fiscal year 2017, and 2.0 by the end of fiscal year 2019...planning, programming, budgeting, and execution process. The purpose of the process is to allocate resources to programs within the department. An...Institute, effective program financial management includes integrating the budgets of program components (e.g., projects), developing an overall
Angelarosa Longo; Viviana Ventre
Rational models in decision processes are marked by many anomalies caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single- or multi-agent decision process, each mental model is influenced by the presence, the absence, or false information about the problem or about other members of the decision-making group. The difficulty in modeling these effects increases because behavioral biases also influence the m...
Fenichel, Marilyn; Schweingruber, Heidi A.
Practitioners in informal science settings--museums, after-school programs, science and technology centers, media enterprises, libraries, aquariums, zoos, and botanical gardens--are interested in finding out what learning looks like, how to measure it, and what they can do to ensure that people of all ages, from different backgrounds and cultures,…
Information Technology (IT) business value research is suggested as fundamental to the contribution of the IS discipline. The IS research community has accumulated a critical mass of IT business value studies, but only limited or mixed results have been found on the direct relationship between IT and firm performance. Extant studies mostly focus…
Lunin, Luis F., Ed.; D'Elia, George, Ed.
Introduces eight articles on the Integrated Information Center (IIC) Project, which investigated significant behavioral, technological, organizational, financial, and legal factors involved in the management of IICs. Four articles address design and management issues of general interest, and four focus on specific design considerations and a…
Agrawal, Prathima; Hyden, Eoin; Krzyzanowsji, Paul; Srivastava, Mani B.; Trotter, John
Anytime, anywhere wireless access to databases, such as medical and inventory records, can simplify workflow management in a business and reduce or even eliminate the cost of moving paper documents. Moreover, continual progress in wireless access technology promises to provide per-user bandwidths on the order of a few Mbps, at least in indoor environments. When combined with the emerging high-speed integrated-services wired networks, it enables ubiquitous and tetherless access to, and processing of, multimedia information by mobile users. To leverage this synergy, an indoor wireless network based on room-sized cells and multimedia mobile end-points is being developed at AT&T Bell Laboratories. This research network, called SWAN (Seamless Wireless ATM Networking), allows users carrying multimedia end-points such as PDAs, laptops, and portable multimedia terminals to seamlessly roam while accessing multimedia data streams from the wired backbone network. A distinguishing feature of the SWAN network is its use of end-to-end ATM connectivity, as opposed to the connectionless mobile-IP connectivity used by present-day wireless data LANs. This choice allows the wireless resource in a cell to be intelligently allocated among various ATM virtual circuits according to their quality-of-service requirements. But an efficient implementation of ATM in a wireless environment requires a proper mobile network architecture. In particular, the wireless link and medium-access layers need to be cognizant of the ATM traffic, while the ATM layers need to be cognizant of the mobility enabled by the wireless layers. This paper presents an overview of SWAN's network architecture, briefly discusses the issues in making ATM mobile and wireless, and describes initial multimedia applications for SWAN.
Ostapczuk, Peter; Zoriy, Petro; Dederichs, Herbert; Lennartz, Reinhard
Based on thorium and uranium determinations in soil and plant samples collected in the region of Aktau, Kazakhstan, the distribution pattern of environmental pollution by these elements was correlated with the radiation dose. The main radiation source was the waste deposit of equipment used in uranium processing (dose higher than 5 μSv/h). The mining area, and also the transportation route from the mine to the uranium factory, has a radiation impact that is difficult to estimate. Based on the data from plant and soil samples, the whole area under study has a higher pollution level of thorium and uranium than the control area (about 0.1 μSv/h). Owing to the observed strong winds blowing in different directions, it is possible that particles of uranium ore have been transported over long distances and have polluted the plants and the upper soil layer. Further investigations should provide more information about this supposition. (author)
Schwier, Richard A.; Seaton, J. X.
Does learner participation vary depending on the learning context? Are there characteristic features of participation evident in formal, non-formal, and informal online learning environments? Six online learning environments were chosen as epitomes of formal, non-formal, and informal learning contexts and compared. Transcripts of online…
Dykas, Matthew J; Ehrlich, Katherine B; Cassidy, Jude
This chapter describes theory and research on intergenerational connections between parents' attachment and children's social information processing, as well as between parents' social information processing and children's attachment. The chapter begins with a discussion of attachment theorists' early insights into the role that social information processing plays in attachment processes. Next, current theory about the mechanisms through which cross-generational links between attachment and social information processing might emerge is presented. The central proposition is that the quality of attachment and/or the social information processing of the parent contributes to the quality of attachment and/or social information processing in the child, and these links emerge through mediating processes related to social learning, open communication, gate-keeping, emotion regulation, and joint attention. A comprehensive review of the literature is then presented. The chapter ends with the presentation of a current theoretical perspective and suggestions for future empirical and clinical endeavors.
Hamilton, Rachel K B; Newman, Joseph P
Hamilton and colleagues (2015) recently proposed that an integrative deficit in psychopathy restricts simultaneous processing, thereby leaving fewer resources available for information encoding, narrowing the scope of attention, and undermining associative processing. The current study evaluated this parallel processing deficit proposal using the Simultaneous-Sequential paradigm. This investigation marks the first a priori test of Hamilton et al.'s theoretical framework. We predicted that psychopathy would be associated with inferior performance (as indexed by lower accuracy and longer response time) on trials requiring simultaneous processing of visual information relative to trials necessitating sequential processing. Results were consistent with these predictions, supporting the proposal that psychopathy is characterized by a reduced capacity to process multicomponent perceptual information concurrently. We discuss the potential implications of impaired simultaneous processing for the conceptualization of the psychopathic deficit. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Background: In daily life, we are exposed to different sound inputs simultaneously. During neural encoding in the auditory pathway, neural activities elicited by these different sounds interact with each other. In the present study, we investigated neural interactions elicited by a masker and an amplitude-modulated test stimulus in primary and non-primary human auditory cortex during ipsi-lateral and contra-lateral masking by means of magnetoencephalography (MEG). Results: We observed significant decrements of auditory evoked responses and a significant inter-hemispheric difference for the N1m response during both ipsi- and contra-lateral masking. Conclusion: The decrements of auditory evoked neural activities during simultaneous masking can be explained by neural interactions evoked by the masker and test stimulus in peripheral and central auditory systems. The inter-hemispheric differences of N1m decrements during ipsi- and contra-lateral masking reflect a basic hemispheric specialization contributing to the processing of complex auditory stimuli, such as speech signals, in noisy environments.
Lamp, Sandra A.
There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…
The present article highlights the importance of the motivational construct for the foreign language learning (FLL) process. More specifically, in the present article it is argued that motivation is likely to play a significant role at all three stages of the FLL process as they are discussed within the information processing model of FLL, namely,…
Occelli, Valeria; Spence, Charles; Zampini, Massimiliano
We highlight the results of those studies that have investigated the plastic reorganization processes that occur within the human brain as a consequence of visual deprivation, as well as how these processes give rise to behaviorally observable changes in the perceptual processing of auditory and tactile information. We review the evidence showing…
Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang
This paper is intended to propose a computational model for memory from the view of information processing. The model, called simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network by abstracting memory function and simulating memory information processing. At first meta-memory is defined to express the neuron or brain cortices based on the biology and graph theories, and we develop an intra-modular network with the modeling algorithm by mapping the node and edge, and then the bi-modular network is delineated with intra-modular and inter-modular. At last a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with the memory phenomena from information processing view.
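The bi-modular structure with intra-modular and inter-modular links, plus a polynomial-time retrieval step, can be sketched as follows. This is an illustrative toy only (SMIRN's actual modeling and retrieval algorithms are not specified here; the module names and bridge links are assumptions for the example):

```python
from itertools import combinations

def build_bimodular(module_a, module_b, inter_links):
    """Adjacency map: dense links inside each module, sparse bridges between."""
    adj = {n: set() for n in module_a + module_b}
    for module in (module_a, module_b):      # intra-modular: fully connected
        for u, v in combinations(module, 2):
            adj[u].add(v); adj[v].add(u)
    for u, v in inter_links:                 # inter-modular: explicit bridges
        adj[u].add(v); adj[v].add(u)
    return adj

def retrieve(adj, start, target):
    """Breadth-first retrieval path; polynomial in the number of nodes."""
    frontier, seen = [[start]], {start}
    while frontier:
        path = frontier.pop(0)
        if path[-1] == target:
            return path
        for nxt in sorted(adj[path[-1]] - seen):
            seen.add(nxt)
            frontier.append(path + [nxt])
    return None

adj = build_bimodular(["a1", "a2", "a3"], ["b1", "b2"], [("a3", "b1")])
assert retrieve(adj, "a1", "b2") == ["a1", "a3", "b1", "b2"]
```

The retrieval path necessarily crosses the single inter-modular bridge, which is the structural point such hierarchical memory models exploit.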
Recent psychophysical evidence indicates that the vertical arrangement of horizontal information is particularly important for encoding facial identity. In this paper we extend this notion to examine the role that information at different (particularly cardinal) orientations might play in a number of established phenomena, each a behavioural "signature" of face processing. In particular we consider (a) the face inversion effect (FIE), (b) the facial identity after-effect, (c) face-matching across viewpoint, and (d) interactive, so-called holistic, processing of face parts. We report that filtering faces to remove all but the horizontal information largely preserves these effects but, conversely, retaining only vertical information generally diminishes or abolishes them. We conclude that preferential processing of horizontal information is a central feature of human face processing that supports many of the behavioural signatures of this critical visual operation.
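Orientation filtering of the kind described above is commonly implemented in the Fourier domain. The following sketch is a hedged illustration, not the paper's procedure: the function name, the 20° bandwidth, and the mapping of "horizontal image structure" to the 90° frequency-plane angle are assumptions of this example.

```python
import numpy as np

def orientation_filter(image, center_deg, bandwidth_deg):
    """Keep only Fourier energy within +/- bandwidth of one orientation."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    # Orientation of each frequency component, folded into [0, 180)
    angle = np.degrees(np.arctan2(fy, fx)) % 180.0
    # Circular angular distance from the chosen orientation band
    dist = np.abs(((angle - center_deg + 90.0) % 180.0) - 90.0)
    keep = dist <= bandwidth_deg
    keep[0, 0] = True  # always keep DC (mean luminance)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * keep))

# A horizontal grating (luminance varies along y) lives at 90 degrees
# in this frequency-plane convention.
y = np.arange(64)[:, None]
grating = np.sin(2 * np.pi * 5 * y / 64) * np.ones((1, 64))
kept = orientation_filter(grating, 90, 20)
removed = orientation_filter(grating, 0, 20)
assert np.allclose(kept, grating, atol=1e-8)
assert float(np.max(np.abs(removed))) < 1e-8
```

Because the kept mask is symmetric under 180° rotation of the frequency plane, the filtered image stays real-valued, which is why the final `np.real` only strips numerical noise.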
for reducing the burden, to Department of Defense, Washington Headquarters Services, Directorate for Information Operations and Reports (0704-0188...which emerges across a wide range of time scales, is often ignored in human–autonomy systems.1 Current technologies to estimate individual humans...actuated Hokuyo LiDAR, and a Garmin GPS), traversed these paths at a speed of about 1.2 m/s while recording all sensor data. Orange soccer balls were
How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.
Sondersorg, Anna Christina; Busse, Daniela; Kyereme, Jessica; Rothermel, Markus; Neufang, Gitta; Gisselmann, Günter; Hatt, Hanns; Conrad, Heike
Trigeminal fibers terminate within the facial mucosa and skin and transmit tactile, proprioceptive, chemical, and nociceptive sensations. Trigeminal sensations can arise from the direct stimulation of intraepithelial free nerve endings or indirectly through information transmission from adjacent cells at the peripheral innervation area. For mechanical and thermal cues, communication processes between skin cells and somatosensory neurons have already been suggested. High concentrations of most odors typically provoke trigeminal sensations in vivo but surprisingly fail to activate trigeminal neuron monocultures. This fact favors the hypothesis that epithelial cells may participate in chemodetection and subsequently transmit signals to neighboring trigeminal fibers. Keratinocytes, the major cell type of the epidermis, express various receptors that enable reactions to multiple environmental stimuli. Here, using a co-culture approach, we show for the first time that exposure to odorant chemicals induces chemical communication between human HaCaT keratinocytes and mouse trigeminal neurons. Moreover, a supernatant analysis of stimulated keratinocytes and subsequent blocking experiments with pyridoxalphosphate-6-azophenyl-2′,4′-disulfonate revealed that ATP serves as the mediating transmitter molecule released from skin cells after odor stimulation. We show that the ATP release resulting from Javanol® stimulation of keratinocytes was mediated by pannexins. Consequently, keratinocytes act as chemosensors linking the environment and the trigeminal system via ATP signaling.
Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K
The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition after which they completed a manipulation check and the initial 15-item questionnaire and again two weeks later. The questionnaire was subjected to factor reliability and validity analyses on both measurement times for purposes of cross-validation of the results. A two-factor solution was observed representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.
Gómez, Jaime; Salazar, Idana; Vargas, Pilar
In this paper we use a panel of manufacturing firms in Spain to examine the extent to which they use internal and external sources of information (customers, suppliers, competitors, consultants and universities) to generate product and process innovation. Our results show that, although internal sources are influential, external sources of information are key to achieve innovation performance. These results are in line with the open innovation literature because they show that firms that are opening up their innovation process and that use different information sources have a greater capacity to generate innovations. We also find that the importance of external sources of information varies depending on the type of innovation (product or process) considered. To generate process innovation, firms mainly rely on suppliers while, to generate product innovation, the main contribution is from customers. The potential simultaneity between product and process innovation is also taken into consideration. We find that the generation of both types of innovation is not independent.
Initiative NYFD New York Fire Department NYPD New York Police Department OLAP On Line Analytics Processing OSINT Open Source Intelligence...Intelligence (OSINT), from public websites, media sources, and other unclassified events and reports. Although some of these sources do not have a direct
Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran
There is a deepening understanding of the effects of sleep on emotional information processing. Emotional information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotional information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments: sleep was assessed in home settings, and facial information processing was assessed at school. Participants were 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave. Waking > 5 min was considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.076, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.
In this paper we examine the domain of information search and propose a "goal-based" approach to studying search strategy. We describe "goal-based information search" using a framework of Knowledge Discovery. We identify two Information Retrieval (IR) goals using the constructs of Knowledge Acquisition (KA) and Knowledge Explanation (KE). We classify these constructs into two specific information problems: an exploration-exploitation problem and an implicit-explicit problem. Our proposed framework is an extension of prior work in this domain, applying an IR Process Model originally developed for Legal-IR and adapted to Medical-IR. The approach in this paper is guided by the recent ACM-SIG Medical Information Retrieval (MedIR) Workshop definition: "methodologies and technologies that seek to improve access to medical information archives via a process of information retrieval."
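The exploration-exploitation problem named in this abstract is the classic trade-off between revisiting known-good information sources and sampling unfamiliar ones. As an illustrative aside (not taken from the paper's IR framework), an epsilon-greedy bandit is the textbook sketch of that trade-off; the two "sources" and their payoff probabilities below are hypothetical:

```python
import random

def epsilon_greedy(arms, n_steps=1000, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit: with probability epsilon (or while some arm
    is untried) explore a random arm, otherwise exploit the arm with the
    best observed mean reward. Each arm is a callable reward(rng)."""
    rng = random.Random(seed)
    n = len(arms)
    counts = [0] * n      # pulls per arm
    totals = [0.0] * n    # cumulative reward per arm
    for _ in range(n_steps):
        if rng.random() < epsilon or 0 in counts:
            arm = rng.randrange(n)                      # explore
        else:
            means = [t / c for t, c in zip(totals, counts)]
            arm = means.index(max(means))               # exploit
        counts[arm] += 1
        totals[arm] += arms[arm](rng)
    return counts, totals

# Two hypothetical "information sources": source 1 pays off more often.
arms = [lambda r: 1.0 if r.random() < 0.3 else 0.0,
        lambda r: 1.0 if r.random() < 0.7 else 0.0]
counts, totals = epsilon_greedy(arms)
```

After a short exploration phase the searcher concentrates its pulls on the better source while still occasionally sampling the worse one, which is the behaviour the exploration-exploitation framing describes.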
After many decades of flourishing computer science, it is now rather evident that in a world dominated by different kinds of digital information, both applications and people are forced to seek new, innovative structures and forms of data management and organization. Following this observation, researchers in informatics have strived in recent years to tackle the non-unique and evolving notion of context, which aids the data disambiguation process significantly. Motivated by this environment, this work attempts to summarize and organize, in a researcher-friendly tabular manner, important or pioneering related research works from diverse computational intelligence domains. Initially, we discuss the influence of context on traditional low-level multimedia content analysis, search, and retrieval tasks, and then we advance to the fields of overall computational context-awareness and the so-called human-generated contextual elements. In an effort to provide meaningful information to fellow researchers, this brief survey focuses on the impact of context on modern and popular computing undertakings of our era. More specifically, we present a short review of visual context modeling methods, followed by a depiction of context-awareness in modern computing. Works dealing with the interpretation of context through human-generated interactions are also discussed herein, as this particular domain gains an ever-increasing proportion of related research nowadays. We conclude the paper with a short discussion on (i) the motivation behind the included context-type categorization into three main pillars; (ii) the findings and conclusions of the survey for each context category; and (iii) a few brief pieces of advice derived from the survey for both interested developers and fellow researchers.
Yu, Yang; Krishnamachari, Bhaskar
This book presents state-of-the-art cross-layer optimization techniques for energy-efficient information processing and routing in wireless sensor networks. Besides providing a survey on this important research area, three specific topics are discussed in detail - information processing in a collocated cluster, information transport over a tree substrate, and information routing for computationally intensive applications. The book covers several important system knobs for cross-layer optimization, including voltage scaling, rate adaptation, and tunable compression. By exploring tradeoffs of en
Sophia R. Sklan
Phonons, the quanta of mechanical vibration, are important to the transport of heat and sound in solid materials. Recent advances in the fundamental control of phonons (phononics) have brought into prominence the potential role of phonons in information processing. In this review, the many directions of realizing phononic computing and information processing are examined. Given the relative similarity of vibrational transport at different length scales, the related fields of acoustic, phononic, and thermal information processing are all included, as are quantum and classical computer implementations. Connections are made between the fundamental questions in phonon transport and phononic control and the device-level approach to diodes, transistors, memory, and logic.
Bazua Rueda, L.F.
The author worked first at the National Institute of Nuclear Energy (Mexico) and then at URAMEX (Uranio Mexicano) from 1975 to 1983, assigned to radiometric and magnetometric aerial prospecting projects on aspects of computerized information processing. During this period the author participated in the development of computing systems, information processing, and the definition of mathematical procedures for the geophysical reduction of the calibration equipment data. Drawing on this accumulated experience, this thesis presents aspects concerning the management and operation of computerized information processing systems. Operation handbooks for the majority of modules are presented. Program listings are not included. (Author)
Joan C. Durrance
Introduction. This article results from a qualitative study of (1) information behaviour in community problem-solving, framed as a distributed information use environment, and (2) approaches used by a best-practice library to anticipate information needs associated with community problem-solving. Method. Several approaches to data collection were used: focus groups, interviews, observation of community and library meetings, and analysis of supporting documents. We focused first on the information behaviour of community groups. Finding that the library supported these activities, we sought to understand its approach. Analysis. Data were coded thematically for both information behaviour concepts and themes germane to problem-solving activity. A grounded theory approach was taken to capture aspects of the library staff's practice. Themes evolved from the data; supporting documentation (reports, articles, and library communication) was also coded. Results. The study showed (1) how information use environment components (people, setting, problems, problem resolutions) combine in this distributed information use environment to determine specific information needs and uses; and (2) how the library contributed to the viability of this distributed information use environment. Conclusion. Community problem-solving, here explicated as a distributed IUE, is likely to be seen in multiple communities. The library model presented demonstrates that by reshaping its information practice within the framework of an information use environment, a library can anticipate community information needs as they are generated and where they are most relevant.
... and the controller-decisionmaker. The control policy used is decentralised, with control decisions made independently by each node using the estimates ... approach are: the use of microcomputer systems, which provide a cost-effective solution for data processing; reliability; survivability; local autonomy ... reflected energy from the aircraft to provide a radar echo, but is making a full-blooded reply itself; this enables the transmitters on the ...
Redwood, Yanique; Schulz, Amy J; Israel, Barbara A; Yoshihama, Mieko; Wang, Caroline C; Kreuter, Marshall
Growing evidence suggests that the built environment features found in many high-poverty urban areas contribute to negative health outcomes. Both built environment hazards and negative health outcomes disproportionately affect poor people of color. We used community-based participatory research and Photovoice in inner-city Atlanta to elicit African Americans' perspectives on their health priorities. The built environment emerged as a critical factor, impacting physical and mental health outcomes. We offer a conceptual model, informed by residents' perspectives, linking social, economic, and political processes to built environment and health inequities. Research, practice, and policy implications are discussed within an environmental justice framework.
This paper presents and discusses a survey that describes how small and medium enterprises (SMEs) implement and use their information systems with respect to their logistic and production processes. The study first describes the rationale of the research, then identifies the characteristics of the companies and their general attitude towards information technology (IT). In the following section the paper presents a set of detailed processes to verify the structure and workflow of companies and how IT supports their processes. In the last part we study the influence of certain company characteristics on the effective use of processes and on different technological approaches to supporting the defined logistic and production processes. The novelty and interest of the study, in academic and institutional contexts as well as in the real world, reside in the opportunity to verify and understand the different attitudes of SMEs towards information technology in defining, organizing, planning, and controlling their processes.
Anderson, Ashley A.; Brossard, Dominique; Scheufele, Dietram A.
The shift toward online communication in all realms, from print newspapers to broadcast television, has implications for how the general public consumes information about nanotechnology. The goal of this study is threefold: to investigate who is using online sources for information and news about science and nanotechnology, to examine what the general public is searching for online with regard to nanotechnology, and to analyze what they find in online content about nanotechnology. Using survey data, we find those who report the Internet as their primary source of science and technology news are diverse in age, more knowledgeable about science and nanotechnology, highly educated, male, and more diverse racially than users of other media. In a comparison of demographic data on actual visits by online users to general news and science Web sites, science sites attracted more male, non-white users from the Western region of the United States than news sites did. News sites, on the other hand, attracted those with a slightly higher level of education. Our analysis of published estimates of keyword searches on nanotechnology reveals people are turning to the Internet with keyword searches related to the future, health, and applications of nanotechnology. A content analysis of online content reveals health content dominates overall. Comparisons of content in different types of sites (blogs, government, and general sites) are conducted.
Kasim, Shahreen; Hafit, Hanayanti; Yee, Ng Peng; Hashim, Rathiah; Ruslai, Husni; Jahidin, Kamaruzzaman; Syafwan Arshad, Mohammad
Crime Map is an online web-based geographical information system that helps the public and users visualize crime activities geographically. It acts as a platform for public communities to share crime activities they have encountered. Crime and violence plague the communities we live in; as part of the community, crime prevention is everyone's responsibility. The purpose of Crime Map is to provide insights into the crimes occurring around Malaysia and to raise the public's awareness of crime activities in their neighbourhood. For that, Crime Map visualizes crime activities on geographical heat maps generated from geospatial data, and analyses data obtained from crime reports to generate useful information on crime trends. At the end of the development, users should be able to use the system to access details of reported crimes and crime analyses, and to report crime activities. The development of Crime Map also enables the public to obtain insights about crime activities in their area, and thus to work together with law enforcement to prevent and fight crime.
This revision of Energy Technologies and the Environment reflects the changes in energy supply and demand, focus of environmental concern, and emphasis of energy research and development that have occurred since publication of the earlier edition in 1980. The increase in availability of oil and natural gas, at least for the near term, is responsible in part for a reduced emphasis on development of replacement fuels and technologies. Trends in energy development also have been influenced by an increased reliance on private industry initiatives, and a correspondingly reduced government involvement, in demonstrating more developed technologies. Environmental concerns related to acid rain and waste management continue to increase the demand for development of innovative energy systems. The basic criteria for including a technology in this report are that (1) the technology is a major current or potential future energy supply and (2) significant changes in employing or understanding the technology have occurred since publication of the 1980 edition. Coal is seen to be a continuing major source of energy supply, and thus chapters pertaining to the principal coal technologies have been revised from the 1980 edition (those on coal mining and preparation, conventional coal-fired power plants, fluidized-bed combustion, coal gasification, and coal liquefaction) or added as necessary to include emerging technologies (those on oil shale, combined-cycle power plants, coal-liquid mixtures, and fuel cells).
Kozyrovska N. O.
Plants are heavily populated by pro- and eukaryotic microorganisms and therefore represent tremendous complexity as a biological system. This system exists as an information-processing entity with rather complex processes of communication occurring throughout the individual plant. The plant cellular information-processing network constitutes the foundation for processes like growth, defense, and adaptation to the environment. To date, the molecular mechanisms underlying perception, transfer, analysis, and storage of endogenous and environmental information within the plant remain to be fully understood. The associated microorganisms and their investment in information conditioning are often ignored. Endophytes as plant partners are an indispensable, integrative part of the plant system. Diverse endophytic microorganisms comprise the «normal» microbiota that plays a role in plant immunity and helps the plant system survive in the environment (providing assistance in defense, nutrition, detoxification, etc.). The role of endophytic microbiota in the processing of information may be presumed, taking into account plant-microbial co-evolution and empirical data. Since literature is only beginning to emerge on this topic, in this article I review key works in the field of plant-endophyte interactions in the context of information processing and present an opinion on their putative role in the plant information web under defense and adaptation to changed conditions.
Serghey A. Amelkin
The finite-time approach allows one to optimize regimes of processes in macrosystems when the duration of the processes is restricted. The driving force of the processes is a difference of intensive variables: temperatures in thermodynamics, values in economics, etc. In microeconomic systems two counterflow fluxes appear due to a single driving force: the fluxes of goods and money. Another possible case is two fluxes with the same direction. The processes of information exchange can be described by this formalism.
Geng, ZhiQiang; Dong, JunGen; Han, YongMing; Zhu, QunXiong
Highlights: •An improved environment DEA cross-model method is proposed. •An energy and environment efficiency analysis framework for complex chemical processes is obtained. •The proposed method is effective for energy saving and emission reduction in complex chemical processes. -- Abstract: The complex chemical process is a high-pollution, high-energy-consumption industrial process. Therefore, it is very important to analyze and evaluate the energy and environment efficiency of the complex chemical process. Data Envelopment Analysis (DEA) is used to evaluate the relative effectiveness of decision-making units (DMUs). However, the traditional DEA method usually cannot genuinely distinguish effective from inefficient DMUs due to its extreme or unreasonable weight distribution of input and output variables. Therefore, this paper proposes an energy and environment efficiency analysis method based on an improved environment DEA cross-model (DEACM) method. The inputs of the complex chemical process are divided into energy and non-energy inputs, while the outputs are divided into desirable and undesirable outputs. The energy and environment performance index (EEPI), based on cross evaluation, is used to represent the overall performance of each DMU. Moreover, the improvement direction for energy saving and carbon emission reduction of each inefficient DMU is obtained quantitatively from the self-evaluation model of the improved environment DEACM. Analyzing the energy and environment efficiency of the ethylene production process shows that the improved environment DEACM method discriminates effective DMUs better than the original DEA method, and that it can identify the energy-saving and carbon-emission-reduction potential of ethylene plants, especially the improvement direction of inefficient DMUs.
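For readers unfamiliar with DEA cross-evaluation, the abstract's method builds on two standard components: the CCR multiplier model (one linear program per DMU) and a cross-efficiency matrix in which each DMU is also scored with every other DMU's optimal weights. The sketch below is a generic illustration of those building blocks only; it is not the authors' improved environment DEACM, omits the desirable/undesirable output split, and assumes SciPy's `linprog` is available.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_weights(X, Y, k):
    """Input-oriented CCR multiplier model for DMU k.
    X: (n_dmu, n_inputs) inputs, Y: (n_dmu, n_outputs) outputs.
    Maximize u.y_k subject to v.x_k = 1 and u.y_j - v.x_j <= 0 for all j."""
    n, m_in = X.shape
    m_out = Y.shape[1]
    c = np.concatenate([-Y[k], np.zeros(m_in)])              # linprog minimizes
    A_eq = np.concatenate([np.zeros(m_out), X[k]])[None, :]  # v.x_k = 1
    A_ub = np.hstack([Y, -X])                                # u.y_j - v.x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    u, v = res.x[:m_out], res.x[m_out:]
    return -res.fun, u, v

def cross_efficiency(X, Y):
    """Cross-efficiency matrix E[k, j]: DMU j scored with DMU k's optimal
    weights. The diagonal holds each DMU's self-evaluated CCR efficiency;
    column means give a cross-evaluated performance index per DMU."""
    n = X.shape[0]
    E = np.zeros((n, n))
    for k in range(n):
        _, u, v = dea_ccr_weights(X, Y, k)
        E[k] = (Y @ u) / (X @ v)
    return E

# Toy data: one input, one output; DMU 0 has the best output/input ratio.
X = np.array([[2.0], [4.0], [8.0]])
Y = np.ones((3, 1))
E = cross_efficiency(X, Y)
```

Averaging the columns of `E` (rather than reading only the diagonal) is what lets cross-evaluation rank DMUs that ordinary DEA would all label "efficient", which is the discrimination problem the abstract raises.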
Through the application of specialized systems, future-oriented information processing integrates the sciences of processes, control systems, process control strategies, user behaviour, and ergonomics. Improvements in process control can be attained, inter alia, by preparing the information content (e.g. by suppressing the raw flow of signals and replacing it with signals based on substance) and also by an ergonomic presentation of the process. (orig.) [de
... Activities: Application To Use the Automated Commercial Environment (ACE) AGENCY: U.S. Customs and Border... Commercial Environment (ACE). This request for comment is being made pursuant to the Paperwork Reduction Act... Number: None. Abstract: The Automated Commercial Environment (ACE) is a trade processing system that will...
Zhang Liangju; Zhang Youhua; Liu Xu; An Zhencai; Li Baoxiang
The computer-based process information system has effectively improved the interface between operators and the reactor, and has been successfully used in the reactor operation environment. This article presents the design strategy, the functions realized in the system, and some advanced techniques used in system construction and software development.
Weber, Barbara; Rinderle, S.B.; Reichert, M.U.
In today's dynamic business world the economic success of an enterprise increasingly depends on its ability to react to changes in its environment in a quick and flexible way. Process-aware information systems (PAIS) offer promising perspectives in this respect and are increasingly employed for
Rational models in decision processes are marked by many anomalies caused by behavioral issues. We point out the importance of information in causing inconsistent preferences in a decision process. In a single- or multi-agent decision process, each mental model is influenced by the presence, absence, or falsity of information about the problem or about other members of the decision-making group. The difficulty of modeling these effects increases because behavioral biases also influence the modeler. Behavioral Operational Research (BOR) studies these influences to create efficient models for defining choices in similar decision processes.
Shabtai, Itamar; Leshno, Moshe; Blondheim, Orna; Kornbluth, Jonathan
Given the ever-growing importance and usability of medical information systems, the healthcare sector has been investing heavily in them in recent years, as part of the effort to improve medical decision-making and increase its efficiency through improved medical processes, reduced costs, integration of patients' data, etc. In light of these developments, this research aims to evaluate the contribution of information technology (IT) to improving the medical decision-making processes at the point of care of internal medicine and surgical departments, and to evaluate the degree to which IT investments are worthwhile. This has been done by assessing the value of information to decision-makers (physicians) at the point of care, investigating whether the information systems improved the medical outcomes. The research included three steps (after a pilot study): the assessment of the subjective value of information, the assessment of the realistic value of information, and the assessment of the normative value of information, with the results of each step used as the starting assumptions for the following steps. Following a discussion and integration of the results from the various steps, the results of the three assessment stages were summarized in a cost-effectiveness analysis and an overall return on investment (ROI) analysis. In addition, we suggest IT strategies for decision-makers in the healthcare sector on the advisability of implementing such systems, as well as the implications for managing them. This research is pioneering in the manner in which it combines an assessment of the three kinds of measures of value of information in the healthcare environment. Our aim in performing it was to contribute to researchers (by providing additional insight into the fields of decision theory, value of information and medical informatics, amongst others), practitioners (by promoting efficiency in the design of new medical IS and improving existing IS), physicians
Université de Genève
Geneva University Physics Department 24, Quai Ernest Ansermet CH-1211 Geneva 4 Monday 11 April 2011 17h00 - Ecole de Physique, Auditoire Stückelberg The optical route to quantum information processing Prof. Terry Rudolph/Imperial College, London Photons are attractive as carriers of quantum information both because they travel, and can thus transmit information, but also because of their good coherence properties and ease in undergoing single-qubit manipulations. The main obstacle to their use in information processing is inducing an effective interaction between them in order to produce entanglement. The most promising approach in photon-based information processing architectures is so-called measurement-based quantum computing. This relies on creating upfront a multi-qubit highly entangled state (the cluster state) which has the remarkable property that, once prepared, it can be used to perform quantum computation by making only single qubit measurements. In this talk I will discuss generically the...
Parker, Jonathan K.
Discusses the benefits of lecturing, when done properly, in high schools. Describes the positive attributes of effective lecturers. Provides a human information-processing model applicable to the task of lecturing to students. (HB)
Bliss-Moreau, Eliza; Moadab, Gilda; Machado, Christopher J
Despite evolutionary claims about the function of facial behaviors across phylogeny, rarely are those hypotheses tested in a comparative context, that is, by evaluating how nonhuman animals process such behaviors. Further, while increasing evidence indicates that humans make meaning of faces by integrating contextual information, including that from the body, the extent to which nonhuman animals process contextual information during affective displays is unknown. In the present study, we evaluated the extent to which rhesus macaques (Macaca mulatta) process dynamic affective displays of conspecifics that included both facial and body behaviors. Contrary to hypotheses that they would preferentially attend to faces during affective displays, monkeys looked longest, most frequently, and first at conspecifics' bodies rather than their heads. These findings indicate that macaques, like humans, attend to available contextual information during the processing of affective displays, and that the body may also provide unique information about affective states. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
The performance of present-day informational technologies has two main properties: the universality of the structures used and the flexibility of the final user's interfaces. The first determines the potential coverage of the informational domain. The second determines the diversity and efficiency of the processing methods for the procedures being automated. These aspects are of great importance in agriculture and ecology, where the processes are complex and the volumes of information used are considerable. For example, meteorological processes are part of ecological ones, acting as existential conditions for habitats, and are known as a complex prognostic problem that requires considerable computational resources to solve the appropriate equations. Likewise, agriculture, as a controlled activity under strong impact from natural conditions, has the same high requirements for diverse structures and flexible information processing.
Rawl, Ruth K.; O'Tuel, Frances S.
The basic assumptions of information processing theories in cognitive psychology are reviewed, and the application of this approach to problem solving in gifted education is considered. Specific implications are cited on problem selection and instruction giving. (CL)
Cash, Philip; Kreye, Melanie
... suggestions for improvements and support. One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare whether the new knowledge they have gained covers the previous knowledge gap. ... the new knowledge is shared between the design team to reduce ambiguity with regard to its meaning and to build a shared understanding, reducing perceived uncertainty. Thus, we propose that Information-Processing Theory is suitable to describe designer activity in the early design stages ... In engineering design, uncertainty plays a key role, particularly in the early design stages, which has been ...
Wollstadt, Patricia; Sellers, Kristin K; Rudelt, Lucas; Priesemann, Viola; Hutt, Axel; Fröhlich, Flavio; Wibral, Michael
The disruption of coupling between brain areas has been suggested as the mechanism underlying loss of consciousness in anesthesia. This hypothesis has been tested previously by measuring the information transfer between brain areas, and by taking reduced information transfer as a proxy for decoupling. Yet, information transfer is a function of the amount of information available in the information source-such that transfer decreases even for unchanged coupling when less source information is available. Therefore, we reconsidered past interpretations of reduced information transfer as a sign of decoupling, and asked whether impaired local information processing leads to a loss of information transfer. An important prediction of this alternative hypothesis is that changes in locally available information (signal entropy) should be at least as pronounced as changes in information transfer. We tested this prediction by recording local field potentials in two ferrets after administration of isoflurane in concentrations of 0.0%, 0.5%, and 1.0%. We found strong decreases in the source entropy under isoflurane in area V1 and the prefrontal cortex (PFC)-as predicted by our alternative hypothesis. The decrease in source entropy was stronger in PFC compared to V1. Information transfer between V1 and PFC was reduced bidirectionally, but with a stronger decrease from PFC to V1. This links the stronger decrease in information transfer to the stronger decrease in source entropy-suggesting reduced source entropy reduces information transfer. This conclusion fits the observation that the synaptic targets of isoflurane are located in local cortical circuits rather than on the synapses formed by interareal axonal projections. Thus, changes in information transfer under isoflurane seem to be a consequence of changes in local processing more than of decoupling between brain areas. We suggest that source entropy changes must be considered whenever interpreting changes in information
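The two quantities this study compares, locally available information (signal entropy) and information transfer, can be illustrated with simple plug-in estimators on discretized signals. The sketch below is a generic history-length-1 transfer entropy estimator, not the authors' analysis pipeline; real LFP analyses involve embedding, binning, and bias-correction choices that are omitted here.

```python
import numpy as np
from collections import Counter

def entropy(xs):
    """Plug-in Shannon entropy (bits) of a discrete sequence."""
    counts = Counter(xs)
    n = len(xs)
    return -sum((c / n) * np.log2(c / n) for c in counts.values())

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits, history length 1:
    TE = H(Y_next | Y_past) - H(Y_next | Y_past, X_past),
    with each conditional entropy computed as H(A, B) - H(B)."""
    x, y = list(x), list(y)
    trip = list(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_past, x_past)
    h_cond_y = entropy(list(zip(y[1:], y[:-1]))) - entropy(y[:-1])
    h_cond_yx = entropy(trip) - entropy(list(zip(y[:-1], x[:-1])))
    return h_cond_y - h_cond_yx

# Toy check: y is x delayed by one step, so x's past fully determines
# y's next value and TE(X -> Y) should approach 1 bit for a fair coin x.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 2000)
y = np.roll(x, 1)
```

The abstract's point can be read directly off these formulas: TE is bounded by the entropy of the source terms, so if anesthesia lowers the source's signal entropy, measured TE drops even when the coupling itself is unchanged.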
Cheverie, Joan F
This insightful book explores the challenging issues related to effective access to government information. Amidst the chaos of today's dynamic information transition period, the only constants related to government information are change and inconsistency; yet with Government Information Collections in the Networked Environment: New Issues and Models, you will overcome these challenging issues and take advantage of the opportunities that networked government information collections have to offer. This valuable book gives you a fresh opportunity to rethink collecting activities and to
Thwaits, Anne Y.
United States. This dissertation presents an account of the history of the institution and the continuing legacy of the early Exploratorium and its founder, Frank Oppenheimer. I argue that the institution is an early example of a constructivist learning museum. I then describe how art encourages learning in the museum. It provides means of presenting information that engage all of the senses and encourage emotional involvement. It reframes familiar sights so that viewers look more closely in search of recognition, and it presents intangible or dematerialized things in a tangible way. It facilitates play, with its many benefits. It brings fresh perspectives and processes to problem solving and the acquisition of new knowledge. This project is the study of an institution where art and science have always coexisted with equal importance, setting it apart from more traditional museums where art was added as a secondary focus to the original disciplinary concentration of the institution. Many of the exhibits were created by artists, but the real value the visual arts bring to the museum is in its contributions to processes such as inquiry, play, problem-solving, and innovation.
Tække, Jesper; Paulsen, Michael
This paper is about challenges to steering and leadership of educational interaction in classrooms posed by the new medium environment that comes with digital media. In the new medium environment, the old way of steering what is going on in the classroom appears not to work since… by systems theory we outline a more adequate way of teaching in the new medium environment: a teaching that can manage the new situation and use the new possibilities provided by the digital media. The argumentation builds on empirical findings from the action research project Socio Media Education (SME)…
Kilpinen, R; Saunamäki, T; Jehkonen, M
To provide a comprehensive review of studies on information processing speed in patients with obstructive sleep apnea syndrome (OSAS) as compared to healthy controls and normative data, and to determine whether continuous positive airway pressure (CPAP) treatment improves information processing speed. A systematic review was performed on studies drawn from Medline and PsycINFO (January 1990-December 2011) and identified from lists of references in these studies. After applying inclusion criteria, 159 articles were left for abstract review, and after applying exclusion criteria 44 articles were fully reviewed. The number of patients in the studies reviewed ranged from 10 to 157, and the study samples consisted mainly of men. Half of the studies reported that patients with OSAS showed reduced information processing speed when compared to healthy controls. Reduced information processing speed was seen more often (75%) when compared to norm-referenced data. Psychomotor speed seemed to be particularly liable to change. CPAP treatment improved processing speed, but the improvement was marginal when compared to placebo or conservative treatment. Patients with OSAS are affected by reduced information processing speed, which may persist despite CPAP treatment. Information processing is usually assessed as part of other cognitive functioning, not as a cognitive domain per se. However, it is important to take account of information processing speed when assessing other aspects of cognitive functioning. This will make it possible to determine whether cognitive decline in patients with OSAS is based on lower-level or higher-level cognitive processes or both. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Wolleat, Patricia L.
Information processing theory could be made more sensitive to differences in career outcomes for males and females by (1) examining the nature of the career decision; (2) expanding the notion of information; (3) relating the vocational schema to the gender schema; and (4) noting whether variables are general, sex related, or sex specific. (SK)
The positioning process of marketing used by special libraries and information centers involves two key decisions from which other decisions are derived: to which user groups marketing programs and services will be directed; and which information needs will be served. Two cases are discussed and a bibliography is provided. (EJS)
Laudato, Nicholas C.; DeSantis, Dennis J.
The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…
Оксана Николаевна Ромашкова
This work concerns the information model of an educational complex that includes several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is considered and a matrix management structure is suggested. The basic management information processes of the educational complex were conceptualized.
Apikyan, S.; Yerznkanyan, K.; Diamond, D.; Vardanyan, M.; Sevikyan, G.
The successful implementation of the NATO-ASTEC MATRIX project in Armenia is an essential contribution to security, stability and solidarity among regional nations, applying the best technical expertise to problem solving. Collaboration, networking and capacity-building are the means used to accomplish these goals. A further aim is to promote co-operation with new partners; ASTEC creates links between scientists and organizations in formerly separated communities, develops a new strategy concentrating support on security-related collaborative projects, and seeks answers to critical questions and a way of connecting nations. Within Armenia, the NATO-ASTEC MATRIX project leads to a network of high-standard laboratories that will drastically improve the overview and the technical infrastructure for monitoring, accounting and control of CBRN materials in Armenia. This new infrastructure will enhance the exchange of information on this vital issue via the IRIS. In follow-up phases, it will also help to better define the needs and requirements for a policy to enhance legal tools for the management of these materials, and for the creation of one or several agencies dealing with wastes or no-longer-useful materials containing CBRN components in Armenia.
Successful processing of quantum information is, to a large degree, based on two aspects: (a) the implementation of high-fidelity quantum gates, and (b) avoiding or suppressing decoherence processes that destroy quantum information. The presented work shows our progress in the field of experimental quantum information processing over the last years: the implementation and characterisation of several quantum operations, amongst others the first realisation of the quantum Toffoli gate in an ion-trap based quantum computer. The creation of entangled states with up to 14 qubits serves as the basis for investigations of decoherence processes. Based on the realised quantum operations as well as the knowledge about dominant noise processes in the employed apparatus, entanglement swapping as well as quantum operations within a decoherence-free subspace are demonstrated. (author)
Detecting, investigating and prosecuting cybercrime? Extremely important, but not really the solution to the problem. Prevention is better! The sectors that have joined the Cybercrime Information Exchange have accepted the challenge of ensuring the effectiveness of the (information) security of process control systems (PCS), including SCADA. This publication makes it clear why it is vital that organizations establish and maintain control over the security of the information and communication…
Weber, Darren L
This review considers theory and evidence for abnormal information processing in post-traumatic stress disorder (PTSD). Cognitive studies have indicated a sensitivity in PTSD to traumatic information, more so than to general emotional information. These findings were supported by neuroimaging studies that identify increased brain activity during traumatic cognition, especially in affective networks (including the amygdala, orbitofrontal and anterior cingulate cortex). In theory, it is proposed that…
Ross, M. D.
Study of montages, tracings and reconstructions prepared from a series of 570 consecutive ultrathin sections shows that rat maculas are morphologically organized for parallel processing of linear acceleratory information. Type II cells of one terminal field distribute information to neighboring terminals as well. The findings are examined in light of physiological data which indicate that macular receptor fields have a preferred directional vector, and are interpreted by analogy to a computer technology known as an information network.
Zwolak, Michael; Quan, Haitao; Zurek, Wojciech
Quantum Darwinism provides an information-theoretic framework for the emergence of the classical world from the quantum substrate. It recognizes that we, the observers, acquire our information about the "systems of interest" indirectly from their imprints on the environment. Objectivity, a key property of the classical world, arises via the proliferation of redundant information into the environment, where many observers can then intercept it and independently determine the state of the system. While causing a system to decohere, environments that remain nearly invariant under the Hamiltonian dynamics, such as very mixed states, have a diminished ability to transmit information about the system, yet can still acquire redundant information about the system [1,2]. Our results show that Quantum Darwinism is robust with respect to non-ideal initial states of the environment. This research is supported by the U.S. Department of Energy through the LANL/LDRD Program.
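The redundancy at the heart of this framework can be caricatured classically. The following is a hedged toy model, not a quantum simulation: one "system" bit is imprinted on many "environment" bits, so even a small environment fragment reveals the system's state, while a noisier (more mixed) environment transmits less. All names and parameters are illustrative assumptions, not quantities from the reported research.

```python
# Classical toy model of redundant imprinting on an environment.
import random
from collections import Counter
from math import log2

def H(samples):
    """Shannon entropy of a sequence of hashable outcomes, in bits."""
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from empirical frequencies."""
    return H(xs) + H(ys) - H(list(zip(xs, ys)))

random.seed(1)
system = [random.randint(0, 1) for _ in range(5000)]

# Perfect imprinting: every environment bit is a copy, so reading any single
# fragment already yields all the information about the system (redundancy).
fragment = list(system)
# A "mixed" environment: copies flipped with probability 0.3 carry less.
noisy = [s ^ (random.random() < 0.3) for s in system]

print(mutual_information(system, fragment) == H(system))   # True: full info
print(mutual_information(system, noisy) < H(system))       # True: degraded
```

The contrast between the two fragments mirrors the abstract's point: a nearly invariant or very mixed environment still records the system, but its capacity to transmit that record to observers is diminished.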
Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein
Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184
Marcus, M.A.; Wang, A.
This volume contains 35 papers presented at the symposium. Some of the topics covered are: sensors for the energy industry; sensors for materials evaluation and structural monitoring; sensors for the engine industry; and sensors for other harsh environments.
… businessman with experience in the film and radio broadcasting industries who had also been an overseas consultant to the Voice of America. The new USIA… [Abbreviations recoverable from the document's front matter: HASC, House Armed Services Committee; HR, House Resolution; IE, information environment; INC, Information and Censorship Section; USIS, United States Information Service; USSOCOM, United States Special Operations Command; VOA, Voice of America]
Askola, Kreetta; Atsushi, Toshimori; Huotari, Maija-Leena
Introduction: The aim of this study was to identify cultural differences in the information environment and information practices, namely active seeking and encountering, of web-based health information between Finnish and Japanese university students. Method: The data were gathered with a Web-based survey among first-year university students at…
Olney, Cynthia A
After arguing that most community-based organizations (CBOs) function as complex adaptive systems, this white paper describes the evaluation goals, questions, indicators, and methods most important at different stages of community-based health information outreach. This paper presents the basic characteristics of complex adaptive systems and argues that the typical CBO can be considered this type of system. It then presents evaluation as a tool for helping outreach teams adapt their outreach efforts to the CBO environment and thus maximize success. Finally, it describes the goals, questions, indicators, and methods most important or helpful at each stage of evaluation (community assessment, needs assessment and planning, process evaluation, and outcomes assessment). Literature from complex adaptive systems as applied to health care, business, and evaluation settings is presented. Evaluation models and applications, particularly those based on participatory approaches, are presented as methods for maximizing the effectiveness of evaluation in dynamic CBO environments. If one accepts that CBOs function as complex adaptive systems, characterized by dynamic relationships among many agents, influences, and forces, then effective evaluation at the stages of community assessment, needs assessment and planning, process evaluation, and outcomes assessment is critical to outreach success.
This paper analyzes the quality of information management, identifying deficiencies in the information systems used in the negotiation process for concession of bank credit to small and mid-sized companies, from the perspective of business managers. The results make the deficiencies evident and confirm the need for change in the information management systems, in order to allow both an improvement in the bank credit negotiation process and greater economic efficiency in the use of the available resources.
In essence, the process of maintaining equipment is a support process, because it indirectly contributes to the operational ability of the production process necessary for the supply chain of new value. Taking into account increased levels of automatization and quality, this process becomes more and more significant and, for some branches of industry, even crucial. Because the quality of the entire production process depends more and more on the maintenance process, maintenance processes must be carefully designed and effectively implemented. Various techniques and approaches are at our disposal, such as technical and logistical ones and the intensive application of information-communication technologies. This last approach is presented in this work. It begins with organizational goals, especially quality objectives. Then, maintenance processes and integrated information system structures are defined. Maintenance process quality and improvement processes are defined using a set of performance measures, with a special emphasis placed on effectiveness and quality economics. At the end of the work, an information system for improving maintenance economics is structured. Besides theoretical analysis, the work also presents results the authors obtained analyzing the food industry, the metal processing industry and the building materials industry.
The paper presents an overview of some problems of information science which are explicitly portrayed in the literature. It covers the following issues: information explosion, information flood and data deluge, information retrieval and relevance of information, and, finally, the problem of scientific communication. The purpose of this paper is to explain why knowledge acquisition can be considered an issue in information science. The existing theoretical foundation within the information sciences, i.e. the DIKW hierarchy and its key concepts (data, information, knowledge and wisdom), is recognized as a symbolic representation as well as the theoretical foundation of the knowledge acquisition process. Moreover, it seems that the relationship between the DIKW hierarchy and the knowledge acquisition process is essential for a stronger foundation of information science in the 'body' of overall human knowledge. In addition, the history of both human and machine knowledge acquisition is considered, as well as a proposal that the DIKW hierarchy serve as a symbol of the general knowledge acquisition process, relating equally to both human and machine knowledge acquisition. To achieve this goal, it is necessary to modify the existing concept of the DIKW hierarchy. An appropriate modification of the DIKW hierarchy (one of which is presented in this paper) could result in a much more solid theoretical foundation of the knowledge acquisition process and of information science as a whole. The theoretical assumptions on which the knowledge acquisition process may be established as a problem of information science are presented at the end of the paper. The knowledge acquisition process does not necessarily have to be the subject of epistemology. It may establish a stronger link between the concepts of data and knowledge; furthermore, it can be used in the context of scientific research, but on a more primitive level than conducting…
Rose, Susan A.; Feldman, Judith F.; Jankowski, Jeffery J.
This study examined the relation of 3-year core information-processing abilities to lexical growth and development. The core abilities covered four domains--memory, representational competence (cross-modal transfer), processing speed, and attention. Lexical proficiency was assessed at 3 and 13 years with the Peabody Picture Vocabulary Test (PPVT)…
Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander
People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…
Beer, Randall D.; Williams, Paul L.
There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we…
Fulk, Janet; And Others
Presents a model to examine how social influence processes affect individuals' attitudes toward communication media and media use behavior, integrating two research areas: media use patterns as the outcome of objectively rational choices and social information processing theory. Asserts (in a synthesis) that media characteristics and attitudes are…
Romantsov, V. A.; Dyubkin, I. A.; Klyukbin, L. N.
An automated system for primary and scientific analysis of deep water hydrological information is presented. Primary processing of the data in this system is carried out on a drifting station, which also calculates the parameters of vertical stability of the sea layers, as well as their depths and altitudes. Methods of processing the raw data are described.
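One standard "parameter of vertical stability" such a system might compute from a hydrological profile is the buoyancy (Brunt-Väisälä) frequency. A hedged sketch follows; the paper does not specify its formulas, and the density profile below is invented for illustration:

```python
# Sketch: static stability of sea layers via the squared buoyancy frequency,
# N^2 = -(g / rho0) * d(rho)/dz (z positive upward). Illustrative values only.
G = 9.81         # gravitational acceleration, m/s^2
RHO0 = 1025.0    # reference seawater density, kg/m^3

def buoyancy_frequency_sq(depths_m, densities):
    """N^2 for each layer between adjacent samples; positive N^2 means the
    layer is statically stable. depths_m increase downward, so
    d(rho)/dz = -d(rho)/d(depth)."""
    out = []
    for i in range(len(depths_m) - 1):
        drho_ddepth = (densities[i + 1] - densities[i]) / (depths_m[i + 1] - depths_m[i])
        out.append(G / RHO0 * drho_ddepth)  # = -(G/RHO0) * d(rho)/dz
    return out

depths = [0.0, 50.0, 100.0, 200.0]        # m, increasing downward
rho = [1024.0, 1025.5, 1026.0, 1026.8]    # kg/m^3, increasing with depth
n2 = buoyancy_frequency_sq(depths, rho)
print(all(v > 0 for v in n2))  # True: density increases downward -> stable
```

A shipboard or drifting-station system would apply the same per-layer differencing to each measured density (or temperature-salinity derived density) profile.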
Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…
Lee, Jee Hoh; Kim, Tae Hwan; Choi, Kwang; Chung, Hyun Sook; Keum, Jong Yong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
This project aims to establish a high-quality information circulation system by developing a serials-control system that improves serials management from ordering to distribution and availability for R and D, and to advance the quality of the information service needed in R and D through fast retrieval and provision of research information with CD-Net. The results of the project are as follows. 1. The serials management process, which covers ordering through distribution, gains higher efficiency from the development of the subscription information system. 2. Systematic control of each issue of serials is achieved by the development of the serials checking system. 3. It is possible to provide volume and number information of currently received issues to researchers promptly owing to the improvement of the serials holding information system. 4. Retrieval of research information contained in various CD-ROM databases throughout KAERI-NET is made possible by research on construction methods for CD-Net. 2 figs, 25 refs. (Author).
The article examines the theoretical bases of electrocoagulation of admixtures in a water technological environment using the theory of active collisions, based on the results of the performed research and an analysis of the scientific information. Applying the theory of active collisions to coagulation provides high efficiency of the process of extracting admixtures from water environments while minimizing energy consumption and material expenses.
The information technology (IT) outsourcing relationship is one of the key issues in IT outsourcing success. To explore how to manage and promote IT outsourcing relationships, it is necessary to understand their evolution process. Firstly, the types of IT outsourcing based on relationship quality and IT outsourcing project level are analyzed; secondly, two evolution process models of the IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT outsourcing relationship evolution process is indicated; finally, an IT outsourcing relationship evolution process model is developed, and the development of the IT outsourcing relationship from low to high under internal and external forces is explained.
Đurović Aleksandar M.
This paper examines the role of information and communication technologies in reengineering processes. A general analysis of a process will show that information and communication technologies improve its efficiency. A reengineering model based on the BPMN 2.0 standard will be applied to the process by which students from the Faculty of Transport and Traffic Engineering seek internships/jobs. After defining the technical characteristics and required functionalities, the paper proposes a web/mobile application enabling better visibility of traffic engineers to companies seeking that education profile.
The new technology revolution based on the Internet and information and communication technology has triggered an upsurge of educational informatization in the world, including in English learning and teaching. The improvement of teachers' information literacy is the key to the success of the current educational informatization reform. From the perspectives of…
Ceklic, Tijana; Bastien, Célyne H
Insomnia sufferers (INS) are cortically hyperaroused during sleep, which seems to translate into altered information processing during nighttime. While information processing during wake, as measured by event-related potentials (ERPs), appears to be associated with the sleep quality of the preceding night, the existence of such an association during nighttime has never been investigated. This study aims to investigate nighttime information processing among good sleepers (GS) and INS while considering concomitant sleep quality. Following a multistep clinical evaluation, INS and GS participants underwent 4 consecutive nights of PSG recordings in the sleep laboratory. Thirty-nine GS (mean age 34.56±9.02) and twenty-nine INS (mean age 43.03±9.12) were included in the study. ERPs (N1, P2, N350) were recorded all night on Night 4 (oddball paradigm) during NREM sleep. Regardless of sleep quality, INS presented a larger N350 amplitude during SWS (p=0.042), while GS showed a larger N350 amplitude during late-night stage 2 sleep (p=0.004). Regardless of diagnosis, those who slept objectively well showed a smaller N350 amplitude (p=0.020), while those who slept subjectively well showed a smaller P2. Information processing seems to be associated with concomitant subjective and objective sleep quality for both GS and INS. However, INS show an alteration in information processing during sleep, especially for inhibition processes, regardless of their sleep quality. Copyright © 2015 Elsevier B.V. All rights reserved.
Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas
Humans are selective information processors who efficiently filter out goal-inappropriate stimulus information to maintain control over their actions. Nonetheless, stimuli which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors") frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases, we applied EEG recording in a stimulus identification task involving successive distractor-target presentation and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information. Copyright © 2014 the authors.
Aso, Kenji; Hanakawa, Takashi; Aso, Toshihiko; Fukuyama, Hidenao
The neural basis of temporal information processing remains unclear, but it is proposed that the cerebellum plays an important role through its internal clock or feed-forward computation functions. In this study, fMRI was used to investigate the brain networks engaged in perceptual and motor aspects of subsecond temporal processing without accompanying coprocessing of spatial information. Direct comparison between perceptual and motor aspects of time processing was made with a categorical-design analysis. The right lateral cerebellum (lobule VI) was active during a time discrimination task, whereas the left cerebellar lobule VI was activated during a timed movement generation task. These findings were consistent with the idea that the cerebellum contributed to subsecond time processing in both perceptual and motor aspects. The feed-forward computational theory of the cerebellum predicted increased cerebro-cerebellar interactions during time information processing. In fact, a psychophysiological interaction analysis identified the supplementary motor and dorsal premotor areas, which had a significant functional connectivity with the right cerebellar region during a time discrimination task and with the left lateral cerebellum during a timed movement generation task. The involvement of cerebro-cerebellar interactions may provide supportive evidence that temporal information processing relies on the simulation of timing information through feed-forward computation in the cerebellum.
Gao, S.; Mioc, Darka; Yi, X.L.
Background: There is great concern within health surveillance on how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data… For the representation of health information through Web-mapping applications, a standard format is still lacking to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been… facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution in better health data representation and initial exploration of the Web-based processing of health information. Conclusion: The designed…
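The OGC services mentioned above follow well-known request conventions. A hedged sketch of composing a WMS 1.3.0 GetMap request, the kind of standard interface such Web-mapping services build on; the endpoint and layer name are hypothetical placeholders, not from the study:

```python
# Sketch: building an OGC WMS 1.3.0 GetMap request URL with the standard
# query parameters. Base URL and layer name below are invented examples.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600):
    """Return a GetMap URL for one layer over a bounding box in EPSG:4326."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "EPSG:4326",
        # WMS 1.3.0 with EPSG:4326 uses lat,lon axis order in BBOX.
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "health:er_visits",
                     (45.0, -67.0, 47.5, -63.0))
print("REQUEST=GetMap" in url)  # True
```

Because the request is a plain parameterized URL, any standards-compliant client or browser can retrieve and overlay the rendered health layer, which is what makes OGC services attractive for sharing surveillance maps.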
Arlinah Imam Rahardjo
PCU-CAMEL (Petra Christian University - Computer Aided Mechanical Engineering Department Learning Environment) has been developed to integrate the use of this web-based learning environment into the traditional, face-to-face setting of class activities. This integrated learning method is designed as an effort to enrich and improve the teaching-learning process at Petra Christian University. A study was conducted to introduce the use of PCU-CAMEL as a tool for evaluating the teaching-learning process, using a case analysis of the integration of PCU-CAMEL into the traditional face-to-face meetings of the LIS (Library Information System) class at the Informatics Engineering Department of Petra Christian University. Students' responses documented in some features of PCU-CAMEL were measured and analyzed to evaluate the effectiveness of this integrated system in developing the intrinsic motivation of the LIS students of the first and second semester of 2004/2005 to learn. It is believed that intrinsic motivation can drive students to learn more. From the study conducted, it is concluded that besides its capability in developing intrinsic motivation, PCU-CAMEL, as a web-based learning environment, can also serve as an effective tool for both students and instructors to evaluate the teaching-learning process. However, some weaknesses did exist in using this method of evaluating the teaching-learning process: the free-style and unstructured form of the documentation features of this web-based learning environment can lead to ineffective evaluation results.
Axson, Sydney A; Giordano, Nicholas A; Hermann, Robin M; Ulrich, Connie M
Informed consent is fundamental to the autonomous decision-making of patients, yet much is still unknown about the process in the clinical setting. In an evolving healthcare landscape, nurses must be prepared to address patient understanding and participate in the informed consent process to better fulfill their well-established role as patient advocates. This study examines hospital-based nurses' experiences and understandings of the informed consent process. This qualitative descriptive study used a semi-structured interview approach to identify thematic concerns, experiences, and knowledge of informed consent across a selected population of clinically practicing nurses. Participants and research context: In all, 20 baccalaureate-prepared registered nurses practicing in various clinical settings (i.e. critical care, oncology, medical/surgical) at a large northeastern academic medical center in the United States completed semi-structured interviews and a demographic survey. The mean age of participants was 36.6 years, with a mean of 12.2 years of clinical experience. Ethical considerations: Participation in this study involved minimal risk and no invasive measures. This study received Institutional Review Board approval from the University of Pennsylvania. All participants voluntarily consented. The majority of participants (N = 19) believed patient safety is directly linked to patient comprehension of the informed consent process. However, when asked if nurses have a defined role in the informed consent process, nearly half did not agree (N = 9). Through this qualitative approach, three major nursing roles emerged: the nurse as a communicator, the nurse as an advocate, and the clerical role of the nurse. This investigation contributes to the foundation of ethical research that will better prepare nurses for patient engagement, advance current understanding of informed consent, and allow for future development of solutions. Nurses are at the forefront of …
This review of safeguards information technology describes current developments and the status of safeguards in Member States, with particular attention to the role of domestic safeguards in cooperation with IAEA safeguards. A number of reports deal with declarations provided to the IAEA pursuant to Protocols Additional to safeguards agreements. The Information Section of the IAEA Safeguards Information Technology Division is responsible for the data entry, loading and quality control of State-supplied declarations. A software system is used to process the information so that it is readily accessible and usable in implementing the strengthened safeguards system. Experiences in combating illegal trafficking of nuclear materials in a number of countries are included.
Critchley, Frank; Dodson, Christopher
This book focuses on the application and development of information geometric methods in the analysis, classification and retrieval of images and signals. It provides introductory chapters to help those new to information geometry and applies the theory to several applications. This area has developed rapidly over recent years, propelled by the major theoretical developments in information geometry, efficient data and image acquisition and the desire to process and interpret large databases of digital information. The book addresses both the transfer of methodology to practitioners involved in database analysis and in its efficient computational implementation.
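As a small taste of the methodology the book covers, many information-geometric retrieval schemes rank stored signal models by a divergence from a query model. The sketch below is not from the book; it uses the standard closed form of the Kullback-Leibler divergence between univariate Gaussians, with a made-up three-entry "database", purely to illustrate ranking by divergence:

```python
import math

def kl_gaussian(mu_p, sigma_p, mu_q, sigma_q):
    """Closed-form KL divergence KL(p || q) between univariate Gaussians."""
    return (math.log(sigma_q / sigma_p)
            + (sigma_p**2 + (mu_p - mu_q)**2) / (2 * sigma_q**2)
            - 0.5)

# Rank a toy "database" of signal models by divergence from a query model.
query = (0.0, 1.0)                                   # (mean, std dev)
database = {"a": (0.1, 1.0), "b": (3.0, 1.0), "c": (0.0, 2.0)}
ranked = sorted(database, key=lambda k: kl_gaussian(*query, *database[k]))
print(ranked)  # nearest model first
```

In practice the Fisher-Rao geodesic distance, rather than the (asymmetric) KL divergence, is often preferred for exactly the symmetry reasons information geometry makes precise.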
Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy
Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk assessment, prevention, identification, communication and mitigation of harm.
Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong
'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.
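The article itself details the CRI mathematics; as a loose, hypothetical illustration of the "simpler than QFD or Kano" spirit, requirements tagged in customer feedback can be reduced to normalized priority weights. The feedback stream and requirement names below are invented for illustration and do not come from the article:

```python
from collections import Counter

# Hypothetical feedback stream: each entry tags one customer requirement.
feedback = ["fast_delivery", "low_price", "fast_delivery",
            "easy_returns", "fast_delivery", "low_price"]

counts = Counter(feedback)
total = sum(counts.values())
# Normalized weight per requirement: its share of all mentions.
weights = {req: n / total for req, n in counts.items()}

for req, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{req}: {w:.2f}")
```

A frequency-based weighting like this is far coarser than QFD's relationship matrices, but it needs no workshops or expert scoring, which is the kind of trade-off the CRI method targets for SMEs.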
Bodatsch, Mitja; Klosterkötter, Joachim; Müller, Ralf; Ruhrmann, Stephan
The basic symptoms (BS) approach provides a valid instrument for predicting psychosis onset and moreover represents a significant heuristic framework for research. The term "basic symptoms" denotes subtle changes of cognition and perception in the earliest and prodromal stages of psychosis development. BS are thought to correspond to disturbances of neural information processing. Following the heuristic implications of the BS approach, the present paper aims to explore disturbances of information processing, revealed by functional magnetic resonance imaging (fMRI) and electroencephalography (EEG), as characteristics of the at-risk state of psychosis. Furthermore, since high-risk studies employing ultra-high-risk criteria revealed non-conversion rates commonly exceeding 50%, thus warranting approaches that increase specificity, the potential contribution of neural information processing disturbances to psychosis prediction is reviewed. In summary, the at-risk state seems to be associated with information processing disturbances. Moreover, fMRI investigations suggested that disturbances of language processing domains might be a characteristic of the prodromal state. Neurophysiological studies revealed that disturbances of sensory processing may assist psychosis prediction by allowing for a quantification of risk in terms of magnitude and time. The latter finding represents a significant advancement, since an estimation of the time to event has not yet been achieved by clinical approaches. Some evidence suggests a close relationship between self-experienced BS and neural information processing. With regard to future research, the relationship between neural information processing disturbances and different clinical risk concepts warrants further investigation. A possible time sequence in the prodromal phase might thereby be of particular interest.
Wurch, Louie; Giannone, Richard J; Belisle, Bernard S; Swift, Carolyn; Utturkar, Sagar; Hettich, Robert L; Reysenbach, Anna-Louise; Podar, Mircea
Biological features can be inferred, based on genomic data, for many microbial lineages that remain uncultured. However, cultivation is important for characterizing an organism's physiology and testing its genome-encoded potential. Here we use single-cell genomics to infer cultivation conditions for the isolation of an ectosymbiotic Nanoarchaeota ('Nanopusillus acidilobi') and its host (Acidilobus, a crenarchaeote) from a terrestrial geothermal environment. The cells of 'Nanopusillus' are among the smallest known cellular organisms (100-300 nm). They appear to have a complete genetic information processing machinery, but lack almost all primary biosynthetic functions as well as respiration and ATP synthesis. Genomic and proteomic comparison with its distant relative, the marine Nanoarchaeum equitans illustrate an ancient, common evolutionary history of adaptation of the Nanoarchaeota to ectosymbiosis, so far unique among the Archaea.
Rossi, R.; Elliott, E. M.; Bain, D.; Crowley, K. J.; Steiner, M. A.; Divers, M. T.; Hopkins, K. G.; Giarratani, L.; Gilmore, M. E.
While energy links all living and non-living systems, the integration of energy, the environment, and society is often not clearly represented in grades 9-12 classrooms and informal learning venues. However, objective public learning that integrates these components is essential for improving public environmental literacy. ENERGY-NET (Energy, Environment and Society Learning Network) is a National Science Foundation funded initiative that uses an Earth Systems Science framework to guide experimental learning for high school students and to improve public learning opportunities regarding the energy-environment-society nexus in a museum setting. One of the primary objectives of the ENERGY-NET project is to develop a rich set of experimental learning activities that are presented as exhibits at the Carnegie Museum of Natural History in Pittsburgh, Pennsylvania (USA). Here we detail the evolution of the ENERGY-NET exhibit building process and the subsequent evolution of exhibit content over the past three years. While preliminary plans included the development of five "exploration stations" (i.e., traveling activity carts) per calendar year, the opportunity arose to create a single, larger topical exhibit per semester, which was assumed to have a greater impact on museum visitors. Evaluative assessments conducted to date reveal important practices to be incorporated into ongoing exhibit development: 1) Undergraduate mentors and teen exhibit developers should receive additional content training to allow richer exhibit materials. 2) The development process should be distributed over as long a time period as possible and should emphasize iteration. This project can serve as a model for other collaborations between geoscience departments and museums. In particular, these practices may streamline the development of public presentations and increase the effectiveness of experimental learning activities.