Abstract. Background: Proteins interact through specific binding interfaces that contain many residues in domains. Protein interactions thus occur on three different levels of a concept hierarchy: whole proteins, domains, and residues. Each level offers a distinct and complementary set of features for computationally predicting interactions, including functional genomic features of whole proteins, evolutionary features of domain families, and physical-chemical features of individual residues. The predictions at each level could benefit from using the features at all three levels; however, this is not trivial, as the features are provided at different granularities. Results: To link up the predictions at the three levels, we propose a multi-level machine-learning framework that allows for explicit information flow between the levels. We demonstrate, using representative yeast interaction networks, that our algorithm is able to utilize complementary feature sets to make more accurate predictions at the three levels than when the three problems are approached independently. To facilitate application of our multi-level learning framework, we discuss three key aspects of multi-level learning and the corresponding design choices made in our implementation of a concrete learning algorithm. (1) Architecture of information flow: we show the greater flexibility of bidirectional flow over independent levels and unidirectional flow. (2) Coupling mechanism of the different levels: we show how coupling can be accomplished by augmenting the training sets at each level, and discuss how soft coupling prevents error propagation between levels. (3) Sparseness of data: we show that the multi-level framework compounds data-sparsity issues, and discuss how this can be dealt with by building local models in information-rich parts of the data. Our proof-of-concept learning algorithm demonstrates the advantage of combining levels, and opens up
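The coupling mechanism sketched in this abstract (passing soft predictions from one level into the feature set of another) can be illustrated in a few lines. Everything below is a hypothetical toy: random data and a least-squares stand-in for the per-level learner, not the authors' actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: one feature matrix per level of the hierarchy
# (whole-protein level and domain level); rows are aligned candidate pairs.
X_protein = rng.normal(size=(200, 5))
X_domain = rng.normal(size=(200, 3))
y = (X_protein[:, 0] + X_domain[:, 0] > 0).astype(float)  # interaction labels

def fit_linear(X, y):
    """Least-squares 'learner' standing in for the per-level classifier."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # add intercept column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return lambda Xn: np.hstack([Xn, np.ones((len(Xn), 1))]) @ w

# Level 1: protein-level model trained on its own features.
protein_score = fit_linear(X_protein, y)(X_protein)

# Soft coupling: the downstream level receives the continuous score,
# not a hard 0/1 label, which limits error propagation between levels.
X_domain_aug = np.hstack([X_domain, protein_score[:, None]])

def accuracy(X, y):
    pred = fit_linear(X, y)(X) > 0.5
    return (pred == (y > 0.5)).mean()

print(accuracy(X_domain, y), accuracy(X_domain_aug, y))
```

Passing the continuous score rather than a thresholded label is one simple reading of the "soft coupling" the abstract mentions for preventing error propagation.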
Nielson, Hanne Riis; Nielson, Flemming
Information flow control extends access control by not only regulating who is allowed to access what data but also the subsequent use of the data. Applications within communications systems require such information flow control to be dependent on the actual contents of the data. We develop...
This brochure describes the Border Information Flow Architecture (BIFA). The Transportation Border Working Group, a bi-national group that works to enhance coordination and planning between the United States and Canada, identified collaboration on th...
Danielly Oliveira Inomata
Introduction: Information flows are vital for the support of processes, decision making and product development in organizations. Objective: To present and describe models of information flows found in the literature and disseminated in Information Science, highlighting the stages, contexts and key outcomes identified. Methodology: Exploratory search, using the following criteria for the selection of models: (a) presentation of a schematic model; (b) description of the steps that make up the flow. Results: Eight models are detailed, which efficiently represent the process of information management based on information flows. The models have similar features but specific biases, towards communication, information management, or the cognitive level. We identified elements and aspects that influence the flow of information. Conclusions: In the organizational context, the adding of value to information should be aligned with the goals of the organization. The study of flows makes it possible to characterize a lean and simple process, from the identification of its elements, with humans as knowledge artifacts and part of this process.
Petkova, Valia T.; Lu Yuan; Ion, Roxana A.; Sander, Peter C.
It is well known [Reliab. Eng. Syst. Saf. 75 (2002) 295] that in modern development processes it is essential to have an information flow structure that facilitates fast feedback from product users (customers) to departments at the front end, in particular development and production. As information is only relevant if it is used when taking decisions, this paper presents a guideline for building field feedback information flows that facilitate decision taking during the product creation and realisation process. The guideline takes into consideration that the type of decisions depends on the span of control; therefore, following Parsons [Structure and Process in Modern Societies (1990)], the span of control is subdivided into three levels: strategic, tactical, and executive. The guideline is illustrated with a case in which it is used for analysing the quality of existing field feedback flows.
Baldan, Paolo; Beggiato, Alessandro; Lluch Lafuente, Alberto
Information flow techniques typically classify information according to suitable security levels and enforce policies that are based on binary relations between individual levels, e.g., stating that information is allowed to flow from one level to another. We argue that some information flow...... of competing agencies might agree to disclose their secrets, with individual disclosures being undesired, etc. Motivated by this, we propose a simple language for expressing information flow policies where the usual admitted-flow relation between individual security levels is replaced by a relation between sets...... of security levels, thus making it possible to capture coordinated flows of information. The flow of information is expressed in terms of causal dependencies, and the satisfaction of a policy is defined with respect to an event structure that is assumed to capture the causal structure of system computations. We suggest...
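The set-based policies described in this abstract admit a minimal executable reading: a flow is permitted only when some policy entry covers the whole coordinated set of sources. The agency names and the subset-checking convention below are invented purely for illustration.

```python
# A hypothetical encoding of the idea: instead of allowing flows between
# individual levels (A -> C), the policy relates *sets* of levels, so a
# disclosure is admitted only when performed jointly by all listed parties.
allowed = {
    (frozenset({"AgencyA", "AgencyB"}), frozenset({"Public"})),
}

def flow_permitted(sources, targets):
    """Permit a flow only if some policy entry's source set is fully
    contained in the coordinated source set; lone disclosures fail."""
    return any(src <= frozenset(sources) and frozenset(targets) <= dst
               for src, dst in allowed)

print(flow_permitted({"AgencyA", "AgencyB"}, {"Public"}))  # joint disclosure
print(flow_permitted({"AgencyA"}, {"Public"}))             # lone disclosure
```

Under this toy convention the joint disclosure is allowed while either agency acting alone is denied, mirroring the "coordinated flows" the abstract motivates.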
Barbiero, Marie; Rousseau, Célia; Papaxanthis, Charalambos; White, Olivier
Whether the central nervous system is capable of switching between contexts critically depends on experimental details. Motor control studies regularly adopt robotic devices to perturb the dynamics of a certain task. Other approaches investigate motor control by altering the gravitoinertial context itself, as in parabolic flights and human centrifuges. In contrast to conventional robotic experiments, where only the hand is perturbed, these gravitoinertial or immersive settings coherently plunge participants into new environments. However radically different these settings are, perfect adaptation of motor responses is commonly reported. In object manipulation tasks, this translates into a good matching of the grasping force, or grip force, to the destabilizing load force. One possible bias in these protocols is the predictability of the forthcoming dynamics. Here we test whether the successful switching and adaptation processes observed in immersive environments are a consequence of the fact that participants can predict the perturbation schedule. We used a short-arm human centrifuge to decouple the effects of space and time on the dynamics of an object manipulation task by adding an unnatural, explicitly position-dependent force. We created different dynamical contexts by asking 20 participants to move the object at three different paces. These contextual sessions were interleaved such that we could simulate concurrent learning. We assessed adaptation by measuring how grip force was adjusted to this unnatural load force. We found that the motor system can switch between new, unusual dynamical contexts, as reflected by surprisingly well-adjusted grip forces, and that this capacity is not a mere consequence of the ability to predict the time course of the upcoming dynamics. We posit that a coherent flow of multimodal sensory information arising in a homogeneous milieu allows switching between dynamical contexts.
Wadman, W.; Bloemhof, G.; Crommelin, D.; Frank, J.; Ozdemir, A.
This paper presents a probabilistic power flow model subject to connection temperature constraints. Renewable power generation is included and modelled stochastically in order to reflect its intermittent nature. In contrast to conventional models that enforce connection current constraints,
This book is conceived as an introductory text on the theory of syntactic and semantic information, and information flow. Syntactic information theory is concerned with the information contained in the very fact that some signal has a non-random structure. Semantic information theory is concerned with the meaning or information content of messages and the like. The theory of information flow is concerned with deriving some piece of information from another. The main part will take us to situation semantics as a foundation of modern approaches in information theory. We give a brief overview o
Lyoo, Youngwook; Kim, Jieun E; Yoon, Sujung
The human connectome is a complex network that transmits information between interlinked brain regions. Using graph theory, well-known network measures of integration between brain regions have previously been constructed under the key assumption that information flows strictly along the shortest possible paths between two nodes. However, it is now apparent that information does flow through non-shortest paths in many real-world networks, such as cellular networks, social networks, and the internet. In the current hypothesis, we present a novel framework that uses the maximum flow to quantify information flow along all possible paths within the brain, implementing an analogy to network traffic. We hypothesize that the connection strengths of brain networks represent a limit on the amount of information that can flow through the connections per unit of time. This allows us to compute the maximum amount of information flow between two brain regions along all possible paths. Using this framework of maximum flow, previous network topological measures are expanded to account for information flow through non-shortest paths. The most important advantage of the maximum-flow approach is that it can integrate weighted connectivity data in a way that better reflects the real information flow of the brain network. The framework and its concept of maximum flow provide insight into how network structure shapes information flow, in contrast to graph theory, and suggest future applications such as investigating structural and functional connectomes at a neuronal level. Copyright © 2017 Elsevier Ltd. All rights reserved.
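As a concrete reading of the max-flow analogy (the four-region network and its capacities below are invented for illustration; the abstract itself reports no code), the maximum information flow between two regions can be computed with a standard Edmonds-Karp routine over connection strengths:

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp maximum flow; `capacity` is a dict-of-dicts of edge
    capacities, standing in for connection strengths between regions."""
    residual = {u: dict(vs) for u, vs in capacity.items()}
    for u in list(capacity):
        for v in capacity[u]:
            residual.setdefault(v, {}).setdefault(u, 0)  # reverse edges
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, cap in residual.get(u, {}).items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return total
        # Find the bottleneck along the path and push that much flow.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        total += bottleneck

# Toy 4-region network with two parallel routes from 'A' to 'D': the
# maximum information flow exceeds what any single shortest path carries.
regions = {"A": {"B": 3, "C": 2}, "B": {"D": 3}, "C": {"D": 2}, "D": {}}
print(max_flow(regions, "A", "D"))
```

Here both routes contribute (3 via B plus 2 via C), which is the point the abstract makes against shortest-path-only measures.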
Nielson, Hanne Riis; Nielson, Flemming; Li, Ximeng
Information flow control extends access control by not only regulating who is allowed to access what data but also the subsequent use of the data accessed. Applications within communication networks require such information flow control to depend on the actual data. For a concurrent language...... with synchronous communication and separate data domains we develop a Hoare logic for enforcing disjunctive information flow policies. We establish the soundness of the Hoare logic with respect to an operational semantics and illustrate the development on a running example....
Agricultural informational flow in informal communication networks of farmers in Ghana. ... should identify such farmers who can serve as intermediaries between actors to help disseminate information in rural communities. Keywords: key communicators, farmers, rural communities, social networks, extension agents ...
Parraguez, Pedro; Maier, Anja
Complex engineering design projects need to simultaneously manage multiple information flows across design activities associated with different areas of the design process. Previous research in this area has mostly focused either on analysing the “required information flows” through activity...... networks at the project level or on studying the social networks that deliver the “actual information flow”. In this paper we propose and empirically test a model and method that integrates both social and activity networks into one compact representation, making it possible to compare actual and required...... information flows between design spaces, and to assess the influence that these misalignments could have on the performance of engineering design projects....
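In its simplest form, the comparison of "required" versus "actual" flows can be read as a difference between two edge sets; the design-space names below are hypothetical, not taken from the paper.

```python
# Hypothetical edge lists: "required" flows derived from the activity
# network and "actual" flows observed in the social (communication) network.
required = {("CAD", "FEA"), ("FEA", "Test"), ("CAD", "Test")}
actual = {("CAD", "FEA"), ("FEA", "Test"), ("FEA", "Mgmt")}

missing = required - actual    # needed by activities but not happening
unplanned = actual - required  # happening but not required by any activity
print(sorted(missing), sorted(unplanned))
```

The `missing` set is one crude proxy for the misalignments whose performance impact the paper sets out to assess.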
Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis
We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire...... information flow through the VHDL program. The result of the analysis is presented as a non-transitive directed graph that connects those nodes (representing either variables or signals) where an information flow might occur. We compare our approach to that of Kemmerer and conclude that our approach yields...
Patrick, R. L.
This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…
Grusho, Alexander A. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation); Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow (Russian Federation); Grusho, Nick A.; Timonina, Elena E. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation)
The paper deals with the architecture of a content recognition system. To analyze the problem, a stochastic model of content recognition in information flows was built. We proved that under certain conditions it is possible to solve part of the problem correctly with probability 1 by viewing a finite section of the information flow. This means that a good architecture consists of two steps: the first step correctly determines certain subsets of contents, while the second step may demand much more time for a true decision.
Nielson, Flemming; Nielson, Hanne Riis; Vasilikos, Panagiotis
One of the key demands of cyberphysical systems is that they meet their safety goals. Timed Automata has established itself as a formalism for modelling and analysing the real-time safety aspects of cyberphysical systems. Increasingly it is also demanded that cyberphysical systems meet a number o...... of security goals for confidentiality and integrity. Information Flow Control is an approach to ensuring that there are no flows of information that violate the stated security policy....
Jain, Rahil; Lutz, Barry
Frequency tuning has emerged as an attractive alternative to conventional pumping techniques in microfluidics. Oscillating (AC) flow driven through a passive valve can be rectified to create steady (DC) flow, and tuning the excitation frequency to the characteristic (resonance) frequency of the underlying microfluidic network allows control of flow magnitude using simple hardware, such as an on-chip piezo buzzer. In this paper, we report that frequency tuning can also be used to control the direction (forward or backward) of the rectified DC flow in a single device. Initially, we observed that certain devices provided DC flow in the "forward" direction expected from previous work with a similar valve geometry, and the maximum DC flow occurred at the same frequency as a prominent peak in the AC flow magnitude, as expected. However, devices of a slightly different geometry provided the DC flow in the opposite direction and at a frequency well below the peak AC flow. Using an equivalent electrical circuit model, we found that the "forward" DC flow occurred at the series resonance frequency (with large AC flow peak), while the "backward" DC flow occurred at a less obvious parallel resonance (a valley in AC flow magnitude). We also observed that the DC flow occurred only when there was a measurable differential in the AC flow magnitude across the valve, and the DC flow direction was from the channel with large AC flow magnitude to that with small AC flow magnitude. Using these observations and the AC flow predictions from the equivalent circuit model, we designed a device with an AC flowrate frequency profile that was expected to allow the DC flow in opposite directions at two distinct frequencies. The fabricated device showed the expected flow reversal at the expected frequencies. This approach expands the flow control toolkit to include both magnitude and direction control in frequency-tuned microfluidic pumps. The work also raises interesting questions about the
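The two resonances invoked in this abstract can be reproduced in a minimal equivalent-circuit sketch (all component values hypothetical, not those of the reported devices): fluidic inertance plays the role of inductance, viscous channel resistance of resistance, and compliance of capacitance. The series resonance then appears as a peak in AC flow magnitude and the parallel resonance of the tank branch as a valley.

```python
import numpy as np

R, L1 = 1.0, 1e-3           # series channel: resistance + inertance
r, L2, C = 0.5, 1e-3, 1e-6  # tank branch: lossy inertance parallel to compliance

f = np.linspace(200.0, 20000.0, 20000)  # frequency sweep in Hz
w = 2 * np.pi * f

# Input impedance: series (R + jwL1) feeding an L2 || C "tank".
Z_L = r + 1j * w * L2
Z_tank = Z_L / (1 + 1j * w * C * Z_L)
Z = R + 1j * w * L1 + Z_tank
ac_flow = 1.0 / np.abs(Z)  # AC flow magnitude for a unit driving pressure

f_peak = f[np.argmax(ac_flow)]    # series resonance: prominent AC flow peak
f_valley = f[np.argmin(ac_flow)]  # parallel resonance: valley in AC flow

f_parallel_theory = 1.0 / (2 * np.pi * np.sqrt(L2 * C))
print(f_valley, f_peak, f_parallel_theory)
```

With these toy values the valley sits at the tank's resonance (about 5 kHz) and the flow peak lies above it, qualitatively matching the abstract's observation that the "backward" DC flow occurs at a less obvious parallel resonance well below the AC flow peak.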
Andrievski, Rostislav A.; Klyuchareva, Svetlana V.
The development of nanotechnology is accompanied by an intensive growth of the information flow, which is especially noticeable in the journal information flow. Worldwide there are now 69 nano-titled journals with an impact factor and/or a settled periodicity, as well as about 70 that lack a stable periodicity and are still at an organizational stage. Only 49 nano-titled journals have an impact factor, with a comparatively high mean value of about 3.44. The domestic nano-titled journals published in Russia, India, China, and other countries are also considered. Notably, in the 2006–2010 period 95 new nano-titled journals were launched, and in 2011 this process continued and seems to be most impressive. Many nano-related journals (including classical physics, chemistry, and materials science journals) are also described and discussed.
Gao, Liang; Song, Chaoming; Gao, Ziyou; Barabási, Albert-László; Bagrow, James P.; Wang, Dashun
Recent advances on human dynamics have focused on the normal patterns of human activities, with the quantitative understanding of human behavior under extreme events remaining a crucial missing chapter. This has a wide array of potential applications, ranging from emergency response and detection to traffic control and management. Previous studies have shown that human communications are both temporally and spatially localized following the onset of emergencies, indicating that social propagation is a primary means to propagate situational awareness. We study real anomalous events using country-wide mobile phone data, finding that information flow during emergencies is dominated by repeated communications. We further demonstrate that the observed communication patterns cannot be explained by inherent reciprocity in social networks, and are universal across different demographics.
Klepper, G.; Peterson, S.
The upcoming European Emissions Trading Scheme (ETS) is one of the more controversial climate policy instruments. Predictions about its likely impact and its performance can at present only be made to a certain degree. As long as the National Allocation Plans are not finally settled, the overall supply of allowances is not determined. In this paper we identify key features and key impacts of the EU ETS by scanning the range of likely allocation plans using the simulation model DART. The analysis of the simulation results highlights a number of interesting details in terms of allowance trade flows between member countries, of allowance prices, and of the role of the accession countries in the ETS.
Pinazza, O.; Augustinus, A.; Chochula, P.Ch.; Jirden, L.S.; Lechman, M.; Rosinsky, P.; Cataldo, G. de; Kurepin, A.N.; Moreno, A.
ALICE is one of the experiments at the Large Hadron Collider (LHC) at CERN in Geneva, Switzerland. The ALICE detector control system (DCS) is an integrated system collecting 18 different detectors' controls and general services. DCS is implemented using the commercial SCADA package PVSS. Information of general interest, such as beam and condition data, and data related to shared plants or systems, are made available to all the subsystems via the distribution capabilities of PVSS. Great care has been taken to build a modular and hierarchical system, limiting the inter-dependencies of the various subsystems. Accessing remote resources in a PVSS distributed environment is very simple and can be initiated unilaterally. In order to improve the reliability of distributed data and to avoid unforeseen and unwished dependencies, the ALICE DCS group has enforced the centralization of global data required by the subsystems. A tool has been developed to monitor the level of inter-dependency and to understand the optimal layout of the distributed connections, allowing for an interactive visualization of the distribution topology. (authors)
McCabe, E.R.B.; Towbin, J.A. (Baylor College of Medicine, Houston, TX (United States)); Engh, G. van den; Trask, B.J. (Lawrence Livermore National Lab., CA (United States))
Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.
Pramana – Journal of Physics, Volume 59, Issue 2. Information flow in quantum teleportation ... Keywords: quantum information; quantum teleportation; parameter independence. Abstract: The flow of information is discussed in the context of quantum teleportation. Situations are described which use a sequence of ...
Allahverdyan, Armen E; Janzing, Dominik; Mahler, Guenter
A basic task of information processing is information transfer (flow). Here we study a pair of Brownian particles, each coupled to a thermal bath, at temperatures T1 and T2. The information flow in such a system is defined via the time-shifted mutual information. The information flow vanishes at equilibrium, and its efficiency is defined as the ratio of the flow to the total entropy production in the system. For a stationary state the information flows from higher to lower temperatures, and its efficiency is bounded from above by max(T1, T2)/|T1 − T2|. This upper bound is imposed by the second law, and it quantifies the thermodynamic cost of information flow in the present class of systems. It can be reached in the adiabatic situation, where the particles have widely different characteristic times. The efficiency of heat flow, defined as the heat flow over the total amount of dissipated heat, is limited from above by the same factor. There is a complementarity between heat and information flow: the set-up which is most efficient for the former is the least efficient for the latter, and vice versa. The above bound on the efficiency can be (transiently) overcome in certain non-stationary situations, but the efficiency is still limited from above. We also study another measure of information processing (transfer entropy) proposed in the literature. Though this measure does not require any thermodynamic cost, the information flow and transfer entropy are shown to be intimately related for stationary states.
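The time-shifted mutual information used above as the definition of information flow is easy to estimate for a toy pair of signals. The coupled-Langevin setup of the abstract is replaced here by a simple delayed linear coupling, purely for illustration:

```python
import numpy as np

def mutual_info(x, y, bins=16):
    """Plug-in histogram estimate of the mutual information I(X; Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def shifted_mi(x, y, s):
    """Time-shifted mutual information I(x(t); y(t + s)); MI is symmetric,
    so negative shifts reduce to the positive-shift case with x, y swapped."""
    return mutual_info(x[:len(x) - s], y[s:]) if s >= 0 else shifted_mi(y, x, -s)

rng = np.random.default_rng(1)
n, lag = 20000, 5
x = rng.normal(size=n)
y = 0.45 * rng.normal(size=n)
y[lag:] += 0.9 * x[:-lag]   # y receives x's signal with a delay of `lag` steps

tsmi = {s: shifted_mi(x, y, s) for s in range(-10, 11)}
best = max(tsmi, key=tsmi.get)
print(best)
```

The shift that maximizes the time-shifted mutual information recovers both the direction of the flow (from x to y, since the peak is at a positive shift) and its delay.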
Rabinovich, Mikhail I.; Afraimovich, Valentin S.; Bick, Christian; Varona, Pablo
The timing and dynamics of information in the brain are a hot field in modern neuroscience. The analysis of the temporal evolution of brain information is crucially important for understanding higher cognitive mechanisms in normal and pathological states. From the perspective of information dynamics, in this review we discuss working memory capacity, language dynamics, goal-dependent behavior programming and other functions of brain activity. In contrast with the classical description of information theory, which is mostly algebraic, the dynamics of brain information flow deals with problems such as the stability/instability of information flows, their quality, the timing of sequential processing, the top-down cognitive control of perceptual information, and information creation. In this framework, different types of information flow instabilities correspond to different cognitive disorders, while the robustness of cognitive activity is related to the control of information flow stability. We discuss these problems using both experimental and theoretical approaches, and we argue that brain activity is better understood by considering information flows in the phase space of the corresponding dynamical model. In particular, we show how theory helps to understand intriguing experimental results in this matter, and how recent knowledge inspires new theoretical formalisms that can be tested with modern experimental techniques.
Patrycja Vasilyev Missiuro
Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins central to the transmission of biological information throughout the network. In the information flow analysis, we represent an interactome network as an electrical circuit, where interactions are modeled as resistors and proteins as interconnecting junctions. Construing the propagation of biological signals as flow of electrical current, our method calculates an information flow score for every protein. Unlike previous metrics of network centrality such as degree or betweenness that only consider topological features, our approach incorporates confidence scores of protein-protein interactions and automatically considers all possible paths in a network when evaluating the importance of each protein. We apply our method to the interactome networks of Saccharomyces cerevisiae and Caenorhabditis elegans. We find that the likelihood of observing lethality and pleiotropy when a protein is eliminated is positively correlated with the protein's information flow score. Even among proteins of low degree or low betweenness, high information scores serve as a strong predictor of loss-of-function lethality or pleiotropy. The correlation between information flow scores and phenotypes supports our hypothesis that the proteins of high information flow reside in central positions in interactome networks. We also show that the ranks of information flow scores are more consistent than those of betweenness when a large amount of noisy data is added to an interactome. Finally, we
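The electrical-circuit construction described above (interactions as resistors, proteins as junctions) can be sketched by solving the graph Laplacian for node potentials under a unit source-sink current. The five-node network and the per-node current score below are illustrative only, not the paper's data or its exact scoring scheme.

```python
import numpy as np

# Hypothetical weighted interactome: conductance = interaction confidence.
# Nodes 0..4; node 2 is the sole bridge between the two triangles.
edges = [(0, 1, 1.0), (0, 2, 1.0), (1, 2, 1.0),
         (2, 3, 1.0), (2, 4, 1.0), (3, 4, 1.0)]
n = 5
L = np.zeros((n, n))  # graph Laplacian weighted by conductances
for i, j, g in edges:
    L[i, i] += g; L[j, j] += g
    L[i, j] -= g; L[j, i] -= g

def node_currents(source, sink):
    """Inject 1 unit of current at `source`, extract it at `sink`, solve
    L v = b with the sink grounded, and return the current through each node
    (half the sum of absolute currents on its incident edges)."""
    b = np.zeros(n); b[source] = 1.0
    keep = [k for k in range(n) if k != sink]  # ground the sink node
    v = np.zeros(n)
    v[keep] = np.linalg.solve(L[np.ix_(keep, keep)], b[keep])
    through = np.zeros(n)
    for i, j, g in edges:
        c = abs(g * (v[i] - v[j]))
        through[i] += c / 2; through[j] += c / 2
    through[source] = through[sink] = 1.0  # endpoints carry the full current
    return through

score = node_currents(0, 4)
print(score)
```

Because node 2 is a cut vertex between source and sink, the full unit of current passes through it, so it scores highest among the intermediate nodes regardless of the shortest-path structure, which is the intuition behind the information flow score.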
Ling, Bin; Allison, Colin; Nicholl, J. Ross; Moodley, Luke; Roberts, Dave
The Disabilities Information Flow (DIF) project at the University of St Andrews has sought to provide a means of efficiently managing all student disabilities information within the institution and provide appropriate role-based service interfaces for all staff who need to routinely interact with this information. This paper describes the software…
Damian, D.; Danvy, Olivier
We consider the administrative reductions of a Plotkin-style transformation into Continuation-Passing Style (CPS), and how they affect the result of a constraint-based control-flow analysis and, in particular, the least element in the space of solutions. We show that administrative reductions preserve...... the least solution. Preservation of least solutions solves a problem that was left open in Palsberg and Wand's article ‘CPS Transformation of Flow Information.’ Together, Palsberg and Wand's article and the present article show how to map in linear time the least solution of the flow constraints...... of a program into the least solution of the flow constraints of the CPS counterpart of this program, after administrative reductions. Furthermore, we show how to CPS transform control-flow information in one pass....
... burden hours or to CBP Form 4315. Type of Review: Extension (without change). Affected Public: Businesses... change to the burden hours. This document is published to obtain comments from the public and affected..., or other technological techniques or other forms of information. Title: Application for Allowance in...
Augustinus, A; Moreno, A; Kurepin, A N; De Cataldo, G; Pinazza, O; Rosinský, P; Lechman, M; Jirdén, L S
ALICE is one of the experiments at the Large Hadron Collider (LHC) at CERN in Geneva, Switzerland. The ALICE detector control system is an integrated system collecting 18 different detectors' controls and general services. It is implemented using the commercial SCADA package PVSS. Information of general interest, such as beam and condition data, and data related to shared plants or systems, are made available to all the subsystems via the distribution capabilities of PVSS. Great care has been taken to build a modular and hierarchical system, limiting the interdependencies of the various subsystems. Accessing remote resources in a PVSS distributed environment is very simple and can be initiated unilaterally. In order to improve the reliability of distributed data and to avoid unforeseen and unwished dependencies, the ALICE DCS group has enforced the centralization of global data required by the subsystems. A tool has been developed to monitor the level of interdependency and to understand the ...
Di Iorio, Concetta Tania; Carinci, Fabrizio; Brillante, Massimo
The EUBIROD project aims to perform a cross-border flow of diabetes information across 19 European countries using the BIRO information system, which embeds privacy principles and data protection mechanisms in its architecture (privacy by design). A specific task of EUBIROD was to investigate…
Yousefi, Nazila; Alibabaei, Ahmad
Managing the supply chain plays an important role in creating competitive advantages for companies. Adequate information flow in the supply chain is one of the most important issues in SCM. Therefore, using certain information systems can have a significant role in managing and integrating data and information within the supply chain. The pharmaceutical supply chain is more complex than many other supply chains, in the sense that it can affect social and political perspectives. On the other hand, managing the pharmaceutical supply chain is difficult because of its complexity and also because of government regulations in this field. Although Iran has progressed a lot in pharmaceutical manufacturing, there are still many unsolved issues in managing the information flow in the pharmaceutical supply chain. In this study, we reviewed the benefits of using different levels of an integrated information system in the supply chain and the possible challenges ahead.
Li, Ximeng; Nielson, Flemming; Nielson, Hanne Riis
The security validation of practical computer systems calls for the ability to specify and verify information flow policies that are dependent on data content. Such policies play an important role in concurrent, communicating systems: consider a scenario where messages are sent to different processes according to their tagging. We devise a security type system that enforces content-dependent information flow policies in the presence of communication and concurrency. The type system soundly guarantees a compositional noninterference property. All theoretical results have been formally proved…
James, Ryan G.; Barnett, Nix; Crutchfield, James P.
A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. At one and the same time they can overestimate flow or underestimate influence. We isolate why this is the case and propose several avenues to alternate measures for information flow. We also address an auxiliary consequence: The proliferation of networks as a now-common theoretical model for large-scale systems, in concert with the use of transferlike entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems. This interpretation thus fails to include the effects of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems may go undetected.
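For readers unfamiliar with the quantity under critique, here is a minimal sketch of a plug-in transfer entropy estimator (history length 1, binary symbols; the function name and toy setup are illustrative assumptions, not code from the paper):

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Estimate TE_{X->Y} in bits from two symbolic time series, using
    history length 1: TE = I(Y_{t+1}; X_t | Y_t), via plug-in counts."""
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_prev, x_prev)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_prev, x_prev)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_next, y_prev)
    singles_y = Counter(y[:-1])
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yp, xp)]          # p(y_next | y_prev, x_prev)
        p_cond_hist = pairs_yy[(yn, yp)] / singles_y[yp]  # p(y_next | y_prev)
        te += p_joint * math.log2(p_cond_full / p_cond_hist)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(100000)]
y = [0] + x[:-1]          # Y is a one-step lagged copy of X
print(round(transfer_entropy(x, y), 3))   # close to 1 bit
```

On a lagged copy the estimate approaches 1 bit, as expected; the paper's point is that a large value like this need not certify genuine pairwise information flow once polyadic dependencies among more than two variables enter the picture.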
Helke, Steffen; Kammüller, Florian; Probst, Christian W.
Refactoring means that a program is changed without changing its behaviour from an observer's point of view. Does the change of behaviour also imply that the security of the program is not affected by the changes? Using Myers and Liskov's distributed information flow control model DLM and its Java...
Hinich, Melvin J.; Molyneux, Robert E.
Discusses information flow in networks and predicting network traffic and describes a study that uses time series analysis on a day's worth of Internet log data. Examines nonlinearity and traffic invariants, and suggests that prediction of network traffic may not be possible with current techniques. (Author/LRW)
Full Text Available Given the frequent use of complex processes and the large volume of information, it is imperative to manage the automatic circuit of the document flow in a company's activity. The main advantage of such a system consists in the documents waiting to be processed
Märtens, M.; Meier, J.M.; Hillebrand, Arjan; Tewarie, Prejaas; Van Mieghem, P.F.A.
Recent work has revealed frequency-dependent global patterns of information flow by a network analysis of magnetoencephalography data of the human brain. However, it is unknown which properties of those functional brain networks, on a small subgraph scale, are dominant at different frequency bands.
Diez, Ibai; Erramuzpe, Asier; Escudero, Iñaki; Mateos, Beatriz; Cabrera, Alberto; Marinazzo, Daniele; Sanz-Arigita, Ernesto J; Stramaglia, Sebastiano; Cortes Diaz, Jesus M
The resting brain dynamics self-organize into a finite number of correlated patterns known as resting-state networks (RSNs). It is well known that techniques such as independent component analysis can separate the brain activity at rest to provide such RSNs, but the specific pattern of interaction between RSNs is not yet fully understood. To this aim, we propose here a novel method to compute the information flow (IF) between different RSNs from resting-state magnetic resonance imaging. After hemodynamic response function blind deconvolution of all voxel signals, and under the hypothesis that RSNs define regions of interest, our method first uses principal component analysis to reduce dimensionality in each RSN and then computes IF (estimated here in terms of transfer entropy) between the different RSNs by systematically increasing k (the number of principal components used in the calculation). When k=1, this method is equivalent to computing IF using the average of all voxel activities in each RSN. For k>1, our method calculates the multivariate IF between the different RSNs. We find that the average IF among RSNs is dimension dependent, increasing from k=1 (i.e., the average voxel activity) up to a maximum occurring at k=5, and finally decaying to zero for k≥10. This suggests that a small number of components (close to five) is sufficient to describe the IF pattern between RSNs. Our method, addressing differences in IF between RSNs for any generic data, can be used for group comparison in health or disease. To illustrate this, we have calculated the inter-RSN IF in a data set of Alzheimer's disease (AD) and found that the most significant differences between AD and controls occurred for k=2, in addition to AD showing increased IF with respect to controls. The spatial localization of the k=2 component, within RSNs, allows the characterization of IF differences between AD and controls.
and reference monitors, have been proposed in the context of programming languages and process calculi, to enforce such properties. The most widely used definitions of information flow security are noninterference-like properties. For concurrent systems where processes communicate with each other to accomplish computational tasks, fine-grained security policies can be formulated by distinguishing between whether communication can happen, and what is communicated. As the first contribution of this PhD thesis, we formulate a noninterference-like property that takes all combinations of sensitivity levels for “whether… to a classical one when the two dimensions are intentionally blurred. As the second contribution, we focus on the “what” dimension and further allow the flow policy to vary under different contents stored and communicated. This is the area of content-dependent (or conditional) information flow, which has…
Egedorf, Maren Marie; Villanueva Holm-Nielsen, Pablo
The traditional view of the disaster circle is phase based. Disaster and development professionals recognize that the actions carried out in the various phases of the disaster management cycle are overlapping and build upon each other, having resilience as the overall goal. However, information does not necessarily flow across the phases of the circle in an effective manner. This is particularly true for the information that crosses the disaster point of the circle. Organisations carry out assessments, surveys and baselines for various purposes, at various points of time in the disaster circle. Output…
Anselmi, C.E.; Anselmi, O.E.
A nuclear medicine information system that allows reporting and sending images through an intranet. Aim: This system was developed in order to improve the processes of typing, correcting, verifying and distributing the reports and images, improving the efficiency of the personnel in the nuclear medicine department and reducing the time between the creation of the report and its reading by the referring physician. Materials and Methods: The system runs a web server (Personal Web Server, Microsoft) which serves web pages written in hypertext markup language (HTML) and active server pages (ASP). The database utilized is Microsoft Access 97. All communication between the web server and the database is performed by the programs written in ASP. Integrating the images from the patients is done through a 486 IBM PC running Red Hat Linux, which serves as an intermediary between the isolated nuclear medicine network and the hospital's network. Results: The time between report verification and reading by the referring physician has decreased from approximately 24 hours to 12 hours. It is possible to run queries in the system in order to get productivity reports or to support clinical research. Image storage allows for correlation of current and previous studies. Conclusion: Bureaucratic processes have diminished to a certain extent in the department. Reports are now online as soon as they are verified by the nuclear medicine physician. There is no need to install dedicated software in the viewing stations since the whole system runs on the server
Metcalfe, Kiloran H M; Worsley, Calum A; Swerner, Casey B; Sinha, Devan; Solanki, Ravi; Ravi, Krithi; Dattani, Raj S
The 2014 Varsity Medical Ethics debate convened upon the motion: "This house believes that genetic information should not be commoditised". This annual debate between students from the Universities of Oxford and Cambridge, now in its sixth year, provided the starting point for arguments on the subject. The present article brings together and extends many of the arguments put forward during the debate. We explore the circumstances under which genetic material should be considered patentable, the possible effects of this on the research and development of novel therapeutics, and the need for clear guidelines within this rapidly developing field. The Varsity Medical Debate was first held in 2008 with the aim of allowing students to engage in discussion about ethics and policy within healthcare. Two Oxford medical students, Mahiben Maruthappu and Sanjay Budheo, founded the event. The event is held annually, and it is hoped that this will allow future leaders to voice a perspective on the arguments behind topics that will feature heavily in future healthcare and science policy. This year, the debate was hosted by the Oxford University Medical Society at the Oxford Union.
Marin-Spiotta, E.; Chadwick, O.; Kramer, M. G.
Most of our understanding of soil carbon (C) dynamics derives from the top 10 to 20 cm, although globally the majority of the bulk soil C pool is found below those depths. Mineral-associated C in deep soil is more stable than that held in surface horizons, and its long-term persistence may contribute to sequestration of anthropogenic C. Carbon can enter deep soil horizons in multiple ways: through biologically mediated or abiotic physical mixing, illuviation, root inputs, or through a physical disturbance that causes the burial of an originally shallow organic horizon. In this study, we investigated the role of dissolved organic matter (DOM) in the transport and stabilization of soil C in tropical rainforest volcanic soils, where high rainfall, a highly productive forest, and dominance of highly reactive, non-crystalline minerals contribute to large soil C stocks at depth with long mean residence times. DOM plays an important role in many biological and chemical processes in soils, including nutrient transfer within and across ecosystems. Carbon storage in these soils is linked to movement of both DOC and particulate organic C along infiltration pathways. Climate and soil mineralogical properties create the right conditions for C to be pumped from the organic horizons, where microbial activity is highest, to deep mineral horizons, where the potential for stabilization is greatest. High rainfall preserves hydrated short-range-order minerals that are subject to strong shrinkage during occasional drought periods. The resulting cracks in subsurface B horizons become pathways for DOM complexed with Fe and Al moving in soil solution during subsequent wet periods. Preferential flow of these organically rich solutes and/or colloids moves C to depth, where C, Fe and Al are preferentially deposited on near-vertical crack surfaces and along near-horizontal flow surfaces at horizon boundaries. Long-term deposition forms discontinuous Fe- and OM-cemented lamellae that serve to
Full Text Available Purpose: We show that the regulation of traffic is especially important under constrained infrastructure conditions. The aim of traffic-light regulation is to minimize and, where possible, eliminate encounters between conflicting streams of vehicles. The authors show that the applied aspects of traffic-light optimization are to eliminate gridlock situations and to ensure the safety of all road users. Methods: The method used is comparison: selecting the best technical solution and testing the feasibility of the mathematical and engineering tools for the practical implementation of the proposed technical solution. Results: The study shows the organizational preconditions for the introduction of traffic-light regulation based on the calculation of the cycle of the traffic-light object. For reasons of safety, a cycle time of more than 120 seconds is considered unacceptable, since with a longer wait drivers may assume the lights are faulty and start moving against the prohibiting signal. Discussion: In conclusion, the article reveals that divergent forms of intersection regulation can also be used under general stress on public roads.
Woodhouse, Francis G.; Fawcett, Joanna B.; Dunkel, Jörn
Recent experiments show that both natural and artificial microswimmers in narrow channel-like geometries will self-organise to form steady, directed flows. This suggests that networks of flowing active matter could function as novel autonomous microfluidic devices. However, little is known about how information propagates through these far-from-equilibrium systems. Through a mathematical analogy with spin-ice vertex models, we investigate here the input–output characteristics of generic incompressible active flow networks (AFNs). Our analysis shows that information transport through an AFN is inherently different from conventional pressure or voltage driven networks. Active flows on hexagonal arrays preserve input information over longer distances than their passive counterparts and are highly sensitive to bulk topological defects, whose presence can be inferred from marginal input–output distributions alone. This sensitivity further allows controlled permutations on parallel inputs, revealing an unexpected link between active matter and group theory that can guide new microfluidic mixing strategies facilitated by active matter and aid the design of generic autonomous information transport networks.
Lin, J.C.; He Wei
In recent years, many U.S. nuclear plants have applied for and received approval for the risk-informed extension of the Allowed Outage Time (AOT) for Emergency Diesel Generators (EDGs). These risk-informed applications need to meet the regulatory guidance on the risk criteria. This paper discusses in detail insights derived from the risk-informed analyses performed to support these applications. The risk criteria on ΔCDF/ΔLERF evaluate the increase in average risk from extending the AOT for EDGs, induced primarily by an increase in EDG maintenance unavailability due to the introduction of additional EDG preventive maintenance. By performing this preventive maintenance work on-line, the outage duration can be shortened. With proper refinement of the risk model, most plants can meet the ΔCDF/ΔLERF criteria for extending the EDG AOT from, for example, 3 days to 14 days. The key areas for model enhancement to meet these criteria include offsite/onsite power recovery, LERF modeling, etc. The most important LERF model enhancements consist of refinement of the penetrations included in the containment isolation model for the consideration of a large release, and taking credit for operator vessel depressurization during the time period between core damage and vessel failure. A recent study showed that although the frequency of loss of offsite power (LOSP) has decreased, the duration of offsite power recovery has actually increased. However, many of the events used to derive this conclusion may not be applicable to PRAs. One approach develops the offsite power non-recovery factor by first screening the LOSP events for applicability to the plant being analyzed, power operation, and the LOSP initiating event, and then using the remaining event data for the derivation, based on the fraction of events with recovery duration longer than the time window allowed. The risk criteria on ICCDP/ICLERP examine the increase in risk from the average CDF/LERF, based on the increased maintenance
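The screening-based derivation described above reduces to a simple fraction once the applicable events are selected. A minimal sketch, with hypothetical event data and an assumed function name:

```python
def offsite_power_nonrecovery(durations_hr, window_hr):
    """Offsite-power non-recovery factor: the fraction of applicable
    (already screened) loss-of-offsite-power events whose recovery took
    longer than the time window available before core damage."""
    longer = sum(1 for d in durations_hr if d > window_hr)
    return longer / len(durations_hr)

# hypothetical screened LOSP event data: hours to restore offsite power
events = [0.5, 1.2, 2.0, 4.5, 8.0, 0.3, 6.0, 12.0]
print(offsite_power_nonrecovery(events, window_hr=4))  # 4 of 8 exceed 4 h -> 0.5
```

The screening step (plant applicability, power operation, initiating event) happens before this calculation and is the part the abstract emphasizes, since unscreened events bias the factor upward.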
26 CFR § 301.6682-1 (revised as of 2010-04-01): False information with respect to withholding allowances based on itemized deductions. Internal Revenue, Assessable Penalties.
Recognizing the fundamental role of information flow in future transportation applications, the research team investigated the quality and security of information flow in the connected vehicle (CV) environment. The research team identified key challe...
Harrington, Heather A; Feliu, Elisenda; Wiuf, Carsten
recent developments from dynamical systems and chemical reaction network theory to identify and characterize the key role of the spatial organization of eukaryotic cells in cellular information processing. In particular, the existence of distinct compartments plays a pivotal role in whether a system is capable of multistationarity (multiple response states), and is thus directly linked to the amount of information that the signaling molecules can represent in the nucleus. Multistationarity provides a mechanism for switching between different response states in cell signaling systems and enables multiple…
Full Text Available Without having direct access to the information that is being exchanged, traces of information flow can be obtained by looking at temporal sequences of user interactions. These sequences can be represented as causality trees whose statistics result from a complex interplay between the topology of the underlying (social) network and the time correlations among the communications. Here, we study causality trees in mobile-phone data, which can be represented as a dynamical directed network. This representation of the data reveals the existence of super-spreaders and super-receivers. We show that the tree statistics, and hence the information spreading process, are extremely sensitive to the in-out degree correlation exhibited by the users. We also learn that a given piece of information, e.g., a rumor, would require users to retransmit it for more than 30 hours in order to cover a macroscopic fraction of the system. Our analysis indicates that topological node-node correlations of the underlying social network, while allowing the existence of information loops, also promote information spreading. Temporal correlations, and therefore causality effects, are only visible as local phenomena and during short time scales. Consequently, the very idea that there is (intentional) information spreading beyond a small vicinity is called into question. These results are obtained through a combination of theory and data analysis techniques.
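A hedged sketch of how causality trees might be assembled from timestamped interaction data, under the simple assumption that each event is attached to the most recent message its sender received within a time window (the window and the attachment rule are illustrative choices, not the paper's exact construction):

```python
from collections import defaultdict

def causality_forest(events, window):
    """events: time-sorted list of (t, src, dst). An event is attached as a
    child of the most recent prior event that delivered information to its
    source node within `window`; otherwise it starts a new causality tree."""
    last_in = {}                    # node -> index of last event it received
    parent = {}
    children = defaultdict(list)
    for i, (t, src, dst) in enumerate(events):
        j = last_in.get(src)
        if j is not None and t - events[j][0] <= window:
            parent[i] = j
            children[j].append(i)
        last_in[dst] = i
    roots = [i for i in range(len(events)) if i not in parent]
    return roots, children

events = [(0, 'a', 'b'), (1, 'b', 'c'), (2, 'c', 'd'), (10, 'b', 'a')]
roots, children = causality_forest(events, window=5)
print(roots)         # [0, 3]: the long pause breaks the causal chain
print(children[0])   # [1]: b forwards what it received from a
```

The tree-size and depth statistics discussed in the abstract would then be computed over such a forest; the sensitivity to in-out degree correlations shows up in how often chains extend versus terminate.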
Schubert, András; Somogyi, Anikó
In order to reveal impacts of natural and social sciences on each other, the authors examined connections between fields of medical and social sciences using a search for references and citations of scientific publications. 1. The largest affinity between the medical and social sciences was found between neurosciences and psychology, but there was a significant affinity between clinical sciences and general social sciences, as well. 2. The example of General & Internal Medicine papers in the topics of "diabetes" suggests that in the period 2001-2010 the share of references to social sciences increased significantly. In the meantime, social science papers in the same topics contained references to Clinical Medicine papers in a constantly high percentage. 3. In the sample under study, the age distribution of social science papers in the references did not differ significantly from that of the other sources. 4. The share of references to social science papers was found to be extremely high among Hungarian General & Internal Medicine papers in the topics of "diabetes". Nevertheless, this finding still requires clarification, since, e.g., it was not supported by an institutional comparison including the largest Hungarian medical research university. 5. The intensity of the reference/citation-mediated information flows between the Hungarian medical journal Orvosi Hetilap and social sciences appears to be in accordance with the current international trends.
Full Text Available
Postural control is the result of the integration of different sensorial information. During complex movements, such as acrobatic skills in which a subject jumps and turns on the transversal axis, sensorial conflicts can appear, especially between visual and vestibular inputs. The importance of these conflicts during the learning and subsequent execution of an acrobatic manoeuvre is not clear. An experimental study was carried out in which we controlled the environmental illumination of the flying and landing phases of an acrobatic skill execution (forward tucked somersault) during the learning process. We obtained significant differences between different practice groups, with better results for those subjects who accomplished their practice without illumination during the landing phase. Our results suggest that although visual information might be important to perform the take-off phase correctly, it does not seem to be a determining factor in the final phase (landing) and could even interfere with vestibular information.
KEYWORDS: sensorimotor integration, vision, vestibular information, acrobatic activities
Russell, Lucian; Wolfson, Ouri; Yu, Clement
To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point of sale information, is being considered in the Demand Activated Manufacturing Project (DAMA) of the American Textile Partnership (AMTEX) project. A scenario is examined in which 100 000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26 000 suppliers through the use of bill of materials explosions at four levels of detail. Enabling this communication requires an approach that shares common features with both workflows and database management. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced so as to keep estimates of demand as current as possible.
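The bill-of-materials explosion that drives the demand estimates can be sketched as a recursive walk over an acyclic BOM; the product names, quantities, and function name below are hypothetical illustrations, not drawn from the DAMA project:

```python
from collections import defaultdict

def explode_demand(retail_demand, bom):
    """Propagate point-of-sale demand estimates down a bill of materials.
    bom: product -> list of (component, qty_per_unit). Returns the total
    demand induced for every item at every level (assumes an acyclic BOM)."""
    demand = defaultdict(float)
    def visit(item, qty):
        demand[item] += qty
        for comp, per_unit in bom.get(item, []):
            visit(comp, qty * per_unit)
    for item, qty in retail_demand.items():
        visit(item, qty)
    return dict(demand)

# hypothetical sewn-products BOM: a shirt needs fabric and buttons,
# fabric in turn needs yarn
bom = {
    'shirt': [('fabric_m2', 1.5), ('buttons', 7)],
    'fabric_m2': [('yarn_kg', 0.2)],
}
print(explode_demand({'shirt': 1000}, bom))
```

In the scenario described above, the information flow manager would run such explosions at four levels of detail and refresh the estimates as point-of-sale data arrives, approximating when some suppliers fail to report.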
Abbott, Mark W. [Flowserve Corporation, 1978 Foreman Drive Cookeville, TN 38506 (United States)
, yet allows the installation of process monitoring instruments, such as a turbidity meter, to be placed in the flow stream. The basis of the design is a valve body which, rather than having a directly mounted bonnet, has lengths of concentric pipe added, which move the bonnet away from the valve body. The pipe is conceptually similar to an oil field well, with the various strings of casing and tubing installed. Each concentric pipe provides a required function; for example, the outermost pipes, the valve sleeve and penetration sleeve, provide structural support to the deck flange. For plug-valve-based designs, the next inner pipe provides compression on the environmental seals at the top of the body-to-bonnet joint, followed by the innermost pipe, which provides rotation of the plug in the same manner as an extended stem. Ball valve ESVs have an additional pipe to provide compressive loading on the stem packing. Due to the availability of standard pipe grades and weights, the product can be configured to fit a wide array of valve sizes and application lengths, with current designs as short as seven inches and as tall as 18 feet. Central to the design is the requirement for no special tools or downhole tools to remove parts or configure the product. Off-the-shelf wrenches, sockets or other hand tools are all that is required. Compared to other products historically available, this design offers a lightweight option which, while not as rigidly stiff, can deflect compliantly under extreme seismic loading rather than break. Application conditions vary widely, as the base product is 316 and 304 stainless steel, but utilizes 17-4PH and other alloys as needed based on the temperature range and mechanical requirements. Existing designs are installed in applications as hot as 1400 deg. F at low pressure, and separately in highly radioactive environments. The selection of plug versus ball valve, metal versus soft seats, and the material of the seals and seats is all
Martinez Alvaro, O.; Nuñez Gonzalez, A.
National Post Offices manage huge volumes of letters and parcels. The data associated with these flows are growing fast, with a great variety related to the diversity of postal products. The research described in this paper has classified all information flows of Correos, the Spanish National Post Office. In spite of the complexity of the current postal service portfolio, only four categories of matrices are needed to classify all postal information flows. Thanks to the migration towards new products, analyses with simple techniques will provide more and better information in the future, due to the structured nature of existing databases. (Author)
van Engelenburg, S.H.; Janssen, M.F.W.H.A.; Klievink, A.J.; Tan, Y.; Janssen, Marijn; Axelsson, Karin; Glassey, Olivier; Klievink, Bram; Krimmer, Robert; Lindgren, Ida; Parycek, Peter; Scholl, Hans J.; Trutnev, Dmitrii
Advanced architectures for business-to-government (B2G) information sharing can benefit both businesses and government. An essential choice in the design of such an architecture is whether information is shared using a thick or a thin information flow. In an architecture with a thick flow, all
Danz, Mary E.
The Level 4 Mission Sequence Test (MST) was studied to develop strategies and recommendations to facilitate information flow. Recommendations developed as a result of this study include revised format of the Test and Assembly Procedure (TAP) document and a conceptualized software based system to assist in the management of information flow during the MST.
Haruna, Taichi; Fujiki, Yuuya
We investigate the influence of the small-world topology on the composition of information flow on networks. By appealing to the combinatorial Hodge theory, we decompose information flow generated by random threshold networks on the Watts-Strogatz model into three components: gradient, harmonic and curl flows. The harmonic and curl flows represent globally circular and locally circular components, respectively. The Watts-Strogatz model bridges the two extreme network topologies, a lattice network and a random network, by a single parameter that is the probability of random rewiring. The small-world topology is realized within a certain range between them. By numerical simulation we found that as networks become more random the ratio of harmonic flow to the total magnitude of information flow increases whereas the ratio of curl flow decreases. Furthermore, both quantities are significantly enhanced from the level when only network structure is considered for the network close to a random network and a lattice network, respectively. Finally, the sum of these two ratios takes its maximum value within the small-world region. These findings suggest that the dynamical information counterpart of global integration and that of local segregation are the harmonic flow and the curl flow, respectively, and that a part of the small-world region is dominated by internal circulation of information flow.
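As a minimal illustration of the decomposition's two circulating components (not the full combinatorial Hodge computation, which solves least-squares problems against the graph's incidence and curl operators), consider edge flows on a single cycle, where the split reduces to removing the mean circulation:

```python
def decompose_cycle_flow(f):
    """Split an edge flow on a single directed cycle into its circulating
    component (constant around the loop) and its gradient component
    (the orthogonal remainder, expressible as node-potential differences)."""
    circ = sum(f) / len(f)
    circulating = [circ] * len(f)
    gradient = [fi - circ for fi in f]
    return gradient, circulating

# Triangle: a 3-cycle is a face, so its circulating part lies in the curl subspace.
g3, c3 = decompose_cycle_flow([2.0, 1.0, 0.0])
# 4-cycle with no triangles: its circulating part is harmonic (a global loop).
g4, c4 = decompose_cycle_flow([1.0, 1.0, -1.0, 3.0])
print(c3, g3)   # [1.0, 1.0, 1.0] [1.0, 0.0, -1.0]
print(c4, g4)   # [1.0, 1.0, 1.0, 1.0] [0.0, 0.0, -2.0, 2.0]
```

The distinction the paper draws, local circulation around triangles (curl) versus circulation around longer loops (harmonic), is exactly what separates these two toy cases; on a general Watts-Strogatz graph both subspaces coexist and the projections must be computed jointly.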
Methodology: Morocco was selected for the case study. The researchers had ready access to key informants and information about the Logistics Management Information System. Because the study had time and resource constraints, the research consisted of desk reviews and interviews rather than data collection in the field.
Naus, Joeri; Spaargaren, Gert; Vliet, Bas J.M. van; Horst, Hilje M. van der
Smart energy grids and smart meters are commonly expected to promote more sustainable ways of living. This paper presents a conceptual framework for analysing the different ways in which smart grid developments shape – and are shaped by – the everyday lives of residents. Drawing upon theories of social practices and the concept of informational governance, the framework discerns three categories of ‘information flows’: flows between household members, flows between households and energy service providers, and flows between local and distant households. Based on interviews with Dutch stakeholders and observations at workshops, we examine, for all three information flows, the changes in domestic energy practices and the social relations they help to create. The analysis reveals, first, that new information flows may not produce more sustainable practices in linear and predictable ways; instead, changes are contextual and emergent. Second, new possibilities for information sharing between households open up a terrain for new practices. Third, information flows affect social relationships, as illustrated by the debates on consumer privacy in the Netherlands. An exclusive focus on privacy, however, diverts attention from opportunities for information disclosure by energy providers, and from the significance of transparency issues in redefining relationships both within and between households. - Highlights: • Smart grids generate three key new information flows that affect social relations. • Practice theory can reveal the ways in which households handle/govern information. • Householders show ambivalence about the workings of the different information flows. • Policies should account for the ‘bright’ as well as the ‘dark’ sides of information
From Fear to Flow explores how personality traits may influence attitudes, behaviour and reactions to information. Consideration is given to individual differences in information behaviour and the reasons behind individual search differences. The book reviews personality and information behaviour and discusses how personality may influence attitudes towards information. Reactions to information are examined in contexts such as everyday life, decision-making, work, studies and human-computer interaction. Introduces a little-researched area which is current and needed in our Informatio
Dogge, Myrthel; Hofman, Dennis; Boersma, Maria; Dijkerman, H Chris; Aarts, Henk
Building on the recent finding that agency experiences do not merely rely on sensorimotor information but also on cognitive cues, this exploratory study uses electroencephalographic recordings to examine functional connectivity during agency inference processing in a setting where action and outcome
Damian, Daniel; Danvy, Olivier
the least solution. Preservation of least solutions solves a problem that was left open in Palsberg and Wand's article ‘CPS Transformation of Flow Information.’ Together, Palsberg and Wand's article and the present article show how to map in linear time the least solution of the flow constraints...
Dmitriy Aleksandrovich Postoev
The article is devoted to a method of information-flow-based access control adapted for virtualized systems. A general structure of an access control system for virtual infrastructure is proposed.
We employ Clarkson and Schneider's "hyperproperties" to classify various verification problems of quantitative information flow. The results of this paper unify and extend the previous results on the hardness of checking and inferring quantitative information flow. In particular, we identify a subclass of liveness hyperproperties, which we call "k-observable hyperproperties", that can be checked relative to a reachability oracle via self-composition.
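Checking a two-run property via self-composition, as mentioned above, amounts to running the program twice on inputs that agree on the public part and looking for diverging public outputs. A minimal sketch with hypothetical toy programs (not from the paper):

```python
# Self-composition sketch: noninterference holds if two runs that agree
# on the public (low) input produce the same public output, regardless
# of the secret (high) input. The programs below are illustrative toys.

def program(low, high):
    # secure: the secret never influences the public result
    return low * 2

def violates_noninterference(prog, low, high1, high2):
    """Compose the program with itself on two secrets; a differing
    public output witnesses an information flow from high to low."""
    return prog(low, high1) != prog(low, high2)

assert not violates_noninterference(program, low=7, high1=0, high2=99)

def leaky(low, high):
    return low + (1 if high > 0 else 0)   # one bit of the secret leaks

assert violates_noninterference(leaky, low=7, high1=0, high2=99)
```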
Iwamura, Yoshiro; Tanimoto, Jun
To investigate whether social dilemma structures can be found in a realistic traffic flow reproduced by a model, we built a new microscopic model in which an intentional driver may try lane-changing to go in front of other vehicles and may hamper others’ lane-changes. Our model consists of two parts: a cellular automaton emulating real traffic flow, and evolutionary game theory implementing a driver’s decision-making process. Numerical results reveal that a social dilemma like the multi-player chicken game or prisoner’s dilemma game emerges depending on the traffic phase. This finding implies that a social dilemma, which has so far been investigated by applied mathematics, hides behind a traffic flow, which has been explored by fluid dynamics. Highlight - Complex system of traffic flow with consideration of the driver’s decision-making process is concerned. - A new model dovetailing cellular automata with game theory is established. - Statistical results from numerical simulations reveal a social dilemma structure underlying traffic flow. - The social dilemma is triggered by a driver’s egocentric actions of lane-changing and hampering others’ lane-changes.
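The cellular-automaton half of such a model can be sketched with a minimal single-lane Nagel-Schreckenberg-style update. This is an assumption-laden illustration only: the paper's model is more elaborate, with multiple lanes and game-theoretic lane-changing decisions layered on top.

```python
import random

def nasch_step(pos, vel, road_len, vmax=5, p_slow=0.0, rng=random):
    """One Nagel-Schreckenberg update: accelerate, brake to the gap,
    randomize, move. Single lane, periodic boundary, parallel update."""
    order = sorted(range(len(pos)), key=lambda i: pos[i])
    new_pos, new_vel = list(pos), list(vel)
    for k, i in enumerate(order):
        ahead = pos[order[(k + 1) % len(order)]]     # next car downstream
        gap = (ahead - pos[i] - 1) % road_len        # empty cells in front
        v = min(vel[i] + 1, vmax, gap)               # accelerate, then brake
        if rng.random() < p_slow:                    # random slowdown
            v = max(v - 1, 0)
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % road_len
    return new_pos, new_vel

pos, vel = [0, 10, 20], [0, 0, 0]
for _ in range(10):
    pos, vel = nasch_step(pos, vel, road_len=30)
# with ample gaps and no random braking, all cars reach vmax
assert vel == [5, 5, 5]
```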
Samir Yahya Umri
This research reviews the most important obstacles facing the flow of electronic information in the Arab world, illustrating the effect of each obstacle on the flow of information and giving recommendations to overcome it. The research identifies four main obstacles: the number of internet users and the weakness of the infrastructure in the Arab world; spam e-mails; the bugs and vulnerabilities in operating systems that allow hackers to attack information systems; and the spread of electronic pornography on the internet.
Khushk, Abdul Rauf; Li, Xiaozhong
Cost-effective cloud systems typically combine secure private clouds with less secure public clouds. The need to locate applications in different clouds poses a security risk to the information flow of the entire system. This study addresses this risk by assigning security levels of a given lattice to the entities of a federated cloud system. A dynamic, flow-sensitive security model featuring Bell-LaPadula procedures is explored that tracks and authenticates secure information flow in federated clouds. Additionally, a Petri net model is considered as a case study to represent the proposed system and further validate its performance.
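The Bell-LaPadula rules the model builds on can be sketched over a small, totally ordered lattice of levels (a generic illustration, not the paper's federated-cloud system): reads are allowed only at or below the subject's level ("no read up") and writes only at or above it ("no write down").

```python
# Bell-LaPadula style check over a totally ordered lattice of levels;
# the level names below are illustrative, not from the paper.

LEVELS = {"public": 0, "confidential": 1, "secret": 2}

def can_read(subject, obj):
    # simple-security property: read only at or below your own level
    return LEVELS[subject] >= LEVELS[obj]

def can_write(subject, obj):
    # *-property: write only at or above your own level
    return LEVELS[subject] <= LEVELS[obj]

assert can_read("secret", "public")          # read down: allowed
assert not can_read("public", "secret")      # read up: denied
assert can_write("public", "secret")         # write up: allowed
assert not can_write("secret", "public")     # write down: denied
```

Together the two rules guarantee that information only ever flows upward in the lattice.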
Jensen, Uffe Birk; Owens, David; Pedersen, Søren
Zinc salt-based fixation (ZBF) has proved advantageous in histochemical analyses conducted on intact tissues but has not been exploited in flow cytometry procedures that focus on quantitative analysis of individual cells. Here, we show that ZBF performs equally well to paraformaldehyde …, allowing subsequent quantitative PCR analysis or labeling for incorporation of the thymidine analog EdU following surface and intracellular epitope staining. Finally, ZBF treatment allows for long-term storage of labeled cells with little change in these parameters. Thus, we present a protocol for zinc salt fixation of cells that allows for the simultaneous analysis of DNA and intracellular and cell surface proteins by flow cytometry.
Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
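The run-time half of such a hybrid analysis amounts to shadowing each value with a taint mark and propagating it through operations. A toy sketch in Python (the paper instruments a subset of C via Frama-C and handles pointers and arrays precisely; all names here are illustrative):

```python
# Dynamic taint tracking sketch: shadow each value with a taint bit and
# propagate it through arithmetic. Toy interpreter-level illustration,
# not the paper's Frama-C-based instrumentation of C.

class Tainted:
    def __init__(self, value, tainted=False):
        self.value, self.tainted = value, tainted
    def __add__(self, other):
        # the result is tainted if either operand is
        return Tainted(self.value + other.value,
                       self.tainted or other.tainted)

secret = Tainted(42, tainted=True)
public = Tainted(7)
arr = [public, secret]          # element-wise taint, as for pointer arrays
total = arr[0] + arr[1]
assert total.value == 49 and total.tainted

def policy_allows_output(x):
    # a flow policy: tainted data must not reach a public sink
    return not x.tainted

assert not policy_allows_output(total)   # the alarm the monitor raises
```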
Maxwell, Terrence A.
Summarizes some of the activities the United States government has undertaken to control the dissemination of information since 2001. It also explores, through a conceptual model of information flows, potential impacts and discontinuities between policy purposes and outcomes. (AEF)
Rowe, Neil C; Sjoberg, Eric; Adams, Paige
... of it. We are developing data mining techniques to track the flow of such information by comparing important information-security Web sites, alert messages, and strings in packets to find similar words and sentences...
Building on the recent finding that agency experiences do not merely rely on sensorimotor information but also on cognitive cues, this exploratory study uses electroencephalographic recordings to examine functional connectivity during agency inference processing in a setting where action and outcome are independent. Participants completed a computerized task in which they pressed a button followed by one of two color words (red or blue) and rated their experienced agency over producing the color. Before executing the action, a matching or mismatching color word was pre-activated by explicitly instructing participants to produce the color (goal condition) or by briefly presenting the color word (prime condition). In both conditions, experienced agency was higher in matching versus mismatching trials. Furthermore, increased electroencephalography (EEG)-based connectivity strength was observed between parietal and frontal nodes and within the (pre)frontal cortex when color-outcomes matched with goals and participants reported high agency. This pattern of increased connectivity was not identified in trials where outcomes were pre-activated through primes. These results suggest that different connections are involved in the experience and in the loss of agency, as well as in inferences of agency resulting from different types of pre-activation. Moreover, the findings provide novel support for the involvement of a fronto-parietal network in agency inferences.
Haikka, P.; McEndoo, S.; Maniscalco, S.; De Chiara, G.; Palma, G. M.
We study quantum information flow in a model comprised of a trapped impurity qubit immersed in a Bose-Einstein-condensed reservoir. We demonstrate how information flux between the qubit and the condensate can be manipulated by engineering the ultracold reservoir within experimentally realistic limits. We show that this system undergoes a transition from Markovian to non-Markovian dynamics, which can be controlled by changing key parameters such as the condensate scattering length. In this way, one can realize a quantum simulator of both Markovian and non-Markovian open quantum systems, the latter ones being characterized by a reverse flow of information from the background gas (reservoir) to the impurity (system).
Causal paradoxes arising in the tachyon theory have been systematically solved by using the reinterpretation principle, as a consequence of which cause and effect no longer retain an absolute meaning. However, even in the tachyon theory, a cause is always seen to chronologically precede its effect, but this is obtained at the price of allowing cause and effect to be interchanged when required. A recent result has shown that this interchangeability of cause and effect must not be unlimited if severe paradoxes are to be avoided. This partial recovery of the classical concept of causality has been expressed by the conjecture that transcendent tachyons cannot be absorbed by a tachyon detector. In this paper the directional properties of the flow of information between two observers in relative motion, and its consequences for the logical self-consistency of the theory of superluminal particles, are analyzed. It is shown that the above conjecture does not provide a satisfactory solution to the problem because it implies that tachyons of any speed cannot be intercepted by the same detector. (author)
Mesároš, P.; Mandičák, T.
The article describes options for the management of material flows in the construction process. Management and resource planning is one of the key factors influencing the effectiveness of a construction project, and it is very difficult to set these flows correctly. The current period offers several options and tools for doing so; information systems and their modules can be used for the management of materials in the construction process.
Danielly Oliveira Inomata
Objective. This paper presents and discusses the concepts, contexts and applications involving information flows in organizations. Method. Systematic review, followed by a bibliometric analysis and system analysis. The systematic review aimed to search for, evaluate and review evidence about the research topic, and comprised the following steps: (1) definition of keywords, (2) systematic review, (3) exploration and analysis of articles and (4) comparison and consolidation of results. Results. The bibliometric analysis established the relevance of the articles, identifying the authors, publication dates, citation indices, and the keywords with the highest occurrence. Conclusions. The survey results confirm the emphasis on information in the knowledge management process; in recent years the emphasis has shifted to networks, i.e., studies are turning to the operationalization and analysis of information flows in networks. The literature demonstrates the relationship of information flow with its management, applied to different organizational contexts, and points to new trends in information science such as the study and analysis of information flows in networks.
Ranjbari, Leyla; Bahar, Arifah; Aziz, Zainal Abdul
The paper considers natural-gas storage valuation based on the information-based pricing framework of Brody-Hughston-Macrina (BHM). As opposed to many studies in which the associated filtration is considered pre-specified, this work constructs the filtration in terms of the information provided to the market. The value of the storage is given by the sum of the discounted expectations of the cash flows under the risk-neutral measure, conditional on the constructed filtration with the Brownian bridge noise term. In order to model the flow of information about the cash flows, we assume the existence of a fixed pricing kernel with a liquid, homogeneous, incomplete and arbitrage-free market.
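In the BHM framework, market information about a cash flow X paid at time T is typically modelled by a process of the form ξ_t = σ t X + β_t, where β is a Brownian bridge vanishing at 0 and at T. A minimal simulation sketch (illustrative parameters, not the paper's calibration):

```python
import random

def information_process(x, sigma, T, n, rng):
    """Sample the BHM-style market information process
    xi_t = sigma * t * x + beta_t, where beta is a Brownian bridge on
    [0, T], built from a Brownian path W as beta_t = W_t - (t/T) * W_T.
    Illustrative sketch; the paper conditions storage cash flows on the
    filtration generated by such a process."""
    dt = T / n
    w = [0.0]
    for _ in range(n):
        w.append(w[-1] + rng.gauss(0.0, dt ** 0.5))
    xi = []
    for i in range(n + 1):
        t = i * dt
        bridge = w[i] - (t / T) * w[n]   # pinned to 0 at both ends
        xi.append(sigma * t * x + bridge)
    return xi

rng = random.Random(1)
xi = information_process(x=1.0, sigma=0.5, T=2.0, n=200, rng=rng)
# the noise vanishes at both ends: xi_0 = 0 and xi_T = sigma * T * x
assert abs(xi[0]) < 1e-9
assert abs(xi[-1] - 0.5 * 2.0 * 1.0) < 1e-9
```

At t = T the bridge collapses and the signal σTX is revealed exactly, which is what makes conditioning on this filtration tractable.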
Liu, D.; Guo, S.; Lian, Y.
Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, were modified by climate change and human activities, and conventional frequency analysis that does not consider the non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. The climate-informed low flow frequency analysis and the divided and combined method are employed to account for the impacts of related climate variables and the non-stationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions were tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that the climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which accounts for the dynamic features needed for reservoir management and provides more accurate and reliable designs for infrastructure and water supply.
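Once a GEV fit is in hand, design levels follow from the GEV quantile (return-level) formula. A sketch with hypothetical parameters, not the Yichang fit; note that for low flows the distribution is typically fitted to annual minima (e.g. via negated data):

```python
import math

def gev_quantile(p, mu, sigma, xi):
    """Inverse CDF of the GEV distribution,
    F(z) = exp(-(1 + xi*(z - mu)/sigma)**(-1/xi)) for xi != 0."""
    if abs(xi) < 1e-12:                       # Gumbel limit
        return mu - sigma * math.log(-math.log(p))
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)

# example parameters (hypothetical, not the fitted Yichang values):
mu, sigma, xi = 3500.0, 400.0, -0.1
# level with non-exceedance probability p = 0.01 (a "1-in-100" event)
z = gev_quantile(0.01, mu, sigma, xi)
assert z < mu        # the 1% quantile lies below the location parameter
# round trip through the CDF recovers p
F = math.exp(-(1.0 + xi * (z - mu) / sigma) ** (-1.0 / xi))
assert abs(F - 0.01) < 1e-9
```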
Li, Xin; Gao, Deli; Chen, Xuyue
The hydraulic extended-reach limit (HERL) model of a horizontal extended-reach well (ERW) can predict the maximum measured depth (MMD) of the well. The HERL refers to the well's MMD when drilling fluid cannot be normally circulated by the drilling pump. The previous model analyzed two constraint conditions: drilling pump rated pressure and rated power. However, effects of the allowable range of drilling fluid flow rate (Q_min ≤ Q ≤ Q_max) were not considered. In this study, three cases of the HERL model are proposed according to the relationship between the allowable range of drilling fluid flow rate and the rated flow rate of the drilling pump (Q_r). A horizontal ERW is analyzed to predict its HERL, especially its horizontal-section limit (L_h). Results show that when Q_min ≤ Q_r ≤ Q_max (Case I), L_h depends both on the horizontal-section limit based on rated pump pressure (L_h1) and on the horizontal-section limit based on rated pump power (L_h2); when Q_min … drilling fluid flow rate, while L_h2 keeps decreasing as the drilling fluid flow rate increases. The comprehensive model provides a more accurate prediction of the HERL.
Damian, Daniel; Danvy, Olivier
We build on Danvy and Nielsen's first-order program transformation into continuation-passing style (CPS) to design a new CPS transformation of flow information that is simpler and more efficient than what has been presented in previous work. The key to simplicity and efficiency is that our CPS tr...
Lluch Lafuente, Alberto; Nielson, Flemming; Nielson, Hanne Riis
system for statically checking if a system specification ensures an information flow policy. The approach is illustrated with two archetypal examples of distributed and parallel computing systems: a protocol for an identity-secured data providing service and a parallel MapReduce computation....
Pontes Soares Rocha, B.; Conti, M.; Etalle, S.; Crispo, B.
There are different paradigms for enforcing information flow and declassification policies. These approaches can be divided into static analyzers and runtime enforcers. Each class has its own strengths and weaknesses, each being able to enforce a different set of policies. In this paper we introduce
Al Rahahleh, Naseem; Bhatti, M. Ishaq; Adeinat, Iman
Bhatti and Nguyen (2012) used the copula approach to measure the tail dependence between a number of international markets. They observed that some country pairs exhibit only left-tail dependence whereas others show only right-tail dependence. However, the flow of information from uni-dimensional (one-tail) to bi-dimensional (two-tail) between various markets was not accounted for. In this study, we address the flow of information of this nature by using the dynamic conditional correlation (DCC-GARCH) model. More specifically, we use various versions of the DCC model to explain the nexus between the information flows of international equity markets and the stochastic forward vs. backward dynamics of financial markets, based on data for a 15-year period comprising 3,782 observations. We observed that the information flows between the US and Hong Kong markets and between the US and Australian markets are bi-directional. We also observed that the DCC model captures a wider co-movement structure and inter-connectedness compared to the symmetric Joe-Clayton copula.
Noda, Masaru; Sato, Shoko; Ueda, Tadashi; Tsuchi, Hiroyuki; Koike, Akihisa
The Information Flow Diagram for Literature Survey (IFDLS) has been developed to manage information and procedure in the literature survey phase of the PIA selection process. It is a tool utilizing information technology, which can organize, analyze, and evaluate information from the literature survey and manage the process systematically. IFDLS is able to show the flow of information and data, and the history of information management processing. Information coverage and quality are not homogeneous throughout the country and, in some areas, there may not even be sufficient data available to reach a judgment on conformity with the site-specific evaluation factors (SSEF). Literature surveys can only be conducted on a volunteer area before it is nominated as a PIA. However, the absence of information on any factor mentioned will not constitute disqualification of the area. On the contrary, an attempt will be made to compare these sites with analogous areas in Japan, to assemble sufficient data and consequently decide whether to proceed further. The application of IFDLS to the literature survey phase of the PIA selection process is being proposed. The concept, construction, application and evolution of IFDLS towards the application phase on a trial basis are discussed. (authors)
Wei, Quan; Courtney, Karen L
Long-term care (LTC), residential care requiring 24-hour nursing services, plays an important role in the health care service delivery system. The purpose of this study was to identify the needed clinical information and information flow to support LTC Registered Nurses (RNs) in care collaboration and clinical decision making. This descriptive qualitative study combines direct observations and semistructured interviews, conducted at Alberta's LTC facilities between May 2014 and August 2015. The constant comparative method (CCM) of joint coding was used for data analysis. Nine RNs from six LTC facilities participated in the study. The RN practice environment includes two essential RN information management aspects: information resources and information spaces. Ten commonly used information resources by RNs included: (1) RN-personal notes; (2) facility-specific templates/forms; (3) nursing processes/tasks; (4) paper-based resident profile; (5) daily care plans; (6) RN-notebooks; (7) medication administration records (MARs); (8) reporting software application (RAI-MDS); (9) people (care providers); and (10) references (i.e., books). Nurses used a combination of shared information spaces, such as the Nurses Station or RN-notebook, and personal information spaces, such as personal notebooks or "sticky" notes. Four essential RN information management functions were identified: collection, classification, storage, and distribution. Six sets of information were necessary to perform RN care tasks and communication, including: (1) admission, discharge, and transfer (ADT); (2) assessment; (3) care plan; (4) intervention (with two subsets: medication and care procedure); (5) report; and (6) reference. Based on the RN information management system requirements, a graphic information flow model was constructed. This baseline study identified key components of a current LTC nursing information management system. The information flow model may assist health information
I consider a stochastic model of multi-agent communication in a regular network. The model describes how dispersed animals exchange information. Each agent can initiate and transfer a signal to its nearest neighbors, who may pass it farther. For an external observer of busy networks, signaling activity may appear random, even though information flow actually thrives. Only when signal initiation and transfer are at low levels do spatiotemporal autocorrelations emerge, as clumped signaling activity in space and pink-noise time series. Under such conditions, the costs of signaling are moderate, but the signaler can reach a large audience. I propose that real-world networks of dispersed signalers-receivers may self-organize into this state, and that the flow of information maintains their integrity.
Shi, Jia-Dong; Wang, Dong; Ye, Liu
In this paper, the dynamics of entanglement is investigated in the presence of a noisy environment. We reveal its revival behavior and probe the mechanisms of this behavior via an information-theoretic approach. By analyzing the correlation distribution and the information flow within the composite system including the qubit subsystem and a noisy environment, it has been found that the subsystem-environment coupling can induce the quasi-periodic entanglement revival. Furthermore, the dynamical relationship among tripartite correlations, bipartite entanglement and local state information is explored, which provides a new insight into the non-Markovian mechanisms during the evolution.
The paper examines the relationships between two different approaches to planning processes (participative and non-participative) and information flows within management control in companies. It augments the existing theoretical and empirical research by coupling management control and management information with participative planning, not only in the operational but also in the strategic perspective. The results presented in the paper stem from two consecutive studies, conducted between November 2010 and January 2012 and between November 2013 and January 2014. The studies comprised 397 and 179 Polish companies, respectively. The authors formulated two hypotheses linking participative planning with upward and downward management information flows. The paper employed a quantitative approach, using Spearman rank correlation analysis and hierarchical clustering with the Ward method, which enabled comparative analyses both across the groups of companies included in the research samples and over time. The results obtained showed the positive influence of participative planning on both upward and downward information flows in enterprises. In particular, participative planning reduced information imbalances between the top (management) and lower (employees of functional departments) tiers of organisational structures.
Oka, Mizuki; Ikegami, Takashi
Social networking services (e.g., Twitter, Facebook) are now major sources of World Wide Web (called "Web") dynamics, together with Web search services (e.g., Google). These two types of Web services mutually influence each other but generate different dynamics. In this paper, we distinguish two modes of Web dynamics: the reactive mode and the default mode. It is assumed that Twitter messages (called "tweets") and Google search queries react to significant social movements and events, but they also demonstrate signs of becoming self-activated, thereby forming a baseline Web activity. We define the former as the reactive mode and the latter as the default mode of the Web. In this paper, we investigate these reactive and default modes of the Web's dynamics using transfer entropy (TE). The amount of information transferred between a time series of 1,000 frequent keywords in Twitter and the same keywords in Google queries is investigated across an 11-month time period. Study of the information flow on Google and Twitter revealed that information is generally transferred from Twitter to Google, indicating that Twitter time series carry some preceding information about Google time series. We also studied the information flow among different Twitter keyword time series by taking keywords as nodes and flow directions as edges of a network. An analysis of this network revealed that frequent keywords tend to become an information source and infrequent keywords tend to become a sink for other keywords. Based on these findings, we hypothesize that frequent keywords form the Web's default mode, which becomes an information source for infrequent keywords that generally form the Web's reactive mode. We also found that the Web consists of different time resolutions with respect to TE among Twitter keywords, which will be another focal point of this paper.
Human Resources Division
HR Division wishes to clarify to members of the personnel that the allowance for a dependent child continues to be paid during all training courses ('stages'), apprenticeships, 'contrats de qualification', sandwich courses or other courses of similar nature. Any payment received for these training courses, including apprenticeships, is however deducted from the amount reimbursable as school fees. HR Division would also like to draw the attention of members of the personnel to the fact that any contract of employment will lead to the suppression of the child allowance and of the right to reimbursement of school fees.
Fountas, S.; Wulfsohn, Dvora-Laiô; Blackmore, B.S.
A participative methodology was developed in which farm managers decomposed their process of decision making in Precision Agriculture (PA) into brief decision statements along with associated information requirements. The methodology was first developed on a university research farm in Denmark and further revised during testing on a number of research and commercial farms in Indiana, USA. Twenty-one decision analysis factors were identified to characterise a farm manager's decision-making process. Then a general data flow diagram (DFD) was constructed that describes the information flows "from data to decision". Illustrative examples of the model in the form of DFDs are presented for a strategic and an operational decision. The model was validated for a range of decisions related to operations by three university farm managers and by five commercial farmers practicing PA for cereal, corn and soybean...
Dobyns, York [PEAR, Princeton University, Princeton, NJ 08544-5263 (United States); Atmanspacher, Harald [Institut fuer Grenzgebiete der Psychologie und Psychohygiene, Wilhelmstr. 3a, 79098 Freiburg (Germany)]
Weakly interacting lattices of coupled maps can be modeled as ordinary coupled map lattices separated from each other by boundary regions with small coupling parameters. We demonstrate that such weakly interacting lattices can nevertheless have unexpected and striking effects on each other. Under specific conditions, particular stability properties of the lattices are significantly influenced by their weak mutual interaction. This observation is tantamount to an efficacious information flow across the boundary.
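A coupled map lattice of the kind studied here can be sketched with diffusively coupled logistic maps; weak inter-lattice interaction would correspond to using a much smaller coupling parameter on the boundary sites. The parameters below are illustrative, not taken from the paper.

```python
# Minimal coupled map lattice: diffusively coupled logistic maps with
# periodic boundary conditions. A weakly interacting pair of lattices
# would use the same rule with a small eps on the boundary sites.

def cml_step(x, eps, r=3.9):
    """One synchronous update of the lattice state x with coupling eps."""
    f = lambda u: r * u * (1.0 - u)       # local logistic dynamics
    n = len(x)
    return [(1.0 - eps) * f(x[i])
            + 0.5 * eps * (f(x[(i - 1) % n]) + f(x[(i + 1) % n]))
            for i in range(n)]

x = [0.1, 0.5, 0.3, 0.7, 0.2, 0.6]
for _ in range(100):
    x = cml_step(x, eps=0.1)
# for r <= 4 the dynamics stays confined to the unit interval
assert all(0.0 <= v <= 1.0 for v in x)
```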
Breuer, Heinz-Peter; Amato, Giulio; Vacchini, Bassano
Mixing dynamical maps describing open quantum systems can lead from Markovian to non-Markovian processes. Being surprising and counter-intuitive, this result has been used as an argument against the characterization of non-Markovianity in terms of information exchange. Here, we demonstrate that, quite the contrary, mixing can be understood in a natural way which is fully consistent with existing theories of memory effects. In particular, we show how mixing-induced non-Markovianity can be interpreted in terms of the distinguishability of quantum states, system-environment correlations and the information flow between system and environment.
Hlinka, Jaroslav; Jajcay, Nikola; Hartman, David; Paluš, Milan
Vol. 27, No. 3 (2017), article No. 035811. ISSN 1054-1500 R&D Projects: GA ČR GCP103/11/J068; GA MŠk LH14001 Institutional support: RVO:67985807 Keywords: directed network * causal network * Granger causality * climate network * information flow * temperature network Subject RIV: IN - Informatics, Computer Science OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 2.283, year: 2016
Schittler Neves, Fabio; Martim Schubert, Benno; Erichsen, Rubem, Jr.
Layered neural networks are feedforward structures that yield robust parallel and distributed pattern recognition. Even though much attention has been paid to pattern retrieval properties in such systems, many aspects of their dynamics are not yet well characterized or understood. In this work we study, at different temperatures, the memory activity and information flows through layered networks in which the elements are the simplest binary odd non-monotonic function. Our results show that, considering a standard Hebbian learning approach, the network information content has its maximum always at the monotonic limit, even though the maximum memory capacity can be found at non-monotonic values for small enough temperatures. Furthermore, we show that such systems exhibit rich macroscopic dynamics, including not only fixed point solutions of its iterative map, but also cyclic and chaotic attractors that also carry information.
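As a simplified illustration of the layered-network setup, one feedforward retrieval step with Hebbian couplings between paired random patterns can be sketched as below. This is a toy sketch only: the paper's non-monotonic units and finite temperature are not modeled, and the monotonic sign function plus all sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3                          # neurons per layer, stored patterns
xi_in = rng.choice([-1, 1], (p, n))    # patterns on layer l
xi_out = rng.choice([-1, 1], (p, n))   # paired patterns on layer l+1

# Hebbian inter-layer couplings (standard layered-network prescription)
w = xi_out.T @ xi_in / n

s = xi_in[0]                # present the first stored pattern
h = w @ s                   # local fields on the next layer
s_next = np.sign(h)         # monotonic (sign) transfer function

overlap = s_next @ xi_out[0] / n
print(overlap)              # near 1 when retrieval succeeds
```

With few patterns relative to the layer size, the crosstalk term is small and the next layer reproduces the paired pattern almost exactly; non-monotonic units would replace `np.sign` with an odd non-monotonic transfer function.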
Jing, Wei; Guo, Daqing; Zhang, Yunxiang; Guo, Fengru; Valdés-Sosa, Pedro A; Xia, Yang; Yao, Dezhong
Functional MRI (fMRI) studies have demonstrated that the rodent brain shows a default mode network (DMN) activity similar to that in humans, offering a potential preclinical model both for physiological and pathophysiological studies. However, the neuronal mechanism underlying rodent DMN remains poorly understood. Here, we used electrophysiological data to analyze the power spectrum and estimate the directed phase transfer entropy (dPTE) within rat DMN across three vigilance states: wakeful rest (WR), slow-wave sleep (SWS), and rapid-eye-movement sleep (REMS). We observed decreased gamma powers during SWS compared with WR in most of the DMN regions. Increased gamma powers were found in prelimbic cortex, cingulate cortex, and hippocampus during REMS compared with WR, whereas retrosplenial cortex showed a reverse trend. These changed gamma powers are in line with the local metabolic variation of homologous brain regions in humans. In the analysis of directional interactions, we observed well-organized anterior-to-posterior patterns of information flow in the delta band, while opposite patterns of posterior-to-anterior flow were found in the theta band. These frequency-specific opposite patterns were only observed in WR and REMS. Additionally, most of the information senders in the delta band were also the receivers in the theta band, and vice versa. Our results provide electrophysiological evidence that rat DMN is similar to its human counterpart, and there is a frequency-dependent reentry loop of anterior-posterior information flow within rat DMN, which may offer a mechanism for functional integration, supporting conscious awareness.
McCarthy, J. Daniel; Barnes, Lianne N.; Alvarez, Bryan D.; Caplovitz, Gideon Paul
In grapheme-color synesthesia, graphemes (e.g., numbers or letters) evoke color experiences. It is generally reported that the opposite is not true: colors will not generate experiences of graphemes or their associated information. However, recent research has provided evidence that colors can implicitly elicit symbolic representations of associated graphemes. Here, we examine if these representations can be cognitively accessed. Using a mathematical verification task replacing graphemes with color patches, we find that synesthetes can verify such problems with colors as accurately as with graphemes. Doing so, however, takes time: ~250ms per color. Moreover, we find minimal reaction time switch-costs for switching between computing with graphemes and colors. This demonstrates that given specific task demands, synesthetes can cognitively access numerical information elicited by physical colors, and they do so as accurately as with graphemes. We discuss these results in the context of possible cognitive strategies used to access the information. PMID:24100131
Piras, Vincent; Tomita, Masaru; Selvarajoo, Kumar
The central dogma of molecular biology has come under scrutiny in recent years. Here, we reviewed high-throughput mRNA and protein expression data of Escherichia coli, Saccharomyces cerevisiae, and several mammalian cells. At both single cell and population scales, the statistical comparisons between the entire transcriptomes and proteomes show clear correlation structures. In contrast, the pair-wise correlations of single transcripts to proteins show nullity. These data suggest that the organizing structure guiding cellular processes is observed at omics-wide scale, and not at single molecule level. The central dogma, thus, globally emerges as an average integrated flow of cellular information.
Parraguez, Pedro; Eppinger, Steven D.; Maier, Anja
The pattern of information flow through the network of interdependent design activities is thought to be an important determinant of engineering design process results. A previously unexplored aspect of such patterns relates to the temporal dynamics of information transfer between activities...... design process and thus support theory-building toward the evolution of information flows through systems engineering stages. Implications include guidance on how to analyze and predict information flows as well as better planning of information flows in engineering design projects according...
This parliamentary report first proposes a presentation of the European carbon emission allowances market, or emission trading scheme (ETS), by recalling the context of its creation, describing its operation (a trading platform to reduce CO2 emissions in Europe), and commenting on the criticisms generally made of this market. Then, the authors present and comment on proposals for reform, notably the creation of a stability reserve fund and a structural reform of the market. The authors then explain why and how the ETS reform must go further if the European Union wants to meet the commitments defined in the Paris agreement.
Joanna Nowakowska-Grunt; Janusz Grabara
The paper presents the information flow process in the management of supply chains. The authors view information flows as a driving element of the global supply chain. They also point to the logistics aspects in the supply chain of a waste management company.
Johanna A. Badenhorst; Claus Maurer; Tersia Brevis-Landsberg
Member organisations in a supply chain are dependent on each other to provide material, services and information to perform optimally in the supply chain. Efficient, unrestricted information flow is needed in supply chains to function properly. Information flow is thus an element of supply chain management that needs to be managed. Yet, no indication could be found in supply chain management literature of the measurement of information flow efficiency. Hence, the aim of this article is to explore the measurement of information flow efficiency in supply chain management (SCM) and exploratively develop possible measures (indicators and associated metrics) to measure the efficiency of information flow. In this research the theory of information and related concepts, the basic notions of information systems and the models of business performance measurement were explored. Based on information flow theory and information flow characteristics a research instrument was developed. It was used in a survey to seek inputs from supply chain managers as to the usefulness of characteristics as indicators and metrics for the measurement of information flow efficiency in a supply chain. The main contribution of the study is the development of a conceptual framework of indicators and metrics that may be used to evaluate the efficiency of information flows in supply chains. The results of this study can be used as a basis for further studies to validate the instrument for measuring information flow efficiency and to develop scales to actually measure information flow efficiency.
Glasman, Naftaly S.
First part of an article examining the content of information flow; the amount of information released; the mechanism of the flow; the factors affecting the content, amount, and mechanism; and the corollaries of information flow and the characteristics of the school system. Includes the questions put to the teachers. (Author/IRT)
Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L
Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.
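The tile-splitting-with-empty-filtering idea behind NDPI-Splitter can be illustrated with a generic sketch. This is not NDPI-Splitter's code: the array representation, tile size, and the all-white "empty" criterion are illustrative assumptions.

```python
import numpy as np

def split_tiles(img, tile):
    # Split a large image array into tile-sized sections, skipping
    # tiles that are entirely background, analogous (in simplified
    # form) to NDPI-Splitter's empty-image filter.
    h, w = img.shape[:2]
    tiles = []
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            t = img[y:y + tile, x:x + tile]
            if t.size and not (t == 255).all():   # drop all-white tiles
                tiles.append(((y, x), t))
    return tiles

img = np.full((100, 100), 255, dtype=np.uint8)  # white slide background
img[10:40, 60:90] = 30                           # a dark tissue region
kept = split_tiles(img, 50)
print(len(kept))   # only the tile containing tissue survives
```

In practice the sections would be written out as TIFF files for downstream analysis software; the filtering step keeps that output manageable for very large digitised slides.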
Boczkowski, Lucas; Natale, Emanuele; Feinerman, Ofer; Korman, Amos
Biological systems can share and collectively process information to yield emergent effects, despite inherent noise in communication. While man-made systems often employ intricate structural solutions to overcome noise, the structure of many biological systems is more amorphous. It is not well understood how communication noise may affect the computational repertoire of such groups. To approach this question we consider the basic collective task of rumor spreading, in which information from few knowledgeable sources must reliably flow into the rest of the population. We study the effect of communication noise on the ability of groups that lack stable structures to efficiently solve this task. We present an impossibility result which strongly restricts reliable rumor spreading in such groups. Namely, we prove that, in the presence of even moderate levels of noise that affect all facets of the communication, no scheme can significantly outperform the trivial one in which agents have to wait until directly interacting with the sources, a process which requires linear time in the population size. Our results imply that in order to achieve efficient rumor spread a system must exhibit either some degree of structural stability or, alternatively, some facet of the communication which is immune to noise. We then corroborate this claim by providing new analyses of experimental data regarding recruitment in Cataglyphis niger desert ants. Finally, in light of our theoretical results, we discuss strategies to overcome noise in other biological systems.
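The trivial scheme mentioned above, in which agents wait until they directly interact with a source, can be sketched as a toy simulation. The population size, uniform-sampling model, and seed are illustrative assumptions, not the paper's formal model.

```python
import random

def trivial_spread(n, rng):
    # Trivial noise-immune scheme: an agent adopts the rumor only after
    # directly sampling the single knowledgeable source (agent 0), so
    # each agent's expected waiting time is linear in the population size.
    informed = {0}
    rounds = 0
    while len(informed) < n:
        rounds += 1
        for agent in range(1, n):
            if agent not in informed and rng.randrange(n) == 0:
                informed.add(agent)
    return rounds

rounds = trivial_spread(50, random.Random(42))
print(rounds)
```

Each uninformed agent succeeds per round with probability 1/n, so the round count grows roughly like n log n for the whole population; the impossibility result says noisy amorphous communication cannot do significantly better.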
Kolesa, K.; Vejvodova, I.
In 1981 the reliability information system for nuclear power plants (ISS-JE) was established. The objective of the system is to make a statistical evaluation of the operation of nuclear power plants, to obtain information on the reliability of nuclear power plant equipment, and to transmit this information to manufacturers with the aim of inducing them to take corrective measures. The HP 1000 computer with the data base system IMAGE 100 is used, which allows processing of single queries and periodical outputs. The content of periodical outputs designed for various groups of subcontractors is briefly described and trends of the further development of the system are indicated. (Ha)
Kim, Jong Hyun
Diagnosis is one of the most complex and mental resource-demanding tasks in nuclear power plants (NPPs), especially for main control room (MCR) operators. Diagnosis is a crucial part of disturbance control in NPPs, since it is a prerequisite task for initiating operating procedures. In order to design a control room feature for NPPs, three elements need to be considered: 1) the operational tasks that must be performed, 2) a model of human performance for these tasks, and 3) a model of how control room features are intended to support performance. The operational tasks define the classes of performance that must be considered. A model of human performance makes more explicit the requirements for accurate and efficient performance and reveals potential sources of error. Finally, the model of support allows the generation of specific hypotheses about how performance is facilitated in the control room. The model of support needs to be developed based on the human performance model. This paper proposes three approaches for the system design of operator support systems to aid MCR operators' diagnosis tasks in NPPs, considering the above three elements. This paper presents 1) a quantitative approach to modeling the information flow of diagnosis tasks, 2) strategy-based evaluation of information aids for diagnosis tasks, and 3) quantitative evaluation of NPP decision support systems. As an analysis of diagnosis tasks, this paper presents a method to quantify the cognitive information flow of diagnosis tasks, integrating a stage model (a qualitative approach) with information theory (a quantitative approach). The method includes: 1) constructing the information flow model, which consists of four stages based on operating procedures of NPPs; and 2) quantifying the information flow using Conant's model, a kind of information theory. Then, three experiments were conducted to evaluate the effectiveness of the proposed approach to predicting human performances, especially in
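A basic building block of such information-theoretic quantification is the plug-in mutual information between, for example, observed alarm patterns and diagnosed events. The sketch below is not Conant's full model; the category names and counts are hypothetical.

```python
import math
from collections import Counter

def mutual_information(pairs):
    # Plug-in mutual information I(X;Y) in bits from observed (x, y)
    # pairs, a building block of information-flow quantification.
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Hypothetical observations: (alarm pattern, diagnosed event) pairs.
obs = [("hi-press", "LOCA")] * 40 + [("lo-flow", "SGTR")] * 40 + \
      [("hi-press", "SGTR")] * 10 + [("lo-flow", "LOCA")] * 10

mi = mutual_information(obs)
print(round(mi, 3))   # about 0.278 bits of shared information
```

Higher mutual information between a stage's inputs and outputs means the stage transmits more of the diagnostic information; Conant's partition law further decomposes the total rate into throughput, blockage, and related components.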
Bauereiss , Thomas; Hutter , Dieter
Part 6: Information Flow Control. Motivated by typical security requirements of workflow management systems, we consider the integrated verification of both safety properties (e.g. separation of duty) and information flow security predicates of the MAKS framework (e.g. modeling confidentiality requirements). Due to the refinement paradox, enforcement of safety properties might violate possibilistic information flow properties of a system. We present an approach where s...
Behrman, Robert; Carley, Kathleen
This paper describes the Dynamic Information Flow Simulation (DIFS), an abstract model for analyzing the structure and function of intelligence support organizations and the activities of entities within...
Li Nian-Qiang; Pan Wei; Yan Lian-Shan; Luo Bin; Xu Ming-Feng; Tang Yi-Long
Symbolic transfer entropy (STE) is employed to quantify the dominant direction of information flow between two chaotic-semiconductor-laser time series. The information flow in unidirectionally and bidirectionally coupled systems was analyzed systematically. Numerical results show that the dependence relationship can be revealed if there exists any coupling between two chaotic semiconductor lasers. More importantly, in both unsynchronized and good synchronization regimes, the STE can be used to quantify the direction of information flow between the lasers, although the former case leads to a better identification. The results thus establish STE as an effective tool for quantifying the direction of information flow between chaotic-laser-based systems
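A minimal STE-style computation (ordinal-pattern symbolization followed by a plug-in transfer entropy estimate) can be sketched as follows. The toy unidirectionally coupled series and all parameters are illustrative assumptions, not the chaotic-laser model of the paper.

```python
import math
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    # Map each length-m window to the index of its ordinal
    # (rank-order) pattern, Bandt-Pompe style.
    w = np.lib.stride_tricks.sliding_window_view(np.asarray(x), m)
    ranks = w.argsort(axis=1).argsort(axis=1)
    return (ranks @ np.array([m ** i for i in range(m)])).tolist()

def transfer_entropy(sx, sy):
    # Plug-in estimate (in bits) of T(Y -> X) on symbol sequences.
    n = len(sx) - 1
    trip = Counter(zip(sx[1:], sx[:-1], sy[:-1]))
    pxx = Counter(zip(sx[1:], sx[:-1]))
    pxy = Counter(zip(sx[:-1], sy[:-1]))
    px = Counter(sx[:-1])
    return sum(c / n * math.log2(c * px[x0] / (pxx[x1, x0] * pxy[x0, y0]))
               for (x1, x0, y0), c in trip.items())

# Toy unidirectional coupling: x follows y with a one-step lag.
rng = np.random.default_rng(0)
y = rng.normal(size=5000)
x = np.empty_like(y)
x[0] = rng.normal()
x[1:] = y[:-1] + 0.1 * rng.normal(size=4999)

sx, sy = symbolize(x), symbolize(y)
te_yx = transfer_entropy(sx, sy)
te_xy = transfer_entropy(sy, sx)
print(te_yx > te_xy)   # dominant direction of information flow is y -> x
```

The asymmetry T(Y→X) > T(X→Y) identifies the driver; applied to two laser intensity time series, the same comparison yields the dominant direction of flow between the coupled lasers.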
Withall, Elizabeth; Wilson, Annabelle M; Henderson, Julie; Tonkin, Emma; Coveney, John; Meyer, Samantha B; Clark, Jacinta; McCullum, Dean; Ankeny, Rachel; Ward, Paul R
Contemporary food systems are vast and complex, creating greater distance between consumers and their food. Consequently, consumers are required to put faith in a system of which they have limited knowledge or control. Country of origin labelling (CoOL) is one mechanism that theoretically enables consumer knowledge of provenance of food products. However, this labelling system has recently come under Australian Government review and recommendations for improvements have been proposed. Consumer engagement in this process has been limited. Therefore this study sought to obtain further consumer opinion on the issue of CoOL and to identify the extent to which Australian consumers agree with Australian Government recommendations for improvements. A citizens' jury was conducted with a sample of 14 South Australian consumers to explore their perceptions on whether the CoOL system allows them to make informed food choices, as well as what changes (if any) need to be made to enable informed food choices (recommendations). Overall, jurors' perception of usefulness of CoOL, including its ability to enable consumers to make informed food choices, fluctuated throughout the Citizens' Jury. Initially, the majority of the jurors indicated that the labels allowed informed food choice, however by the end of the session the majority disagreed with this statement. Inconsistencies within jurors' opinions were observed, particularly following delivery of information from expert witnesses and jury deliberation. Jurors provided recommendations for changes to be made to CoOL, which were similar to those provided in the Australian Government inquiry. Consumers in this study engaged with the topical issue of CoOL and provided their opinions. Overall, consumers do not think that the current CoOL system in Australia enables consumers to make informed choices. Recommendations for changes, including increasing the size of the label and the label's font, and standardising its position, were made.
Golovko, V.; Mysaka, G.
This article studies topical questions of improving the data support and methods of economic analysis of cash flows in the process of optimizing a company's financial resources.
Hopkins, D S; Oswald, N; McCaffrey, K; Bressler, S; Davidson, N; Vela, L
Given the diffusion of responsibilities for gathering and reporting healthcare information in a managed care environment, California stakeholders are taking concrete steps to break the deadlock on data and information flows that has characterized the industry for some time. The California Information Exchange (CALINX) was established to facilitate the implementation of the Health Insurance Portability and Accountability Act (HIPAA) standards in California and to create trust for data exchange between trading partners, without which data exchange still will not occur. Strategic directions are set by the chief executives of key associations and organizations representing purchasers, plans, providers, and consumers. Multi-stakeholder workgroups have produced detailed data guidelines for the HIPAA standards along with rules for exchange of key data sets between trading partners. These rules address frequency, timeliness, and accuracy of data submission. Both the data guidelines and the rules have been tested in live demonstration projects, and the results of these projects have been reported to substantiate the business case for implementation. Further incentives are being built into contracts between purchasers and plans, and between plans and providers. CALINX is currently promoting widespread adoption of the data guidelines and rules for exchange with all members of the industry.
Wilson, L.; Parfitt, E. A.
Perched lava ponds are infrequent but distinctive topographic features formed during some basaltic eruptions. Two such ponds, each approximately 150 m in diameter, formed during the 1968 eruption at Napau Crater and the 1974 eruption of Mauna Ulu, both on Kilauea Volcano, Hawaii. Each one formed where a channelized, high volume flux lava flow encountered a sharp reduction of slope: the flow spread out radially and stalled, forming a well-defined terminal levee enclosing a nearly circular lava pond. We describe a model of how cooling limits the motion of lava spreading radially into a pond and compare this with the case of a channelized flow. The difference in geometry has a major effect, such that the size of a pond is a good indicator of the volume flux of the lava forming it. Lateral spreading on distal shallow slopes is a major factor limiting the lengths of lava flows.
Recami, E.; Pavsic, M.
Recently Basano (Int. J. Theor. Phys.; 16:715 (1977)), in a paper entitled 'Information Flow, Causality and the Classical Theory of Tachyons', commented on earlier work by the present authors. In answer to those comments it is pointed out that although 'Extended Relativity' seems to allow one to solve any causal paradoxes with both usual particles and tachyons, nevertheless a number of paradoxes continue to be proposed. It has already been shown by the authors that tachyons possibly do not imply any causality violations even in macro-physics, but Basano claimed that the procedure led to new, different paradoxes. It is here demonstrated that such presumed difficulties do not exist. (U.K.)
.... 100921457-0561-02] RIN 0660-XA20 Global Free Flow of Information on the Internet AGENCY: National... of comment period. SUMMARY: The Department of Commerce's Internet Policy Task Force announces that... on the global free flow of information on the Internet has been reopened and will extend until 5 p.m...
Akanyeti, Otar; Venturelli, Roberto; Visentin, Francesco; Fiorini, Paolo [Department of Computer Science, University of Verona, 37134 Verona (Italy); Chambers, Lily; Megill, William M, E-mail: email@example.com [Department of Mechanical Engineering, University of Bath, Bath BA2 7AY (United Kingdom)
In this work, we focus on biomimetic lateral line sensing in Karman vortex streets. After generating a Karman street in a controlled environment, we examine the hydrodynamic images obtained with digital particle image velocimetry (DPIV). On the grounds that positioning in the flow and interaction with the vortices govern bio-inspired underwater locomotion, we inspect the fluid in the swimming robot frame of reference. We spatially subsample the flow field obtained using DPIV to emulate the local flow around the body. In particular, we look at various sensor configurations in order to reliably identify the vortex shedding frequency, wake wavelength and downstream flow speed. Moreover, we propose methods that differentiate between being in and out of the Karman street with >70% accuracy, distinguish right from left with respect to Karman vortex street centreline (>80%) and highlight when the sensor system enters the vortex formation zone (>75%). Finally, we present a method that estimates the relative position of a sensor array with respect to the vortex formation point within 15% error margin.
Yoo, Peter E; Hagan, Maureen A; John, Sam E; Opie, Nicholas L; Ordidge, Roger J; O'Brien, Terence J; Oxley, Thomas J; Moffat, Bradford A; Wong, Yan T
Performing voluntary movements involves many regions of the brain, but it is unknown how they work together to plan and execute specific movements. We recorded high-resolution ultra-high-field blood-oxygen-level-dependent signal during a cued ankle-dorsiflexion task. The spatiotemporal dynamics and the patterns of task-relevant information flow across the dorsal motor network were investigated. We show that task-relevant information appears and decays earlier in the higher order areas of the dorsal motor network than in the primary motor cortex. Furthermore, the results show that task-relevant information is encoded in general initially, and selective goals are subsequently encoded in specific subregions across the network. Importantly, the patterns of recurrent information flow across the network vary across different subregions depending on the goal. Recurrent information flow was observed across all higher order areas of the dorsal motor network in the subregions encoding for the current goal. In contrast, only the top-down information flow from the supplementary motor cortex to the frontoparietal regions, with weakened recurrent information flow between the frontoparietal regions and bottom-up information flow from the frontoparietal regions to the supplementary motor cortex, were observed in the subregions encoding for the opposing goal. We conclude that selective motor goal encoding and execution rely on goal-dependent differences in subregional recurrent information flow patterns across the long-range dorsal motor network areas that exhibit graded functional specialization. © 2018 Wiley Periodicals, Inc.
Acharya, Viral; DeMarzo, Peter; Kremer, Ilan
We consider the release of information by a firm when the manager has discretion regarding the timing of its release. While it is well known that firms appear to delay the release of bad news, we examine how external information about the state of the economy (or the industry) affects this decision. We develop a dynamic model of strategic disclosure in which a firm may privately receive information at a time that is random (and independent of the state of the economy). Because investors are u...
The Joint Maritime Command Information System (JMCIS) provides a common operating environment for Naval tactical decision aids that currently operates two distinct system high enclaves, one at SECRET/GENSER and one at TOP SECRET/SCI...
Abstract. We point out that controlled quantum interference corresponds to measurement in an incomplete basis and implies a nonlocal transfer of classical information. A test of whether such a generalized measurement is permissible in quantum theory is presented.
In this paper, we propose novel methods for measuring depth of anesthesia (DOA) by quantifying dominant information flow in multichannel EEGs. Conventional methods mainly use a few EEG channels independently, and most multichannel EEG based studies are limited to specific regions of the brain. Therefore the function of the cerebral cortex over wide brain regions is hardly reflected in DOA measurement. Here, DOA is measured by the quantification of dominant information flow obtained from principle bipartition. Three bipartitioning methods are used to detect the dominant information flow in entire EEG channels, and the dominant information flow is quantified by calculating information entropy. High correlation between the proposed measures and the plasma concentration of propofol is confirmed from the experimental results of clinical data in 39 subjects. To illustrate the performance of the proposed methods more easily we present the results for multichannel EEG on a two-dimensional (2D) brain map.
Cosman, Joshua D.; Arita, Jason T.; Ianni, Julianna D.; Woodman, Geoffrey F.
The temporal relationship between different stages of cognitive processing is long-debated. This debate is ongoing, primarily because it is often difficult to measure the time course of multiple cognitive processes simultaneously. We employed a manipulation that allowed us to isolate ERP components related to perceptual processing, working memory, and response preparation, and then examined the temporal relationship between these components while observers performed a visual search task. We f...
Diederick C. Niehorster
We investigated what roles global spatial frequency, surface structure, and foreground motion play in heading perception during simulated rotation from optic flow. The display (110°H × 94°V) simulated walking on a straight path over a ground plane (depth range: 1.4–50 m) at 2 m/s while fixating a target off to one side (mean R/T ratios: ±1, ±2, ±3) under six display conditions. Four displays consisted of nonexpanding dots that were distributed so as to manipulate the amount of foreground motion and the presence of surface structure. In one further display the ground was covered with disks that expanded during the trial, and lastly a textured ground display was created with the same spatial frequency power spectrum as the disk ground. At the end of each 1-s trial, observers indicated their perceived heading along a line at the display's center. Mean heading biases were smaller for the textured than for the disk ground, for the displays with more foreground motion, and for the displays with surface structure defined by dot motion than without. We conclude that while spatial frequency content is not a crucial factor, dense motion parallax and surface structure in optic flow are important for accurate heading perception during rotation.
Bastien, Olivier; Ortet, Philippe; Roy, Sylvaine; Maréchal, Eric
Popular methods to reconstruct molecular phylogenies are based on multiple sequence alignments, in which addition or removal of data may change the resulting tree topology. We have sought a representation of homologous proteins that would conserve the information of pair-wise sequence alignments, respect probabilistic properties of Z-scores (Monte Carlo methods applied to pair-wise comparisons) and be the basis for a novel method of consistent and stable phylogenetic reconstruction. We have built up a spatial representation of protein sequences using concepts from particle physics (configuration space) and respecting a frame of constraints deduced from pair-wise alignment score properties in information theory. The obtained configuration space of homologous proteins (CSHP) allows the representation of real and shuffled sequences, and thereupon an expression of the TULIP theorem for Z-score probabilities. Based on the CSHP, we propose a phylogeny reconstruction using Z-scores. Deduced trees, called TULIP trees, are consistent with multiple-alignment-based trees. Furthermore, the TULIP tree reconstruction method provides a solution for some previously reported incongruent results, such as the apicomplexan enolase phylogeny. The CSHP is a unified model that conserves mutual information between proteins in the way physical models conserve energy. Applications include the reconstruction of evolutionarily consistent and robust trees, the topology of which is based on a spatial representation that is not reordered after addition or removal of sequences. The CSHP and its assigned phylogenetic topology provide a powerful and easily updated representation for massive pair-wise genome comparisons based on Z-score computations.
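The Monte Carlo Z-score at the heart of the CSHP construction (score the real pair, rescore against shuffled sequences, then standardize) can be sketched as follows. This is a hedged toy: the identity score stands in for a real pair-wise alignment scorer, and all names are illustrative.

```python
import random

def identity_score(a, b):
    """Toy pair score: count matching positions (stand-in for an alignment score)."""
    return sum(x == y for x, y in zip(a, b))

def z_score(seq_a, seq_b, n_shuffles=500, seed=0):
    """Monte Carlo Z-score: Z = (s_real - mean_shuffled) / std_shuffled,
    where the null distribution comes from shuffling seq_b."""
    rng = random.Random(seed)
    s_real = identity_score(seq_a, seq_b)
    b = list(seq_b)
    scores = []
    for _ in range(n_shuffles):
        rng.shuffle(b)
        scores.append(identity_score(seq_a, b))
    mean = sum(scores) / n_shuffles
    var = sum((s - mean) ** 2 for s in scores) / n_shuffles
    sd = var ** 0.5
    return (s_real - mean) / (sd if sd > 0 else 1.0)

seq = "ACDEFGHIKLMNPQRSTVWY" * 3
z_related = z_score(seq, seq)   # identical sequences: strongly significant
```

A matrix of such pair-wise Z-scores is the kind of input from which a TULIP-style distance tree could then be built.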
49 CFR 831.13, Flow and dissemination of accident or incident information (Title 49, Transportation, revised as of 2010-10-01): (a) Release of information during the field investigation...
Shaffer, Alan B
Within a multilevel secure (MLS) system, flaws in design and implementation can result in overt and covert channels, both of which may be exploited by malicious software to cause unauthorized information flows...
Brito, António Carvalho; Cruz-Correia, Ricardo João
Summary Objectives To understand and build a collective vision of all existing institutions in the Portuguese National Health Service, as well as to perceive how, and how far, the interaction between those multiple institutions is supported by Information Systems (IS). Methods Upon identification of the institutions involved in the healthcare process, a set of interviews with experienced people from those institutions was conducted, which produced about five hours of tape. The research was focused exclusively on processes involving two different organizations; any internal processes were altogether excluded from it. Results The study allowed the identification of about 50 recurrent interaction processes, which were classified into four varieties according to the nature of the information flow: administrative, clinical, identificational and statistical. In addition, these processes were divided according to how that integration is achieved, from completely automated to email- or telephone-based. Conclusions Funds/money-related processes are technologically more rigid and standardized, whereas auditing and inspection ones are less supported by automatic systems. An interesting level of sharing and integration emerged in clinical processes, although the integration is mostly made at the interface level. The authors identified 5 particularly relevant and dominant actors (2 classes of individuals and 3 institutions) with which there is a need for coordination and cooperation. The authors consider that, in future work, an effort should be made to provide the various institutions with guidelines/interfaces and prompt such institutions to elaborate upon these. PMID:27999840
Grigoryuk, E. N.; Bulkin, V. V.
Automated control systems for technological processes are complex systems characterized by a common overall purpose, by the systemic nature of the implemented algorithms for exchanging and processing information, and by a large number of functional subsystems. The article gives examples of automatic control systems and automated control systems for technological processes, drawing a parallel between them by identifying their strengths and weaknesses, and proposes a non-standard control system for a technological process.
Chandani, Y; Breton, G
Many developing countries increasingly recognize and acknowledge family planning as a critical part of socio-economic development. However, with few health dollars to go around, countries tend to provide essential drugs for curative care, rather than for family planning products. Donors have historically provided free contraceptives for family planning services. Whether products are donated or purchased by the country, a successful family planning program depends on an uninterrupted supply of products, beginning with the manufacturer and ending with the customer. Any break in the supply chain may cause a family planning program to fail. A well-functioning logistics system can manage the supply chain and ensure that customers have the products they need, when they need them. Morocco was selected for the case study. The researchers had ready access to key informants and information about the Logistics Management Information System (LMIS). Because the study had time and resource constraints, research included desktop reviews and interviews, rather than data collection in the field. The case study showed that even in a challenging environment an LMIS can be successfully deployed and fully supported by the users. It is critical to customize the system to the country-specific situation to ensure buy-in for the implementation. Significant external support, in funding and technical expertise, is a critical component of the initial success of the system. Nonetheless, evidence from the case study shows that, after a system has been implemented, these benefits may not ensure its institutionalization. Other support, including local funding and technical expertise, is required.
ABSTRACT Juvonen, Piia Suvi Päivikki 2012. Effective information flow through efficient supply chain management - Value stream mapping approach - Case Outokumpu Tornio Works. Master's Thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 63. Appendices 2. The general aim of this thesis is to explore effective information flow through efficient supply chain management by following one of the lean management principles, value stream mapping. The specific research...
In this paper, stochastic thermodynamics of delayed bistable Langevin systems near coherence resonance is discussed. We calculate the heat dissipation rate and the information flow of a delayed bistable Langevin system under various noise intensities. Both the heat dissipation rate and the information flow are found to be bell-shaped functions of the noise intensity, which implies that coherence resonance manifests itself in the thermodynamic properties.
Given a quantum (or statistical) system with a very large number of degrees of freedom and a preferred tensor product factorization of the Hilbert space (or of a space of distributions) we describe how it can be approximated with a very low-dimensional field theory with geometric degrees of freedom. The geometric approximation procedure consists of three steps. The first step is to construct weighted graphs (which we call information graphs) with vertices representing subsystems (e.g., qubits or random variables) and edges representing mutual information (or the flow of information) between subsystems. The second step is to deform the adjacency matrices of the information graphs to that of a (locally) low-dimensional lattice using the graph flow equations introduced in the paper. (Note that the graph flow produces very sparse adjacency matrices and thus might also be used, for example, in machine learning or network science, where the task of graph sparsification is of central importance.) The third step is to define an emergent metric and to derive an effective description of the metric and possibly other degrees of freedom. To illustrate the procedure we analyze (numerically and analytically) two information graph flows with geometric attractors (towards locally one- and two-dimensional lattices) and metric perturbations obeying a geometric flow equation. Our analysis also suggests a possible approach to (a non-perturbative) quantum gravity in which the geometry (a secondary object) emerges directly from a quantum state (a primary object) due to the flow of the information graphs.
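The first two steps above (building an information graph, then driving its adjacency matrix toward a sparse, lattice-like structure) can be caricatured in Python. The per-vertex top-k "sparsifier" below is a crude stand-in for the paper's graph flow equations, and the binary-variable mutual information estimator is our own simplification.

```python
import numpy as np

def pairwise_mi(samples):
    """Information graph: adjacency whose (i, j) weight is the mutual
    information (bits) between binary variables i and j, estimated from samples."""
    n_vars = samples.shape[1]
    A = np.zeros((n_vars, n_vars))
    for i in range(n_vars):
        for j in range(i + 1, n_vars):
            joint = np.zeros((2, 2))
            for a in (0, 1):
                for b in (0, 1):
                    joint[a, b] = np.mean((samples[:, i] == a) & (samples[:, j] == b))
            pi, pj = joint.sum(axis=1), joint.sum(axis=0)
            mi = sum(joint[a, b] * np.log2(joint[a, b] / (pi[a] * pj[b]))
                     for a in (0, 1) for b in (0, 1) if joint[a, b] > 0)
            A[i, j] = A[j, i] = max(mi, 0.0)
    return A

def sparsify(A, keep=2):
    """Crude stand-in for the graph flow: keep only the `keep` strongest edges
    per vertex, pushing the adjacency matrix toward a sparse lattice."""
    B = np.zeros_like(A)
    for i in range(A.shape[0]):
        for j in np.argsort(A[i])[-keep:]:
            B[i, j] = B[j, i] = A[i, j]
    return B

# Four binary variables forming a Markov chain (each a 10% bit-flip of the last).
rng = np.random.default_rng(1)
n = 4000
x0 = rng.integers(0, 2, n)
def flip(x):
    return x ^ (rng.random(n) < 0.1).astype(int)
x1, x2 = flip(x0), flip(flip(x0))
x3 = flip(x2)
samples = np.column_stack([x0, x1, x2, x3])
A = pairwise_mi(samples)   # dense information graph
B = sparsify(A, keep=2)    # pruned toward a chain-like lattice
```

On such chain-structured data, nearby variables carry more mutual information than distant ones, so the pruned graph recovers the one-dimensional "lattice" geometry of the chain.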
Željana Aljinović Barać
This paper focuses on the voluntary disclosure of cash flow information by large Croatian companies whose shares are listed on the Zagreb Stock Exchange, with the aim of identifying characteristics of companies that provide extensive disclosures. In order to conduct the research and test the likelihood that a company publicly announces a wealth of information about cash flows, three groups of company features are defined as variables: accounting data, capital market information and the company's qualitative characteristics. Verification of empirical evidence was provided through a sample of Croatian listed companies using logistic regression analysis. The obtained results indicate that, despite the desire of the regulatory authorities that capital market investors receive all relevant information, companies voluntarily disclose information about cash flows very rarely. Those companies are young (i.e. their shares have been listed on an organized securities market for a short time) and profitable, with growing net income and growing cash flow from operating activities, and usually use the indirect method for the operating cash flow report. The identification of features of Croatian companies that voluntarily disclose cash flow information can be regarded as a contribution of our research, because this topic in the case of macro-oriented accounting-system economies, i.e. bank-oriented economies with emerging capital markets, is still unexplored.
Saef, Steven H; Melvin, Cathy L; Carr, Christine M
Use clinician perceptions to estimate the impact of a health information exchange (HIE) on emergency department (ED) care at four major hospital systems (HS) within a region, and use survey data provided by ED clinicians to estimate the reduction in Medicare-allowable reimbursements (MARs) resulting from use of an HIE. We conducted the study during a one-year period beginning in February 2012. Study sites included eleven EDs operated by four major HS in the region of a mid-sized Southeastern city: one academic ED, five community hospital EDs, four free-standing EDs and one ED/Chest Pain Center (CPC), all of which participated in an HIE. The study design was observational and prospective, using a voluntary, anonymous, online survey. Eligible participants included attending emergency physicians, residents, and mid-level providers (PA & NP). Survey items asked clinicians whether information obtained from the HIE changed resource use while caring for patients at the study sites, and used branching logic to ascertain specific types of services avoided, including laboratory/microbiology, radiology, consultations, and hospital admissions. Additional items asked how use of the HIE affected quality of care and length of stay. The survey was automated using a survey construction tool (REDCap Survey Software © 2010 Vanderbilt University). We calculated avoided MARs by multiplying the numbers and types of services reported to have been avoided. The average cost of an admission from the ED was based on direct cost trends for ED admissions within the region. During the 12-month study period there were 325,740 patient encounters and 7,525 logons to the HIE (a utilization rate of 2.3%) by 231 ED clinicians practicing at the study sites. We collected 621 surveys, representing 8.25% of logons, of which 532 (85.7% of surveys) reported on patients who had information available in the HIE. Within this group the following services and MARs were reported to have been avoided [type of service: number of
The general objective of the present study is to investigate and assess the incremental information content of cash flow disclosures as required by AASB 1026 “Statement of Cash Flows”. This test addresses the issue of whether a change in cash flow components has the same relationship with security prices as a change in earnings. Several previous studies indicate both income and cash flow statements may be mutually exclusive or mutually inclusive statements. The data to test three hypotheses...
The role and working methods of specialized data compilation centres are considered in relation to the CODATA/UNISIST study on data dissemination and some technicalities of the use of network-linked computers for scientific information retrieval. This general approach suggests that INIS should be used in the "data referral" function for nuclear science, and that data requests should, in interactive retrieval, be routed in cascade from INIS to the appropriate specialized data file. An outline sketch of a cheap and highly decentralized network shows that only a simple data file pointer need be added to INIS in order to make it suitable for use as an aid to retrieving numerical data. (author)
The flight deck of a modern commercial airliner is a complex system consisting of two or more crew and a suite of technological devices. When everything goes right, all modern flight decks are easy to use. When things go sour, however, automated flight decks provide opportunities for new kinds of problems. A recent article in Aviation Week cited industry concern over the problem of verifying the safety of complex systems on automated, digital aircraft, stating that the industry must "guard against the kind of incident in which people and the automation seem to mismanage a minor occurrence or non-routine situation into larger trouble." The design of automated flight deck systems that flight crews find easy to use safely is a challenge, in part because this design activity requires a theoretical perspective that can simultaneously cover the interactions of people with each other and with technology. In this paper, I will introduce some concepts that can be used to understand the flight deck as a system composed of two or more pilots and a complex suite of automated devices. As I will try to show, without a theory we can repeat what seems to work, but we may not know why it worked or how to make it work in novel circumstances. Theory allows us to rise above the particulars of specific situations and makes the roots of success in one setting applicable to other settings.
Louca, S.; Hawley, A. K.; Katsev, S.; Beltran, M. T.; Bhatia, M. P.; Michiels, C.; Capelle, D.; Lavik, G.; Doebeli, M.; Crowe, S.; Hallam, S. J.
Microbial activity drives marine biochemical fluxes and nutrient cycling at global scales. Geochemical measurements as well as molecular techniques such as metagenomics, metatranscriptomics and metaproteomics provide great insight into microbial activity. However, an integration of molecular and geochemical data into mechanistic biogeochemical models is still lacking. Recent work suggests that microbial metabolic pathways are, at the ecosystem level, strongly shaped by stoichiometric and energetic constraints. Hence, models rooted in fluxes of matter and energy may yield a holistic understanding of biogeochemistry. Furthermore, such pathway-centric models would allow a direct consolidation with meta'omic data. Here we present a pathway-centric biogeochemical model for the seasonal oxygen minimum zone in Saanich Inlet, a fjord off the coast of Vancouver Island. The model considers key dissimilatory nitrogen and sulfur fluxes, as well as the population dynamics of the genes that mediate them. By assuming a direct translation of biocatalyzed energy fluxes to biosynthesis rates, we make predictions about the distribution and activity of the corresponding genes. A comparison of the model to molecular measurements indicates that the model explains observed DNA, RNA, protein and cell depth profiles. This suggests that microbial activity in marine ecosystems such as oxygen minimum zones is well described by DNA abundance, which, in conjunction with geochemical constraints, determines pathway expression and process rates. Our work further demonstrates how meta'omic data can be mechanistically linked to environmental redox conditions and biogeochemical processes.
Vallverdú, M; Tibaduisa, O; Clariá, F; Hoyer, D; Giraldo, B; Benito, S; Caminal, P
Nonlinear processes of the autonomic nervous system (ANS) can produce breath-to-breath variability in the pattern of breathing. In order to assess these nonlinear processes, nonlinear statistical dependencies between heart rate variability and respiratory pattern variability are analyzed. In this way, auto-mutual information and cross-mutual information concepts are applied. This information flow analysis is presented as a short-term nonlinear analysis method to investigate the information flow interactions in patients on weaning trials. 78 patients weaning from mechanical ventilation were studied: Group A, 28 patients who failed to maintain spontaneous breathing and were reconnected; Group B, 50 patients with successful trials. The results show lower complexity with an increase of information flow in Group A than in Group B. Furthermore, a more (weakly) coupled nonlinear oscillator behavior is observed in the series of Group A than in Group B.
Padikkal, S.; Rema, K. P.
Numerous examples exist worldwide of partial or complete alteration to the natural flow regime of river systems as a consequence of large scale water abstraction from upstream reaches. The effects may not be conspicuous in the case of very large rivers, but the ecosystems of smaller rivers or streams may be completely destroyed over a period of time. While restoration of the natural flow regime may not be possible, there is at present increased effort to implement restoration by regulating environmental flow. This study investigates the development of an environmental flow management model at an icon site in the small river basin of Bharathapuzha, west India. To determine optimal environmental flow regimes, a historic flow model based on data assimilated since 1978 indicated a satisfactory minimum flow depth for river ecosystem sustenance of 0.907 m (28.8 m3/s), a value also obtained from the hydraulic model; however, as three of the reservoirs were already operational at this time, a flow depth of 0.922 m is considered a more viable estimate. Analysis of daily stream flow in 1997-2006 indicated adequate flow regimes during the monsoons in June-November, but sections of the river dried out in December-May with alarming water quality conditions near the river mouth. Furthermore, the preferred minimum 'dream' flow regime expressed by stakeholders of the region is a water depth of 1.548 m, which exceeds 50% of the flood discharge in July. Water could potentially be conserved for environmental flow purposes by (1) the de-siltation of existing reservoirs or (2) reducing water spillage in the transfer between river basins. Ultimately, environmental flow management of the region requires the establishment of a co-ordinated management body and the regular assimilation of water flow information from which science-based decisions are made, to ensure both economic and environmental concerns are adequately addressed.
Information search is an essential part of the consumer's decision-making process. The online medium offers new opportunities and challenges for information search activities (in and outside the marketing context). We are interested in the way human information experiences and behaviors are affected by this. Very often online games and social web activities are perceived as challenging, engaging and enjoyable, while online information search is rated far below this. Our research proposal implies that using the online medium for information search may provoke enjoyable experiences through the flow state, which may in turn positively influence an individual's exploratory information behavior and encourage his/her pro-active market behavior. The present study sets out to improve the understanding of the online medium's impact on human exploratory behavior. We hypothesize that the inclusion of the online flow experience in our research model will better explain exploratory information search behaviors. An 11-component conceptual framework is proposed to explain the manifestations of flow, its personal and technological determinants and its behavioral consequence in the context of online information search. Our research has the primary purpose of presenting an integrated online flow model. Its secondary objective is to stimulate extended research in the area of informational behaviors in the digital age. The paper is organized in three sections. In the first section we briefly report the analysis results of the most relevant online flow theory literature and, drawing on it, try to identify variables and relationships among them. In the second part we propose a research model and use prior flow models to specify a range of testable hypotheses. Drawing on the conceptual model developed, the last section of our study presents the final conclusions and proposes further steps in evaluating the model's validity. Future research directions
Lehmann, A.; Fahland, D.; Lohmann, N.; Moser, S.
When outsourcing tasks of a business process to a third party, information flow security becomes a critical issue. In particular, implicit information leaks are an intriguing problem. Given a business process, one could ask whether the execution of a confidential task is kept secret to a third party
Alexopoulou, Peggy; Hepworth, Mark; Morris, Anne
This study explored the multitasking information behaviour of Web users and how it is influenced by working memory, flow and Personal, Artefact and Task characteristics, as described in the PAT model. The research was exploratory, using a pragmatic, mixed-method approach. Thirty university students participated: 10 psychologists, 10 accountants and 10 mechanical engineers. The data collection tools used were: pre- and post-questionnaires, a working memory test, a flow state scale test, audio-visual data, web search logs, think-aloud data, observation, and the critical decision method. All participants searched for information on the Web on four topics: two for which they had prior knowledge and two for which they did not. Perception of task complexity was found to be related to working memory. People with low working memory reported a significant increase in task complexity after they had completed information searching tasks for which they had no prior knowledge; this was not the case for tasks with prior knowledge. Regarding flow and task complexity, the results confirmed the suggestion of the PAT model (Finneran and Zhang, 2003) that a complex task can lead to anxiety and low flow levels as well as to perceived challenge and high flow levels. However, the results did not confirm the suggestion of the PAT model regarding the characteristics of web search systems, especially perceived vividness. All participants experienced high vividness, whereas according to the PAT model only people with high flow should experience high levels of vividness. Flow affected the degree of change in the participants' knowledge: people with high flow gained more knowledge on tasks without prior knowledge than did people with low flow. Furthermore, accountants felt that tasks without prior knowledge were less complex at the end of the web seeking procedure than psychologists and mechanical engineers did. Finally, the three disciplines appeared to differ
Koehn, John D.; Todd, Charles R.; Zampatti, Brenton P.; Stuart, Ivor G.; Conallin, Anthony; Thwaites, Leigh; Ye, Qifeng
Carp are a highly successful invasive fish species, now widespread, abundant and considered a pest in south-eastern Australia. To date, most management effort has been directed at reducing abundances of adult fish, with little consideration of population growth through reproduction. Environmental water allocations are now an important option for the rehabilitation of aquatic ecosystems, particularly in the Murray-Darling Basin. As carp respond to flows, there is concern that environmental watering may cause floodplain inundation and provide access to spawning habitats, subsequently causing unwanted population increase. This is a management conundrum that needs to be carefully considered within the context of contemporary river flow management (natural, environmental, irrigation). This paper uses a population model to investigate flow-related carp population dynamics for three case studies in the Murray-Darling Basin: (1) a river and terminal lakes; (2) wetlands and floodplain lakes; and (3) a complex river channel and floodplain system. Results highlight distinctive outcomes depending on site characteristics. In particular, the terminal lakes maintain a significant source carp population regardless of river flow; hence any additional within-channel environmental flows are likely to have little impact on carp populations. In contrast, large-scale removal of carp from the lakes may be beneficial, especially in times of extended low river flows. Case studies 2 and 3 show how wetlands, floodplain lakes and the floodplain itself can now often be inundated for several months over the carp spawning season by high-volume flows provided for irrigation or water transfers. Such inundations can be a major driver of carp populations, compared to within-channel flows that have relatively little effect on recruitment. The use of a population model that incorporates river flows and different habitats for this flow-responsive species allows for the comparison of likely population
Samohovych Oleksandr S.
The article is aimed at identifying the impact of information incompleteness and asymmetry, and of irrational behavior of actors, on the processes of municipal waste management. It has been found that, at present in Ukraine, the quality of the transfer of information flows on municipal waste management between the State authority, local government bodies, enterprises, and the public remains low. Urban sanitation schemes are being adopted and waste management technologies are being introduced at the local level, but the local government bodies have not been provided with sufficient information to make optimal decisions. Acting on its own, the market mechanism would not be able to overcome the asymmetry of information in the short term, and State intervention would be needed to correct the information inadequacy of the municipal waste market. A prospect for future research is determining the conditions for an effective distribution of information flows in the process of municipal waste management.
Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias
Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This renders difficult the identification of sub-networks that are involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways. Doing so will help us to identify key participants and paths under specific environmental and cellular contexts. This paper introduces the theory of network flooding, which aims to address the problem of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, our task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable, network traversal algorithm and we assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network, and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead to a unifying framework for the simultaneous network minimization and information flow analysis across various "omics" levels.
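A minimal reading of the flooding idea (retain only nodes that lie on some path from a source to a sink) can be implemented with two breadth-first passes. This sketch is a simplified stand-in for the paper's traversal algorithm, not its actual implementation; the node names are hypothetical.

```python
from collections import deque

def reachable(adj, starts):
    """Breadth-first set of nodes reachable from `starts` in a directed graph."""
    seen, queue = set(starts), deque(starts)
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):          # successors of u
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def minimal_subnetwork(adj, sources, sinks):
    """Toy network minimization: keep only nodes that are forward-reachable
    from the sources AND backward-reachable from the sinks."""
    rev = {}
    for u, vs in adj.items():
        for v in vs:
            rev.setdefault(v, []).append(u)
    keep = reachable(adj, sources) & reachable(rev, sinks)
    return {u: [v for v in adj.get(u, []) if v in keep] for u in keep}

# Hypothetical regulatory network: "side" branches off but reaches no sink.
adj = {"src": ["tf1", "side"], "tf1": ["gene"], "side": ["dead_end"]}
sub = minimal_subnetwork(adj, ["src"], ["gene"])
```

Nodes off every source-to-sink path (here "side" and "dead_end") are pruned, leaving the minimal regulatory program connecting inputs to outputs.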
Wang, Cuicui; Vieito, João Paulo; Ma, Qingguo
This investigation is among the first to analyze the neural basis of an investment process with money flow information from the financial market, using a simplified task in which volunteers had to choose to buy or not to buy stocks based on the display of positive or negative money flow information. After choosing “to buy” or “not to buy,” participants were presented with feedback. At the same time, event-related potentials (ERPs) were used to record investors' brain activity and capture the event-related negativity (ERN) and feedback-related negativity (FRN) components. The ERN results suggested that there might be higher risk and more conflict when buying stocks with negative net money flow information than with positive net money flow information, and the inverse was also true for the “not to buy” option. The FRN component evoked by the bad outcome of a decision was more negative than that evoked by the good outcome, which reflected the difference between the values of the actual and expected outcomes. From this research, we can further understand how investors perceive money flow information from the financial market and the neural cognitive effects in the investment process. PMID:26557139
Hillebrand, Arjan; Tewarie, Prejaas; van Dellen, Edwin; Yu, Meichen; Carbo, Ellen W S; Douw, Linda; Gouw, Alida A; van Straaten, Elisabeth C W; Stam, Cornelis J
Normal brain function requires interactions between spatially separated, and functionally specialized, macroscopic regions, yet the directionality of these interactions in large-scale functional networks is unknown. Magnetoencephalography was used to determine the directionality of these interactions, where directionality was inferred from time series of beamformer-reconstructed estimates of neuronal activation, using a recently proposed measure of phase transfer entropy. We observed well-organized posterior-to-anterior patterns of information flow in the higher-frequency bands (alpha1, alpha2, and beta band), dominated by regions in the visual cortex and posterior default mode network. Opposite patterns of anterior-to-posterior flow were found in the theta band, involving mainly regions in the frontal lobe that were sending information to a more distributed network. Many strong information senders in the theta band were also frequent receivers in the alpha2 band, and vice versa. Our results provide evidence that large-scale resting-state patterns of information flow in the human brain form frequency-dependent reentry loops that are dominated by flow from parieto-occipital cortex to integrative frontal areas in the higher-frequency bands, which is mirrored by a theta band anterior-to-posterior flow.
Pan, Jing Samantha; Bingham, Ned; Chen, Chang; Bingham, Geoffrey P
Use of motion to break camouflage extends back to the Cambrian [In the Blink of an Eye: How Vision Sparked the Big Bang of Evolution (New York: Basic Books, 2003)]. We investigated the ability to break camouflage and continue to see camouflaged targets after motion stops. This is crucial for the survival of hunting predators. With camouflage, visual targets and distracters cannot be distinguished using only static image structure (i.e., appearance). Motion generates another source of optical information, optic flow, which breaks camouflage and specifies target locations. Optic flow calibrates image structure with respect to spatial relations among targets and distracters, and calibrated image structure makes previously camouflaged targets perceptible in a temporally stable fashion after motion stops. We investigated this proposal using laboratory experiments and compared how many camouflaged targets were identified either with optic flow information alone or with combined optic flow and image structure information. Our results show that the combination of motion-generated optic flow and target-projected image structure information yielded efficient and stable perception of camouflaged targets.
Sustainable materials management focuses on the dynamics of materials in economic and environmental activities to optimize material use efficiency and reduce environmental impact. A preliminary web-based information system is thus developed to analyze the issues of resource consumption and waste generation, enabling countries to manage resources and wastes from a life cycle perspective. This pioneering system features a four-layer framework that integrates information on physical flows and economic activities with material flow accounting and waste input–output table analysis. Within this framework, several applications were developed for different waste and resource management stakeholders. The hierarchical and interactive dashboards allow a convenient overview of economy-wide material accounts, waste streams, and secondary resource circulation. Furthermore, the system can trace material flows through the associated production supply chain and consumption activities. Integrated with economic models, this system can predict possible overloading of current waste management facility capacities and provide decision support for designing strategies to approach resource sustainability. The limitations of the current system are specified to direct further enhancement of its functionalities.
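A toy version of the input–output accounting the abstract mentions can be illustrated with a two-sector Leontief model plus a waste extension. All coefficients below are hypothetical and only show the shape of the computation, not the system's actual data:

```python
def leontief_output(a, demand):
    """Solve x = A x + d for a 2-sector economy by inverting (I - A).
    a[i][j] is the input from sector i needed per unit output of j."""
    (a11, a12), (a21, a22) = a
    m11, m12, m21, m22 = 1 - a11, -a12, -a21, 1 - a22
    det = m11 * m22 - m12 * m21
    d1, d2 = demand
    # 2x2 matrix inverse applied to the final-demand vector.
    return ((m22 * d1 - m12 * d2) / det, (m11 * d2 - m21 * d1) / det)

A = [[0.2, 0.3],          # hypothetical technical coefficients
     [0.1, 0.4]]
x1, x2 = leontief_output(A, (100.0, 50.0))

# Hypothetical waste generated per unit of each sector's output.
total_waste = 0.05 * x1 + 0.12 * x2
print(round(x1, 3), round(x2, 3), round(total_waste, 3))
```

The same pattern, with a waste input–output table in place of the scalar waste coefficients, underlies tracing waste streams through supply chains as the abstract describes.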
Sun, Qing-Chao; Huang, Wei-Qiang; Jiang, Ying-Jie; Sun, Wei
Multi-product collaborative development is widely adopted in manufacturing enterprises, but present multi-project planning models do not take the technical/data interactions of multiple products into account. To decrease the influence of technical/data interactions on project progress, information flow scheduling models based on an extended DSM are presented. Firstly, information dependencies are divided into four types: series, parallel, coupling and similar. Secondly, the different types of dependencies are expressed as DSM units, and the extended DSM model, described as a block matrix, is brought forward. Furthermore, an information flow scheduling method is proposed that involves four types of operations: partitioning and clustering algorithms are modified from the DSM to ensure the progress of high-priority projects, while merging and converting are computations specific to the extended DSM. Finally, the information flow scheduling of the development of two machine tools is analyzed as an example; different project priorities correspond to different task sequences and total coordination costs. The proposed methodology provides detailed instruction for information flow scheduling in multi-product development, with particular attention to technical/data interactions.
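For series dependencies, DSM-based sequencing reduces to a topological ordering of tasks. A minimal sketch follows; it is not the paper's full partitioning/clustering/merging/converting method, and the four-task DSM is hypothetical:

```python
def dsm_sequence(dsm, names):
    """Topologically order tasks from a binary DSM where
    dsm[i][j] == 1 means task i depends on task j.
    Assumes coupled (cyclic) blocks were already torn or clustered."""
    n = len(names)
    remaining = set(range(n))
    order = []
    while remaining:
        # A task is ready when all of its dependencies are scheduled.
        ready = [i for i in remaining
                 if all(dsm[i][j] == 0 or j not in remaining
                        for j in range(n))]
        if not ready:
            raise ValueError("coupled block detected; cluster it first")
        for i in sorted(ready):
            order.append(names[i])
            remaining.discard(i)
    return order

# Hypothetical 4-task DSM: B depends on A, C on A, D on B and C.
names = ["A", "B", "C", "D"]
dsm = [[0, 0, 0, 0],
       [1, 0, 0, 0],
       [1, 0, 0, 0],
       [0, 1, 1, 0]]
print(dsm_sequence(dsm, names))
```

Coupled blocks, which this sketch refuses to order, are exactly what the paper's clustering operation groups before sequencing.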
The information flow among a group of researchers at the Instituto de Pesquisas Energeticas e Nucleares of the Comissao Nacional de Energia Nuclear/Sao Paulo (IPEN-CNEN/SP) was analysed by means of a study of the use and non-use of formal and informal information channels. The study proposed suggesting ideas for the improvement of the information network, as a means of contributing to the future planning of the information transfer structure among the IPEN technical-scientific community. A structured interview was used to collect the data. The researchers were characterized under functional, academic and professional aspects. Their information needs were identified, as well as the factors which affect such needs. The researchers' behaviour while searching for information was analysed by means of the critical incident technique. The informal communication networks were also identified according to a sociometric technique. The results show that in the Department included in the study, information flows equally through formal and informal channels. It is evident that there is a small correlation between degree of use and degree of importance of information sources. There is no evidence that those who make little use of formal channels supply their information needs by use of informal channels. It was patent that non-accessibility is the key factor which influences the non-use of information. The motivation to use formal sources is significantly inhibited by the fact that the library collection is not kept up to date. Relatively intense informal communication was verified both inter- and intra-divisions. It is also evident that researchers with higher academic degrees make more frequent use of formal channels, and stand a greater possibility of being identified as gatekeepers. However, those researchers who are considered more productive at their present function are not always those who make more frequent use of formal channels. The conclusions show that in order
Thomaz, Andrea L.; Chao, Crystal
Turn-taking is a fundamental part of human communication. Our goal is to devise a turn-taking framework for human-robot interaction that, like the human skill, represents something fundamental about interaction, generic to context or domain. We propose a model of turn-taking, and conduct an experiment with human subjects to inform this model. Our findings from this study suggest that information flow is an integral part of human floor-passing behavior. Following this, we implement autonomous ...
Olden, Julian D.; Konrad, Christopher P.; Melis, Theodore S.; Kennard, Mark J.; Freeman, Mary C.; Mims, Meryl C.; Bray, Erin N.; Gido, Keith B.; Hemphill, Nina P.; Lytle, David A.; McMullen, Laura E.; Pyron, Mark; Robinson, Christopher T.; Schmidt, John C.; Williams, John G.
Greater scientific knowledge, changing societal values, and legislative mandates have emphasized the importance of implementing large-scale flow experiments (FEs) downstream of dams. We provide the first global assessment of FEs to evaluate their success in advancing science and informing management decisions. Systematic review of 113 FEs across 20 countries revealed that clear articulation of experimental objectives, while not universally practiced, was crucial for achieving management outcomes and changing dam-operating policies. Furthermore, changes to dam operations were three times less likely when FEs were conducted primarily for scientific purposes. Despite the recognized importance of riverine flow regimes, four-fifths of FEs involved only discrete flow events. Over three-quarters of FEs documented both abiotic and biotic outcomes, but only one-third examined multiple taxonomic responses, thus limiting how FE results can inform holistic dam management. Future FEs will present new opportunities to advance scientifically credible water policies.
Ha, Chang Hoon
The objective of this study is to investigate experimentally the relationship between an operator's mental workload and the information flow rate of accident diagnosis tasks, and further to propose the information flow rate as an analytic method for measuring mental workload. There are two types of mental workload in the advanced MCR of NPPs: the information processing workload, which is the processing that the human operator must actually perform in order to complete the diagnosis task, and the emotional stress workload experienced by the operator. In this study, the focus is on the former. Three kinds of methods are used to measure the operator's workload: information flow rate, subjective methods, and physiological measures. Information flows for eight accident diagnosis tasks are modeled qualitatively using a stage model and are quantified using Conant's model. The eight accident cases considered here are: Loss Of Coolant Accident (LOCA), Steam Generator Tube Rupture (SGTR), Steam Line Break (SLB), Feedwater Line Break (FLB), Pressurizer (PZR) spray and heater failure, Reactor Coolant Pump (RCP) trip, Main Steam Isolation Valve (MSIV) failure, and PZR spray failure. The information flow rate is obtained for each diagnosis task by imposing time limit restrictions on the tasks. Subjective methods require the operators to respond to questionnaires to rate their level of mental effort. NASA-TLX and the MCH scale are selected as subjective methods. NASA-TLX is a subjective method used in various fields including the aviation, automobile, and nuclear industries. It is a multi-dimensional rating technique and provides an overall workload score based on a weighted average of six subscales, using pair-wise comparison tests. MCH, on the other hand, is one-dimensional and uses a 10-point rating technique. As with NASA-TLX, the higher the score, the higher the subjective workload. For the physiological measurements, an eye tracking system analyzes eye movements
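Conant's model decomposes information flow more finely than shown here, but the basic idea of an information flow rate (diagnostic information content divided by the time allowed) can be illustrated with a simple Shannon-entropy calculation; the figures below are hypothetical, not from the study:

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: 8 equally likely accident types must be
# distinguished within a 60-second diagnosis window.
info_content = entropy_bits([1 / 8] * 8)   # information to be processed
flow_rate = info_content / 60.0            # bits per second
print(info_content, flow_rate)
```

Tightening the time limit raises the required flow rate, which is the mechanism the study uses to vary task demand.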
Borghuis, V.A.J.; Feijs, L.M.G.
In this paper we introduce a typed λ-calculus in which computer networks can be formalized, directed at situations where the services available on the network are stationary while the information can flow freely. For this calculus, an analogue of the ‘propositions-as-types’ interpretation of
This thesis uses logical tools to investigate a number of basic features of social networks and their evolution over time, including flow of information and spread of opinions. Part I contains the preliminaries, including an introduction to the basic phenomena in social networks that call for a
Analysis modules tend to be set up as a one-way flow of information, i.e. a clear distinction between cause and effect or input and output. However, as the speed of analysis approaches real time (or faster than movie rate), it becomes increasingly difficult for an external user to
Jajcay, Nikola; Hlinka, Jaroslav; Hartman, David; Paluš, Milan
Vol. 16 (2014), EGU2014-12768. ISSN 1607-7962. [EGU General Assembly, 11th, 27.04.2014-02.05.2014, Vienna.] Institutional support: RVO:67985807. Keywords: Granger causality * climate * information flow * surface air temperature * wind. Subject RIV: BB - Applied Statistics, Operational Research
Zubek, Julian; Denkiewicz, Michał; Barański, Juliusz; Wróblewski, Przemysław; Rączaszek-Leonardi, Joanna; Plewczynski, Dariusz
This paper explores how the information flow properties of a network affect the formation of categories shared between individuals who are communicating through that network. Our work is based on an established multi-agent model of the emergence of linguistic categories grounded in an external environment. We study how network information propagation efficiency and the direction of information flow affect categorization by performing simulations with idealized network topologies optimizing certain network centrality measures. We measure dynamic social adaptation when either the network topology or the environment is subject to change during the experiment and the system has to adapt to new conditions. We find that both a decentralized network topology that is efficient in information propagation and the presence of a central authority (information flow from the center to the peripheries) are beneficial for the formation of global agreement between agents. Systems with a central authority cope well with network topology change, but are less robust in the case of environment change. These findings help us understand which network properties affect processes of social adaptation. They are important to inform the debate on the advantages and disadvantages of centralized systems.
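A heavily simplified naming-game-style simulation (not the authors' actual category-formation model; the topologies, agent count and negotiation protocol are illustrative) shows how agents negotiating over a network can converge on a shared label, and how a star topology encodes the "central authority" case:

```python
import random

def simulate(edges, n_agents, steps, seed=0):
    """Minimal naming game: agents linked by `edges` negotiate a
    shared name. Returns True if all agents converged to one name."""
    rng = random.Random(seed)
    vocab = [set() for _ in range(n_agents)]
    for _ in range(steps):
        speaker, hearer = edges[rng.randrange(len(edges))]
        if rng.random() < 0.5:                   # either end may speak
            speaker, hearer = hearer, speaker
        if not vocab[speaker]:
            vocab[speaker].add("w%d" % speaker)  # invent a new name
        word = rng.choice(sorted(vocab[speaker]))
        if word in vocab[hearer]:                # success: both align
            vocab[speaker] = {word}
            vocab[hearer] = {word}
        else:                                    # failure: hearer learns
            vocab[hearer].add(word)
    return all(len(v) == 1 and v == vocab[0] for v in vocab)

n = 8
star = [(0, i) for i in range(1, n)]             # central authority
ring = [(i, (i + 1) % n) for i in range(n)]      # decentralized ring
print("star converged:", simulate(star, n, 2000))
print("ring converged:", simulate(ring, n, 2000))
```

Comparing convergence across topologies, and re-running after swapping the topology mid-experiment, mirrors the adaptation measurements described in the abstract.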
Flow is a construct imported into marketing research from the social sciences in order to examine consumer behavior in the online medium. The construct describes a state of deep involvement in a challenging activity, most frequently characterized by high levels of enjoyment, control and concentration. Researchers found that the degree to which online experience is challenging can be defined, measured, and related well to important marketing variables. As shown by our extensive literature review, flow measurements include antecedents, dimensions and consequences of flow. The present paper represents a detailed description of the construct's operationalization in the context of online information search. In this respect, our main goal is to produce a basic instrument to evaluate the flow experience of online search, in order to capitalize on the premises of an interactive, complex informational medium (the World Wide Web) and on the consequence of an exploratory informational behavior of users. The instrument is conceived to offer an initial possibility to collect data. The composition, source and significance of the 11 scales used to measure the multiple factors of the flow experience during online search are detailed in this study, with the aim of ensuring compliance with scientific rigor and facilitating correct reports of data related to the reliability and validity of measurements. For further research, we propose factor analysis to test the resulting instrument and to ensure that the measures employed are psychometrically sound. Factor analysis refers to a wide range of statistical techniques used to represent a set of variables in terms of a reduced number of hypothetical variables called factors. Factor analysis is used to solve two types of problems: reducing the number of variables to increase data processing speed, and identifying hidden patterns in existing data relations. However, we expect our scales to perform
Information flows on social media platforms are able to show trends and user interests, as well as connections between users. In this paper, we present a method for analyzing city-related networks on the social media platform Twitter based on user content. Forty million tweets were downloaded via Twitter's REST API (application programming interface) and Twitter's Streaming API. The investigation focuses on two aspects: firstly, trend detection is performed to analyze 31 informational world cities according to user activity, popularity of shared websites, and topics defined by hashtags. Secondly, a hint of how connected informational cities are to each other is given by creating a clustered network based on the number of connections between different city pairs. Tokyo, New York City, London and Paris clearly lead the ranking of the most active cities when compared by total number of tweets. The investigation shows that Twitter is very frequently used to share content from other services like Instagram or YouTube. The most popular topics in tweets reveal great differences between the cities. In conclusion, the investigation shows that social media services like Twitter can also be a mirror of the society they are used in and bring to light the information flows of connected cities in a global network. The presented method can be applied in further research to analyze information flows regarding specific topics and/or geographical locations.
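Once tweets are downloaded, the hashtag-based topic detection described here can be approximated by simple frequency counting over tweet texts. A minimal sketch, with invented sample tweets rather than the study's data:

```python
from collections import Counter
import re

def top_hashtags(tweets, k=3):
    """Count hashtags across tweet texts, case-insensitively,
    and return the k most common as (tag, count) pairs."""
    counts = Counter()
    for text in tweets:
        counts.update(tag.lower() for tag in re.findall(r"#(\w+)", text))
    return counts.most_common(k)

tweets = [
    "Sunset over #Tokyo tonight",
    "#Tokyo2020 venues announced #Tokyo",
    "Street food tour #Tokyo #travel",
]
print(top_hashtags(tweets))
```

Grouping the same counts per city, rather than globally, yields the per-city topic rankings the abstract compares.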
Luong, Huynh Van; Rakêt, Lars Lau; Huang, Xin
Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The coding efficiency of DVC critically depends on the quality of side information generation and the accuracy of noise modeling. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes using optical flow to improve side information generation and clustering to improve noise modeling. The optical flow technique is exploited at the decoder side to compensate for weaknesses of block-based methods when using motion compensation to generate side information frames. Clustering is introduced to capture cross-band correlation and increase local adaptivity in the noise modeling. This paper also proposes techniques to learn from previously decoded (WZ) frames. Different techniques are combined by calculating a number of candidate soft side...
Research on Information Systems (IS) acceptance has focused substantially on extrinsic motivation in workplaces; little is known about the underlying intrinsic motivations of Hedonic IS (HIS) acceptance. This paper proposes a hybrid HIS acceptance model that takes the unique characteristics of HIS and the multiple identities of a HIS user into consideration by integrating Hedonic theory and Flow theory with the Technology Acceptance Model (TAM). The model was empirically tested in a field survey. The results indicate that emotional responses, imaginal responses, and flow experience are three main contributors to HIS acceptance.
Maréchal Eric; Ortet Philippe; Roy Sylvaine; Bastien Olivier
Abstract Background Popular methods to reconstruct molecular phylogenies are based on multiple sequence alignments, in which addition or removal of data may change the resulting tree topology. We have sought a representation of homologous proteins that would conserve the information of pair-wise sequence alignments, respect probabilistic properties of Z-scores (Monte Carlo methods applied to pair-wise comparisons) and be the basis for a novel method of consistent and stable phylogenetic recon...
Electroencephalogram (EEG) phase synchronization analyses can reveal large-scale communication between distant brain areas. However, it is not possible to identify the directional information flow between distant areas using conventional phase synchronization analyses. In the present study, we applied transcranial magnetic stimulation (TMS) to the occipital area in subjects who were resting with their eyes closed, and analyzed the spatial propagation of transient TMS-induced phase resetting by using transfer entropy (TE) to quantify the causal and directional flow of information. The time-frequency EEG analysis indicated that the theta (5 Hz) phase locking factor (PLF) reached its highest value at the distant area (the motor area in this study), with a time lag that followed the peak of the transient PLF enhancements of the TMS-targeted area at the TMS onset. PPI (phase-preservation index) analyses demonstrated significant phase resetting at the TMS-targeted area and the distant area. Moreover, the TE from the TMS-targeted area to the distant area increased clearly during the delay that followed TMS onset. Interestingly, the time lags were almost coincident between the PLF and TE results (152 vs. 165 ms), which provides strong evidence that the emergence of the delayed PLF reflects the causal information flow. Such tendencies were observed only in the higher-intensity TMS condition, and not in the lower-intensity or sham TMS conditions. Thus, TMS may manipulate large-scale causal relationships between brain areas in an intensity-dependent manner. We demonstrated that single-pulse TMS modulated global phase dynamics and directional information flow among synchronized brain networks. Therefore, our results suggest that single-pulse TMS can manipulate both incoming and outgoing information in the TMS-targeted area associated with functional changes.
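Transfer entropy itself can be sketched for short symbol sequences with a plug-in estimator of TE(src → dst) = I(dst at t+1; src at t | dst at t), using history length 1. Real EEG analyses use far more careful estimators on continuous data; the binary series below are artificial, with `dst` constructed to lag `src` by one step so that the forward direction dominates:

```python
from collections import Counter
from math import log2

def transfer_entropy(src, dst):
    """Plug-in transfer entropy TE(src -> dst), in bits, with
    history length 1, for aligned symbol sequences."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))  # (x_next, x_now, y_now)
    n = len(triples)
    p_xyz = Counter(triples)
    p_yz = Counter((y, z) for _, y, z in triples)
    p_xy = Counter((x, y) for x, y, _ in triples)
    p_y = Counter(y for _, y, _ in triples)
    te = 0.0
    for (x, y, z), c in p_xyz.items():
        # p(x,y,z) * log2[ p(x|y,z) / p(x|y) ], all from counts.
        te += (c / n) * log2((c * p_y[y]) / (p_xy[(x, y)] * p_yz[(y, z)]))
    return te

src = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
dst = [0] + src[:-1]   # dst copies src with a one-step lag
print(transfer_entropy(src, dst) > transfer_entropy(dst, src))
```

The asymmetry between the two directions is what makes TE suitable for the directional-flow question that phase synchronization alone cannot answer.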
The aim of this article is to investigate the governance models of companies listed on the Italian Stock Exchange by using a network approach that describes the interlinks between boards of directors. Following mainstream literature, I construct a weighted graph representing the listed companies (vertices) and their relationships (weighted edges), the Corporate Board Network; I then apply three different vertex centrality measures: degree, betweenness and flow betweenness. What emerges from the network construction and the application of degree centrality is a structure with a large number of connections but not particularly dense, in which the presence of a small number of highly connected nodes (hubs) is evident. I then focus on betweenness and flow betweenness; I expect that these centrality measures may give a representation of the intensity of the relationships between companies, capturing the volume of information flowing from one vertex to another. Finally, I investigate the possible scale-free structure of the network.
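Shortest-path betweenness, one of the measures applied here, can be computed from first principles on a toy board network. The five-company graph below is hypothetical, the graph is treated as unweighted, and flow betweenness (which requires current-flow computations) is omitted:

```python
from collections import deque
from itertools import combinations

def shortest_paths(adj, s, t):
    """All shortest paths between s and t (BFS distances + backtracking)."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    paths, stack = [], [[t]]
    while stack:
        path = stack.pop()
        head = path[0]
        if head == s:
            paths.append(path)
            continue
        for u in adj[head]:
            if dist.get(u, -1) == dist[head] - 1:
                stack.append([u] + path)
    return paths

def betweenness(adj):
    """For each node, the summed fraction of pairwise shortest paths
    passing through it (unnormalized betweenness centrality)."""
    score = {v: 0.0 for v in adj}
    for s, t in combinations(adj, 2):
        paths = shortest_paths(adj, s, t)
        for v in adj:
            if v not in (s, t):
                score[v] += sum(v in p for p in paths) / len(paths)
    return score

# Hypothetical board network: D sits on every path between E and the rest.
adj = {"A": ["B", "C", "D"], "B": ["A", "C"], "C": ["A", "B"],
       "D": ["A", "E"], "E": ["D"]}
print(betweenness(adj))
```

Here A and D score highest, which is the hub-and-bridge pattern the article looks for when relating board interlocks to information flow.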
Kim, Jong Hyun; Shin, Yeong Cheol
Interaction between automatic control and operators is one of the main issues in the application of automation technology. Inappropriate information from automatic control systems causes unexpected problems in human-automation collaboration. Poor information becomes critical, especially when the operator takes over control from an automation system. If the operator is out of the loop and the automatic control system fails, the operator cannot properly handle the situation transferred from the automatic mode because of inadequate situation awareness. Some cases of unplanned reactor trips during the transition between the manual mode and the automatic mode have been reported in nuclear power plants (NPPs). Among unplanned reactor trips since 2002, two cases were partially caused by automation-related failures of steam generator (SG) level control. This paper conducts an information flow analysis to identify the information and control requirements for human-system interaction in SG level control. First, this paper identifies the level of automation in SG level control systems and the function allocation between system control and human operators. Then, an information flow analysis for the monitoring and transition of automation is performed by adapting a job process chart. The information and control requirements will be useful as an input for the human-system interface (HSI) design of SG level control.
Titles of articles in seven highly ranked multidisciplinary psychology journals for every fifth year between 1966 and 2011 (inclusive) were studied in terms of title length, word length, punctuation density, and word pleasantness, activation, and concreteness (assessed by the Dictionary of Affect in Language). Titles grew longer (by three words) and were more frequently punctuated (by one colon or comma for every other article) between 1966 and 2011. This may reflect the increasing complexity of psychology and satisfy readers' requirements for more specific information. There were significant differences among journals (e.g., titles in the Annual Review of Psychology were scored by the Dictionary of Affect as the most pleasant, and those in Psychological Bulletin as the least pleasant) and among categories of journals (e.g., titles in review journals employed longer words than those in research or association journals). Differences were stable across time and were employed to successfully categorize titles from a validation sample.
Yamakawa, Tadashi; Munakata, Masahiro; Kimura, Hideo; Hyodo, Hiroshi
Radionuclide migration toward the human environment is to be assessed as part of long-term safety assessments of the geologic disposal of radioactive waste. Geologic processes, which include volcanic activity, hydrothermal activity, seismicity and deformation, bring about hydrogeologic changes in the regional groundwater flow system around a repository site. Groundwater flow systems in Japan have been studied at several sites such as the Tono mine, the Kamaishi mine and the Horonobe area, but the methodology of the studies at these sites has not been fully developed. This study was conducted to develop methodologies for the boundary delineation of regional groundwater flow systems. A Geographic Information System (GIS) was applied using available topographic, hydrologic and geologic data for an area of interest. Miyakoji in the Abukuma Mountains was selected as the area because of its simple geologic setting formed by granitic rocks and the topographically gentle hills of its drainage basin. The data used in this study cover topographic sheets, a digital elevation model, satellite imagery, geologic maps, topographic classification maps, soil distribution maps and land use maps. Through GIS techniques using these data, thematic maps on topographic features, surface conditions, land coverage, geology and geologic structure, and weathered crust were developed, and these thematic maps were further applied to extract four factors affecting the regional groundwater flows: topographic condition, precipitation recharge, fracture characteristics and potential flows. The present study revealed that, taking the potential groundwater flows and the characteristics of fractured zones in the area into consideration, the groundwater flow system in the Miyakoji drainage basin should be bounded by the Otakine Mountain and the northern part of the Tokoha Drainage Basin. The delineated area is larger than previously understood. (author)
This paper highlights the first attempt by researchers at Stellenbosch University to model freight flows between and for 17 countries in sub-Saharan Africa (SSA). Given these dimensions, the model will be informed by and linked to the South African surface Freight Demand Model (FDM). By analysing and collating available datasets and developing a freight flow model, a better understanding of freight movements between countries can be obtained and then used for long-term planning efforts. A simple methodology is envisaged that will entail a high-level corridor classification linking a major district in one country with a similar district in another country. Existing trade data will be used to corroborate new base-year economic demand and supply volumetric data that will be generated from social accounting matrices for each country. The trade data will also provide initial flow dynamics between countries, which will be refined according to the new volumes. The model can then generate commodity-level corridor flows between SSA countries, and between SSA countries and the rest of the world, as well as intra-country rural and metropolitan flows, using a gravity-based modelling approach. This article outlines efforts to harmonise trade data between the 17 countries identified, as well as between these countries and the rest of the world, as a first step towards developing a freight demand model for sub-Saharan Africa.
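The gravity-based modelling approach mentioned can be sketched as follows. The functional form T_ij = k * M_i * M_j / d_ij**beta is a common choice for gravity models generally (not necessarily the paper's calibrated form), and all masses and distances below are invented, not actual SSA data:

```python
def gravity_flows(masses, distances, k=1.0, beta=2.0):
    """Estimate pairwise freight flow as T_ij = k * M_i * M_j / d_ij**beta,
    a common gravity-model form; k and beta would be calibrated to data."""
    flows = {}
    for (i, j), d in distances.items():
        flows[(i, j)] = k * masses[i] * masses[j] / d ** beta
    return flows

# Hypothetical economic masses (e.g., GDP index) and corridor distances (km).
masses = {"Gauteng": 400.0, "Nairobi": 120.0, "Lagos": 250.0}
distances = {("Gauteng", "Nairobi"): 2900.0,
             ("Gauteng", "Lagos"): 4500.0,
             ("Nairobi", "Lagos"): 3800.0}
for pair, t in gravity_flows(masses, distances).items():
    print(pair, round(t, 6))
```

In practice k and beta are calibrated against observed trade volumes, which is the role the harmonised trade data plays in the paper's methodology.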
Bilek, Edda; Ruf, Matthias; Schäfer, Axel; Akdeniz, Ceren; Calhoun, Vince D; Schmahl, Christian; Demanuele, Charmaine; Tost, Heike; Kirsch, Peter; Meyer-Lindenberg, Andreas
Social interactions are fundamental for human behavior, but the quantification of their neural underpinnings remains challenging. Here, we used hyperscanning functional MRI (fMRI) to study information flow between brains of human dyads during real-time social interaction in a joint attention paradigm. In a hardware setup enabling immersive audiovisual interaction of subjects in linked fMRI scanners, we characterize cross-brain connectivity components that are unique to interacting individuals, identifying information flow between the sender's and receiver's temporoparietal junction. We replicate these findings in an independent sample and validate our methods by demonstrating that cross-brain connectivity relates to a key real-world measure of social behavior. Together, our findings support a central role of human-specific cortical areas in the brain dynamics of dyadic interactions and provide an approach for the noninvasive examination of the neural basis of healthy and disturbed human social behavior with minimal a priori assumptions.
Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to these tools, without any programming skills and with only basic computer requirements. Conclusion We have created useful tools focused on the management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS.
Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico
PMID:16221298
Zhao, Jian; Cao, Nan; Wen, Zhen; Song, Yale; Lin, Yu-Ru; Collins, Christopher
We present FluxFlow, an interactive visual analysis system for revealing and analyzing anomalous information spreading in social media. Every day, millions of messages are created, commented on, and shared by people on social media websites such as Twitter and Facebook. This provides valuable data for researchers and practitioners in many application domains, such as marketing, to inform decision-making. Distilling valuable social signals from the huge crowd's messages, however, is challenging, due to heterogeneous and dynamic crowd behaviors. The challenge lies in analysts' ability to discern, in a timely fashion, anomalous information behaviors, such as the spreading of rumors or misinformation, from more conventional patterns, such as popular topics and newsworthy events. FluxFlow incorporates advanced machine learning algorithms to detect anomalies, and offers a set of novel visualization designs for presenting the detected threads for deeper analysis. We evaluated FluxFlow with real datasets containing Twitter feeds captured during significant events such as Hurricane Sandy. Through quantitative measurements of the algorithmic performance and qualitative interviews with domain experts, the results show that the back-end anomaly detection model is effective in identifying anomalous retweeting threads, and that the front-end interactive visualizations are intuitive and useful for analysts to discover insights in the data and comprehend the underlying analytical model.
In this paper the architecture of software designed for the management of position and identification data of floating and flying objects in maritime areas controlled by the Polish Border Guard is presented. The software was designed to manage information stored in a distributed system, with two variants: one for a mobile device installed on a vessel, an airplane or a car, and a second for a central server. The details of the implementation of all functionalities of the MapServer in both the mobile and central versions are briefly presented on the basis of information flow diagrams.
This report was prepared in response to a request from NRC Chairman Ahearne that directed the Office of Inspection and Enforcement to resume its investigation of information flow during the accident at Three Mile Island (TMI) that occurred on March 28, 1979. This investigation was resumed on March 21, 1980. The transfer of information among individuals, agencies, and personnel from Metropolitan Edison was analyzed to ascertain what knowledge was held by various individuals of the specific events, parameters, and systems during the accident at TMI. Maximum use was made of existing records, and additional interviews were conducted to clarify areas that had not been pursued during earlier investigations. Although the passage of time between the accident and post-accident interviews hampered precise recollections of events and circumstances, the investigation revealed that information was not intentionally withheld during the accident and that the system for the effective transfer of information during the accident was inadequate.
Grisson, Ricky; Kim, Ji Yeon; Brodsky, Victor; Kamis, Irina K; Singh, Balaji; Belkziz, Sidi M; Batra, Shalini; Myers, Harold J; Demyanov, Alexander; Dighe, Anand S
A central duty of the laboratory is to inform clinicians about the availability and usefulness of laboratory testing. In this report, we describe a new class of laboratory middleware that connects the traditional clinical laboratory information system with the rest of the enterprise, facilitating information flow about testing services. We demonstrate the value of this approach in efficiently supporting an inpatient order entry application. We also show that order entry monitoring and iterative middleware updates can enhance ordering efficiency and promote improved ordering practices. Furthermore, we demonstrate the value of algorithmic approaches to improve the accuracy and completeness of laboratory test searches. We conclude with a discussion of design recommendations for middleware applications and discuss the potential role of middleware as a sharable, centralized repository of laboratory test information.
Renelson Ribeiro Sampaio
To maintain competitive advantage, companies must innovate. High quality, low cost and diversity of products have thus become the starting conditions for competitiveness. As a result, speed and flexibility in the design of new products are crucial, as these factors are related to the ability of companies to respond adequately to pressures from the market. Innovation is usually related to the introduction of something new into a particular work process, which leads to new products or services. The ability of people to share information should then be seen as a factor that strengthens work processes and thus contributes substantially to a company's competitive advantage in this new context. Considering these aspects, this paper seeks to map the product development processes of an automobile company, analysing and mapping the flow of information. The main contribution of this study is the critical analysis of the social network established during the development of a project. This analysis provides an estimate of the level of diffusion of information and knowledge among team members, as well as a comparison between the social network mapped empirically and the formal network defined a priori in the company's written procedures.
Bazilian, Morgan; Nussbaumer, Patrick [United Nations Industrial Development Organization, Vienna (Austria); Gualberti, Giorgio [Technical University of Lisbon, Lisbon (Portugal); Haites, Erik [Margaree Consultants Inc., Toronto (Canada); Levi, Michael [Council on Foreign Relations, New York, NY (United States); Siegel, Judy [Energy and Security Group, Reston, VA (United States); Kammen, Daniel [The World Bank, Washington, DC (United States); Fenhann, Joergen [UNEP Risoe Centre, Technical University of Denmark (Denmark)
Energy poverty is widely recognized as a major obstacle to economic and social development and poverty alleviation. To help inform the design of appropriate and effective policies to reduce energy poverty, we present a brief analysis of the current macro financial flows in the electricity and gas distribution sectors in developing countries. We build on the methodology used to quantify the flows of investment in the climate change area. This methodology relies on national gross fixed capital formation, overseas development assistance, and foreign direct investment. These high-level and aggregated investment figures provide a sense of scale to policy-makers, but are only a small part of the information required to design financial vehicles. In addition, these figures tend to mask numerous variations between sectors and countries, as well as trends and other temporal fluctuations. Nonetheless, for the poorest countries, one can conclude that the current flows fall considerably short (by at least a factor of five) of what will be required to provide a basic level of access to clean, modern energy services to the 'energy poor'.
Reza A. Maleki
This case study paper is the result of a project conducted on behalf of a company, hereafter referred to as Midwest Assembly and Manufacturing, or MAAN. The company's operations include component manufacturing, painting, and assembling products. The company also purchases a relatively large percentage of the components and major assemblies needed to support final assembly operations. MAAN uses its own returnable containers to transport purchased parts from suppliers. Due to poor tracking of the containers, the company has been experiencing lost containers and occasional production disruptions at its facility as well as at supplier sites. The objective of this project was to develop a proposal to enable MAAN to more effectively track and manage its returnable containers. The research activities in support of this project included the analysis and documentation of both the physical flow and the information flow associated with the containers, as well as some of the technologies that can help with automatic identification and tracking of containers. The focal point of this paper is a macro-level approach for the analysis of container and information flow within the logistics chain. A companion paper deals with several of the automatic identification technologies that have the potential to improve the management of MAAN's returnable containers.
Durugbo, Christopher; Tiwari, Ashutosh; Alcock, Jeffery R.
The paper presents the findings of a survey of 40 microsystems companies, carried out to determine the use, and the purpose of use, of media forms and information flow models within these companies. These companies, as 'product-service systems', delivered integrated products and services to realise customer solutions. Data collection was carried out by means of an online survey over 3 months. The survey revealed that 42.5% of respondents made use of data flow diagrams and 10% made use of design structure matrices. The survey also suggests that a majority of companies (75%) made use of textual and diagrammatic media forms for communication, analysis, documentation and representation during design and development processes. The paper also discusses the implications of the survey findings for product-service systems.
Sun, Wen-Yang; Wang, Dong; Fang, Bao-Long; Ye, Liu
In this letter, the dynamical characteristics of quantum entanglement (negativity) and distinguishability (trace distance), and the flow of information for an open quantum system under relativistic motion, are investigated. Explicitly, we propose a scenario in which a particle A held by Alice suffers from amplitude damping (AD) noise in a flat space-time, while another particle B, held by Bob and entangled with A, travels with a fixed acceleration in a non-inertial frame. The results show that quantum distinguishability and entanglement are very vulnerable and fragile under the collective influence of AD noise and the Unruh effect. Both of them decrease with the growing intensity of the Unruh effect and the AD thermal bath. This means that the abilities of quantum distinguishability and entanglement to suppress the collective decoherence (AD noise and Unruh effect) are very weak. Furthermore, it turns out that the reduced quantum distinguishability of Alice's system and Bob in the physically accessible region is distributed to another quantum distinguishability for Alice's environment and Bob in the physically inaccessible region. That is, the lost quantum distinguishability, as a fixed amount of information, flows from the systems to the collective decoherence environment.
It is known that the thalamocortical loop plays a crucial role in the encoding of sensory-discriminative features of painful stimuli. However, only a few studies have addressed the changes in thalamocortical dynamics that may occur after the onset of chronic pain. Our goal was to evaluate how the induction of chronic neuropathic pain affected the flow of information within the thalamocortical loop throughout the brain states of the sleep-wake cycle. To address this issue we recorded local field potentials – LFPs – both before and after the establishment of neuropathic pain in awake freely moving adult rats chronically implanted with arrays of multielectrodes in the lateral thalamus and primary somatosensory cortex. Our results show that the neuropathic injury induced changes in the number of wake and slow-wave-sleep state episodes, and especially in the total number of transitions between brain states. Moreover, partial directed coherence – PDC – analysis revealed that the amount of information flow between cortex and thalamus in neuropathic animals decreased significantly, indicating that the overall thalamic activity had less weight over the cortical activity. However, thalamocortical LFPs displayed higher phase-locking during awake and slow-wave-sleep episodes after the nerve lesion, suggesting faster transmission of relevant information along the thalamocortical loop. The observed changes are in agreement with the hypothesis of thalamic dysfunction after the onset of chronic pain, and may result from diminished inhibitory effect of the primary somatosensory cortex over the lateral thalamus.
Kim, Ho-Yong; Oh, Gabjin
In this paper, we employ the variance decomposition method to measure the strength and the direction of interconnections among companies in the KOSDAQ (Korean Securities Dealers Automated Quotation) stock market. We analyze the 200 companies listed on the KOSDAQ market from January 2001 to December 2015. We find that systemic risk, measured by using the interconnections, increases substantially during periods of financial crisis such as the bankruptcy of Lehman Brothers and the European financial crisis. In particular, we find that increases in the aggregated information flows can be used to predict increases in market volatility, such as may occur during a sub-prime financial crisis period.
The main goal of our research is to analyse and display the causes of bullwhip effect formation within a supply chain, as well as to provide appropriate solutions for limiting the occurrence of the bullwhip effect through proper information flow and partners' cooperation within the supply chain. The bullwhip effect is one of the most important issues in supply chain management and is present in many companies. It has an invisible character, because there are many causes of its formation and they are usually difficult to discern. The bullwhip effect is a phenomenon of increasing order variability within a supply chain: the higher we are within the supply chain, the higher the order variability. A company encountering the bullwhip effect can successfully reduce its impact by improving the information flow, as well as by improving partners' cooperation within the supply chain. In this way the company can limit its negative repercussions and increase profit. The article focuses on an overview of the bullwhip effect within a distribution chain, from its causes to suggestions and measures for easing its negative repercussions on the organisation. Part of the causes can be found in market demand variability and in the lack of communication about the actual market demand within the supply chain. The remaining causes are related to obstacles that emerge among different partners within the supply chain (role of culture). A qualitative analysis is applied on the basis of selected insights from supply chain management. The quantitative analysis is based on theoretical research into the effective flow of information among the participants and its contribution to reducing the bullwhip impact. The article discusses two research questions: (1) correct information flow within the supply chain and improved communication among partners can lead to a reduction of the bullwhip effect
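The order-variability amplification described above can be reproduced with a toy simulation: each tier forecasts demand with a moving average and orders up to a target stock level, so small demand fluctuations grow as they travel upstream. All parameters below are illustrative, not drawn from the article.

```python
import random

# Toy illustration of the bullwhip effect: each supply-chain tier forecasts
# the demand it observes with a moving average and orders enough to restore
# a target stock, which amplifies order variability upstream.

def upstream_orders(demand, window=4, safety=1.5):
    """Orders a tier places upstream, given the demand stream it observes."""
    orders, history, stock = [], [], 0.0
    for d in demand:
        history.append(d)
        forecast = sum(history[-window:]) / min(len(history), window)
        target = safety * window * forecast   # order-up-to stock level
        order = max(0.0, target - stock + d)  # replenish plus cover demand
        stock += order - d
        orders.append(order)
    return orders

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

random.seed(0)
retail_demand = [100 + random.gauss(0, 5) for _ in range(200)]
wholesaler = upstream_orders(retail_demand)   # tier 1 sees consumer demand
manufacturer = upstream_orders(wholesaler)    # tier 2 sees tier-1 orders
```

Comparing the three variances shows the variability increasing tier by tier, which is exactly the "higher in the chain, higher the variability" pattern the abstract describes.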
Simin Banifatemi Kashi
This paper presents an empirical study to determine the effects of different factors, including present profit, depreciation, working capital, operating cash flow and other accruals, on future earnings stability. The study selects the information of 124 firms listed on the Tehran Stock Exchange over the period 2006-2012. Using two regression analyses, the study has determined that as the fluctuation of profit increases, profitability increases too. In addition, the study has concluded that firms with minimal fluctuations preserve more stable profitability. Moreover, firms with higher fluctuation in profitability maintain more volatile profitability in the next consecutive period.
Avci, H.I.; Cunnane, J.C.; Brandstetter, A.
A management tool consisting of calculation hierarchy and information flow diagrams is being prepared to address the resolution of major postclosure performance issues for a geologic high-level radioactive waste repository in the U.S.A. The diagrams will indicate the types of calculations and data needed to assess the postclosure performance of the repository. Separate diagrams will be generated for different scenario classes and conceptual models. The methodology used in developing these diagrams and their contents are illustrated for a single scenario and conceptual model. 5 refs., 5 figs
Sensoy, Ahmet; Sobaci, Cihat; Sensoy, Sadri; Alali, Fatih
We investigate the strength and direction of information flow between exchange rates and stock prices in several emerging countries using the novel concept of effective transfer entropy (an alternative non-linear causality measure) with a symbolic encoding methodology. The analysis shows that before the 2008 crisis, only a low level of interaction existed between these two variables, and exchange rates dominated stock prices in general. During the crisis, strong bidirectional interaction arose. In the post-crisis period, the strong interaction continues to exist and, in general, stock prices dominate exchange rates.
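A minimal sketch of this kind of measure, assuming a simple binary up/down symbolic encoding and the usual definition of effective transfer entropy (raw transfer entropy minus a shuffled-source baseline). The synthetic series below are illustrative stand-ins, not the exchange-rate and stock-price data of the study.

```python
import math
import random
from collections import Counter

def encode(series):
    """Symbolise a series as 1 (up move) / 0 (down or flat move)."""
    return [1 if b > a else 0 for a, b in zip(series, series[1:])]

def transfer_entropy(x, y):
    """Transfer entropy from x to y (bits) over binary symbol sequences."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y_now, x_now)
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    singles_y = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (yn, yc, xc), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yc, xc)]           # p(y_next | y_now, x_now)
        p_cond_self = pairs_yy[(yn, yc)] / singles_y[yc]  # p(y_next | y_now)
        te += p_joint * math.log2(p_cond_full / p_cond_self)
    return te

def effective_te(x, y, shuffles=20, seed=1):
    """Raw TE minus the average TE with the source series shuffled."""
    rng = random.Random(seed)
    bias = 0.0
    for _ in range(shuffles):
        xs = x[:]
        rng.shuffle(xs)
        bias += transfer_entropy(xs, y)
    return transfer_entropy(x, y) - bias / shuffles

# Synthetic example: y follows x with a one-step lag, so x should drive y.
rng = random.Random(0)
x_raw = [rng.gauss(0, 1) for _ in range(3000)]
y_raw = [0.0] + [0.9 * x_raw[i - 1] + 0.1 * rng.gauss(0, 1)
                 for i in range(1, 3000)]
x, y = encode(x_raw), encode(y_raw)
```

On this construction the x-to-y effective transfer entropy should clearly exceed the reverse direction, mirroring how the study reads dominance from the asymmetry of the two flows.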
Smith, T R; Slater, P B
"A new family of migration models belonging to the elimination by aspects family is examined, with the spatial interaction model shown to be a special case. The models have simple forms; they incorporate information flow processes and choice set constraints; they are free of problems raised by the Luce Choice Axiom; and they are capable of generating intransitive flows. Preliminary calibrations using the Continuous Work History Sample [time] series data indicate that the model fits the migration data well, while providing estimates of interstate job message flows. The preliminary calculations also indicate that care is needed in assuming that destination [attractions] are independent of origins." excerpt
Goldburg, C.B.; Lave, L.B.
The 1990 Clean Air Act is aimed at generators larger than 25 MW, as these are the largest polluters. Market incentives give each source an emissions allocation but also flexibility. If a plant has lower emissions than the target, it can sell the 'surplus' emissions as allowances to plants that fail to meet the target. Only a few trades have occurred to date. Market-based incentives should significantly lower the costs of improving environmental quality. However, institutional difficulties currently hamper implementation.
Davey, E.; Matthews, G.
The control room workplace is the location from which all plant operations are supervised and controlled on a shift-to-shift basis. The activities comprising plant operations are structured into a number of work processes, and information is the common currency that is used to convey work requirements, communicate business and operating decisions, specify work practice, and describe the ongoing plant and work status. This paper describes the motivation for and early experience with developing a work process and information flow model of CANDU control room operations, and discusses some of the insights developed from model examination that suggest ways in which changes in control centre work specification, organization of resources, or asset layout could be undertaken to achieve operational improvements. (author)
Ha, Chang Hoon; Kim, Jong Hyun; Seong, Poong Hyun
In the main control room (MCR) of a nuclear power plant (NPP), there are many dynamic information sources for the MCR operator's situation awareness. As the human-machine interface in the MCR advances, the operator's information acquisition, information gathering and decision-making become important for maintaining the effective and safe operation of NPPs. Diagnostic tasks in complex, large-scale systems like NPPs are among the most difficult and mentally demanding for operators. This research investigates the relation between the operator's mental workload and information flow in accident diagnosis tasks. The amount of information flow is quantified using an information flow model and Conant's model, a form of information theory. As mental workload measures, eye blink rate, blink duration, fixation time, number of fixations, and gaze direction are measured during accident diagnosis tasks. Subjective methods such as the NASA-Task Load Index (NASA-TLX) and the Modified Cooper-Harper (MCH) method are also used in the experiment. It is shown that the operator's mental workload is significantly related to the information flow of the diagnosis task. This makes it possible to predict mental workload from the quantity of information flow in a system.
Pawlina, G.; Renneboog, L.D.R.
We investigate the investment-cash flow sensitivity of a large sample of UK listed firms and confirm that investment is strongly cash flow-sensitive. Is this suboptimal investment policy the result of agency problems, when managers with high discretion overinvest, or of asymmetric information, when
Nandi, Anjan K; Sumana, Annagiri; Bhattacharya, Kunal
Social insects provide an excellent platform to investigate flow of information in regulatory systems since their successful social organization is essentially achieved by effective information transfer through complex connectivity patterns among the colony members. Network representation of such behavioural interactions offers a powerful tool for structural as well as dynamical analysis of the underlying regulatory systems. In this paper, we focus on the dominance interaction networks in the tropical social wasp Ropalidia marginata, a species where behavioural observations indicate that such interactions are principally responsible for the transfer of information between individuals about their colony needs, resulting in a regulation of their own activities. Our research reveals that the dominance networks of R. marginata are structurally similar to a class of naturally evolved information processing networks, a fact confirmed also by the predominance of a specific substructure, the 'feed-forward loop', a key functional component in many other information transfer networks. The dynamical analysis through Boolean modelling confirms that the networks are sufficiently stable under small fluctuations and yet capable of more efficient information transfer compared to their randomized counterparts. Our results suggest the involvement of a common structural design principle in different biological regulatory systems and a possible similarity with respect to the effect of selection on the organization levels of such systems. The findings are also consistent with the hypothesis that dominance behaviour has been shaped by natural selection to co-opt the information transfer process in such social insect species, in addition to its primal function of mediation of reproductive competition in the colony. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
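The 'feed-forward loop' substructure mentioned above has a simple formal definition: an ordered triple of distinct nodes x, y, z with directed edges x→y, y→z and x→z. A brute-force count over a toy directed network (the edges are hypothetical, not the wasps' dominance data) might look like:

```python
# Count feed-forward loops (FFLs) in a directed network given as an edge list.
# An FFL is an ordered triple (x, y, z) with edges x->y, y->z and x->z.

def count_ffl(edges):
    edge_set = set(edges)
    nodes = {n for e in edges for n in e}
    return sum(
        1
        for x in nodes for y in nodes for z in nodes
        if len({x, y, z}) == 3
        and (x, y) in edge_set and (y, z) in edge_set and (x, z) in edge_set
    )

# Toy network: a->b->c plus the shortcut a->c forms exactly one FFL.
toy = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
```

Motif-detection tools used in studies like this one compare such counts against randomized networks with the same degree sequence to decide whether the motif is over-represented.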
Engels, M M A; Yu, M; Stam, C J; Gouw, A A; van der Flier, W M; Scheltens, Ph; van Straaten, E C W; Hillebrand, A
In a recent magnetoencephalography (MEG) study, we found posterior-to-anterior information flow over the cortex in higher frequency bands in healthy subjects, with a reversed pattern in the theta band. A disruption of information flow may underlie clinical symptoms in Alzheimer's disease (AD). In AD, highly connected regions (hubs) in posterior areas are mostly disrupted. We therefore hypothesized that in AD the information flow from these hub regions would be disturbed. We used resting-state MEG recordings from 27 early-onset AD patients and 26 healthy controls. Using beamformer-based virtual electrodes, we estimated neuronal oscillatory activity for 78 cortical regions of interest (ROIs) and 12 subcortical ROIs of the AAL atlas, and calculated the directed phase transfer entropy (dPTE) as a measure of information flow between these ROIs. Group differences were evaluated using permutation tests and, for the AD group, associations between dPTE and general cognition or CSF biomarkers were determined using Spearman correlation coefficients. We confirmed the previously reported posterior-to-anterior information flow in the higher frequency bands in the healthy controls, and found it to be disturbed in the beta band in AD. Most prominently, the information flow from the precuneus and the visual cortex, towards frontal and subcortical structures, was decreased in AD. These disruptions did not correlate with cognitive impairment or CSF biomarkers. We conclude that AD pathology may affect the flow of information between brain regions, particularly from posterior hub regions, and that changes in the information flow in the beta band indicate an aspect of the pathophysiological process in AD.
Existing oil reservoirs might be more fully exploited if the properties of the flow of oil and water in porous media were better known. In laboratory experiments it is important to collect as much information as possible to make a descriptive model of the system, including position imaging and chemical binding information. This thesis develops nuclear methods for obtaining position image and chemical binding information from flow experiments of porous media. A combined positron emission tomography and single photon emission computed tomography system to obtain position images, and a time-differential perturbed angular correlation system to obtain chemical binding information, have been built and thoroughly tested. 68 refs., 123 figs., 14 tabs.
Cliff C. Kerr
The basal ganglia play a crucial role in the execution of movements, as demonstrated by the severe motor deficits that accompany Parkinson's disease (PD). Since motor commands originate in the cortex, an important question is how the basal ganglia influence cortical information flow, and how this influence becomes pathological in PD. To explore this, we developed a composite neuronal network/neural field model. The network model consisted of 4950 spiking neurons, divided into 15 excitatory and inhibitory cell populations in the thalamus and cortex. The field model consisted of the cortex, thalamus, striatum, subthalamic nucleus, and globus pallidus. Both models have been separately validated in previous work. Three field models were used: one with basal ganglia parameters based on data from healthy individuals, one based on data from individuals with PD, and one purely thalamocortical model. Spikes generated by these field models were then used to drive the network model. Compared to the network driven by the healthy model, the PD-driven network had lower firing rates, a shift in spectral power towards lower frequencies, and a higher probability of bursting; each of these findings is consistent with empirical data on PD. In the healthy model, we found strong Granger causality in the beta and low gamma bands between cortical layers, but this was largely absent in the PD model. In particular, the reduction in Granger causality from the main "input" layer of the cortex (layer 4) to the main "output" layer (layer 5) was pronounced. This may account for symptoms of PD that seem to reflect deficits in information flow, such as bradykinesia. In general, these results demonstrate that the brain's large-scale oscillatory environment, represented here by the field model, strongly influences the information processing that occurs within its subnetworks. Hence, it may be preferable to drive spiking network models with physiologically realistic inputs rather than
For satisfactory traffic management in an intelligent transport system, it is vital that traffic microwave radar detectors (TMRDs) can provide real-time traffic information with high accuracy. In this study, we develop several information-aided smart schemes to improve the traffic detection of TMRDs in multiple-lane environments. Specifically, we select appropriate thresholds not only for removing noise from fast Fourier transforms (FFTs) of regional lane contexts, but also for reducing FFT side lobes within each lane. The resulting FFTs of reflected vehicle signals and those of clutter are thereby distinguishable. We exploit FFT and lane- and/or timestamp-related information to develop smart schemes that mitigate the adverse effects of lane-crossing FFT side lobes of a vehicle signal. As such, the proposed schemes can enhance the detection accuracy of both lane vehicle flow and directional traffic volume. On-site experimental results demonstrate the advantages and feasibility of the proposed methods, and suggest the best smart scheme.
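The spectral thresholding idea can be sketched as follows: compute a spectrum, then zero out bins below a level set relative to the strongest peak, so that noise and side lobes do not register as detections. The signal, the tone bins and the threshold factor below are hypothetical, and a naive DFT stands in for the detector's FFT.

```python
import cmath
import math

# Illustrative magnitude-threshold cleanup of a spectrum: only bins within a
# chosen factor of the peak survive, suppressing noise and weak side lobes.

def dft_magnitudes(samples):
    """Naive DFT magnitudes for the first half of the spectrum."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

def threshold_spectrum(mags, rel=0.2):
    """Keep only bins within a factor `rel` of the peak magnitude."""
    floor = rel * max(mags)
    return [m if m >= floor else 0.0 for m in mags]

n = 64
tone_bin = 7  # hypothetical Doppler bin of a reflected vehicle signal
signal = [math.cos(2 * math.pi * tone_bin * t / n)       # strong target tone
          + 0.05 * math.cos(2 * math.pi * 23 * t / n)    # weak clutter tone
          for t in range(n)]
mags = dft_magnitudes(signal)
cleaned = threshold_spectrum(mags)
```

After thresholding, only the strong target bin survives, while the weak clutter bin is suppressed; the lane- and timestamp-aided refinements in the study build on this kind of cleaned spectrum.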
The analysis of data from electronic health records aspires to facilitate healthcare efficiencies and biomedical innovation. There are also ethical, legal and social implications arising from the handling of sensitive patient information. The paper explores the concerns, expectations and implications of the National Health Service (NHS) England care.data programme: a national data sharing initiative of linked electronic health records for healthcare and other research purposes. Using Nissenbaum's contextual integrity of privacy framework through a critical Science and Technology Studies (STS) lens, it examines the way technologies and policies are developed to promote sustainability, governance and economic growth as the de facto social values, while reducing privacy to an individualistic preference. The state, acting as a new, central data broker, reappropriates public ownership rights and establishes those information flows and transmission principles that facilitate the assetisation of NHS datasets for the knowledge economy. Various actors and processes from other contexts attempt to erode the public healthcare sector and privilege new information recipients. However, such data sharing initiatives in healthcare will be resisted if we continue to focus only on the monetary and scientific values of these datasets and keep ignoring their equally important social and ethical values.
Wang, Shiqin; Shao, Jingli; Song, Xianfang; Zhang, Yongbo; Huo, Zhibin; Zhou, Xiaoyuan
MODFLOW is a groundwater modeling program. It can be compiled and modified according to the practical application. Because of its structure and fixed data format, MODFLOW can be integrated with Geographic Information Systems (GIS) technology for water resource management. The North China Plain (NCP), which is the political, economic and cultural center of China, is facing water resource shortages and water pollution. Groundwater is the main water resource for industrial, agricultural and domestic usage. It is necessary to evaluate the groundwater resources of the NCP as an entire aquifer system. With the development of computer and internet information technology, it is also necessary to integrate the groundwater model with GIS technology. Because the geological and hydrogeological data in the NCP were mainly in MAPGIS format, the powerful spatial data processing and analysis functions of GIS and computer languages such as Visual C and Visual Basic were used to define the relationship between the original data and the model data. After analyzing the geological and hydrogeological conditions of the NCP, a numerical groundwater flow simulation model was constructed with MODFLOW. On the basis of GIS, a dynamic evaluation system for groundwater resources in an internet environment was completed. During the process of constructing the groundwater model, a water budget was analyzed, which showed a negative budget in the NCP. The simulation period was from 1 January 2002 to 31 December 2003. During this period, the total recharge of the groundwater system was 49,374 × 10⁶ m³ and the total discharge was 56,530 × 10⁶ m³; the budget deficit was -7,156 × 10⁶ m³. In this integrated system, the original data, including graphs and attribute data, could be stored in the database. When the process of evaluating and predicting groundwater flow was started, these data were transformed into files that the core program of MODFLOW could read. The calculated water
Mário S. Alvim
Full Text Available In the inference attacks studied in Quantitative Information Flow (QIF), the attacker typically tries to interfere with the system in an attempt to increase its leakage of secret information. The defender, on the other hand, typically tries to decrease leakage by introducing some controlled noise. This noise introduction can be modeled as a type of protocol composition, i.e., a probabilistic choice among different protocols, and its effect on the amount of leakage depends heavily on whether or not this choice is visible to the attacker. In this work, we consider operators for modeling visible and hidden choice in protocol composition, and we study their algebraic properties. We then formalize the interplay between defender and attacker in a game-theoretic framework adapted to the specific issues of QIF, where the payoff is information leakage. We consider various kinds of leakage games, depending on whether players act simultaneously or sequentially, and on whether or not the choices of the defender are visible to the attacker. In the case of sequential games, the choice of the second player is generally a function of the choice of the first player, and his/her probabilistic choice can be either over the possible functions (mixed strategy) or over the result of the function (behavioral strategy). We show that when the attacker moves first in a sequential game with hidden choice, behavioral strategies are more advantageous for the defender than mixed strategies. This contrasts with standard game theory, where the two types of strategies are equivalent. Finally, we establish a hierarchy of these games in terms of their information leakage and provide methods for finding optimal strategies (at the points of equilibrium) for both attacker and defender in the various cases.
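The visible/hidden distinction can be made concrete with a tiny numerical example using Bayes vulnerability, a standard QIF leakage measure. The two channels below are illustrative choices, not taken from the paper: each reveals the secret on its own, yet hiding which one ran destroys the leakage.

```python
import numpy as np

# Channels as stochastic matrices: rows are secrets, columns observables,
# entries P(observable | secret). Both channels are fully leaking alone.
prior = np.array([0.5, 0.5])
C1 = np.array([[1.0, 0.0],
               [0.0, 1.0]])   # reveals the secret directly
C2 = np.array([[0.0, 1.0],
               [1.0, 0.0]])   # reveals it with flipped labels

def posterior_vulnerability(prior, C):
    # V = sum over observables of max_x prior(x) * C(x, y):
    # the attacker's best guessing probability after seeing the output.
    joint = prior[:, None] * C
    return joint.max(axis=0).sum()

p = 0.5
# Visible choice: the attacker knows which channel ran, so the
# posterior vulnerabilities simply average.
visible = p * posterior_vulnerability(prior, C1) + (1 - p) * posterior_vulnerability(prior, C2)
# Hidden choice: the attacker only sees the output of the mixture channel.
hidden = posterior_vulnerability(prior, p * C1 + (1 - p) * C2)
```

Here `visible` is 1.0 (the secret is always learned) while `hidden` equals the prior vulnerability 0.5 (nothing is learned), illustrating why hiding the defender's choice can only help the defender.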
Zahran, Sammy; Tavani, Daniele; Weiler, Stephan
Casualties from natural disasters may depend on the day of the week they strike. With data from the Spatial Hazard Events and Losses Database for the United States (SHELDUS), daily variation in hurricane and tornado casualties from 5,043 tornado and 2,455 hurricane time/place events is analyzed. Hurricane forecasts provide at-risk populations with considerable lead time. Such lead time allows strategic behavior in choosing protective measures under hurricane threat; opportunity costs in terms of lost income are higher during weekdays than during weekends. On the other hand, the lead time provided by tornadoes is near zero; hence tornadoes generate no opportunity costs. Tornado casualties are related to risk information flows, which are higher during workdays than during leisure periods, and are related to sheltering-in-place opportunities, which are better in permanent buildings like businesses and schools. Consistent with theoretical expectations, random effects negative binomial regression results indicate that tornado events occurring on the workdays of Monday through Thursday are significantly less lethal than tornadoes that occur on weekends. In direct contrast, and also consistent with theory, the expected count of hurricane casualties increases significantly with weekday occurrences. The policy implications of observed daily variation in tornado and hurricane events are considered. © 2012 Society for Risk Analysis.
Lu, Mujie; Shang, Wenjie; Ji, Xinkai; Hua, Mingzhuang; Cheng, Kuo
Nowadays, intelligent transportation systems (ITS) have already become the new direction of transportation development. Traffic data, as a fundamental part of an intelligent transportation system, has an increasingly crucial status. In recent years, video observation technology has been widely used in the field of traffic information collection. Traffic flow information contained in video data has many advantages: it is comprehensive and can be stored for a long time. However, there are still many problems, such as low precision and high cost, in the process of collecting information. Aiming at these problems, this paper proposes a traffic target detection method with broad applicability. Based on three different ways of getting video data, namely aerial photography, fixed cameras and handheld cameras, we develop intelligent analysis software which can be used to extract the macroscopic and microscopic traffic flow information in the video, and this information can be used for traffic analysis and transportation planning. For road intersections, the system uses the frame difference method to extract traffic information; for freeway sections, the system uses the optical flow method to track vehicles. The system was applied in Nanjing, Jiangsu province, and the application shows that the system extracts different types of traffic flow information with high accuracy; it can meet the needs of traffic engineering observations and has a good application prospect.
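The frame difference method used at intersections can be sketched in a few lines. This is a minimal illustration with a synthetic frame pair and an assumed intensity threshold; the deployed system's parameters are not reproduced here.

```python
import numpy as np

# Minimal frame-differencing sketch: pixels whose grayscale intensity
# changes by more than `threshold` between consecutive frames are
# flagged as moving (e.g., a vehicle entering the scene).
def moving_mask(prev_frame, curr_frame, threshold=25):
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

prev = np.zeros((4, 4), dtype=np.uint8)   # empty road
curr = prev.copy()
curr[1:3, 1:3] = 200                      # a bright 2x2 "vehicle" appears
mask = moving_mask(prev, curr)
```

The cast to a signed type before subtracting avoids uint8 wraparound; in practice the binary mask would then be cleaned with morphological filtering before counting vehicles.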
Full Text Available Abstract Background We study root cells from the model plant Arabidopsis thaliana and the communication channel formed by the ethylene signal transduction pathway. A basic equation taken from our previous work relates the probability of expression of the gene ERF1 to the concentration of ethylene. Results The above equation is used to compute the Shannon entropy (H), or degree of uncertainty, that the genetic machinery has during the decoding of the message encoded by the ethylene-specific receptors embedded in the endoplasmic reticulum membrane and transmitted into the nucleus by the ethylene signaling pathway. We show that the amount of information associated with the expression of the master gene ERF1 (Ethylene Response Factor 1) can be computed. Then we examine the system response to sinusoidal input signals with varying frequencies to determine if the cell can distinguish between different regimes of information flow from the environment. Our results demonstrate that the amount of information managed by the root cell can be correlated with the frequency of the input signal. Conclusion The ethylene signaling pathway cuts off very low and very high frequencies, allowing a window of frequency response in which the nucleus reads the incoming message as a sinusoidal input. Outside this window the nucleus reads the input message as an approximately non-varying one. From this frequency response analysis we estimate: (a) the gain of the system during the synthesis of the protein ERF1 (~ -5.6 dB); (b) the rate of information transfer (0.003 bits) during the transport of each new ERF1 molecule into the nucleus; and (c) the time of synthesis of each new ERF1 molecule (~21.3 s). Finally, we demonstrate that in the case of the system of a single master gene (ERF1) and a single slave gene (HLS1), the total Shannon entropy is completely determined by the uncertainty associated with the expression of the master gene. A second proposition shows that the Shannon entropy
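The entropy computation at the heart of this analysis reduces, for a binary expression event, to the standard binary Shannon entropy. The sketch below assumes the expression probability p is already known (in the paper it comes from the authors' dose-response equation, which is not reproduced here).

```python
import math

# Shannon entropy (in bits) of a binary gene-expression event:
# p is the probability that the gene (e.g., ERF1) is expressed.
def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0          # no uncertainty at the extremes
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
```

Uncertainty is maximal (1 bit) at p = 0.5 and falls to zero as the ethylene concentration drives expression toward certainty in either direction.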
Jan H. Havenga
Full Text Available This article presents the results of a comprehensive disaggregated commodity flow model for South Africa. The wealth of data available enables a segmented analysis of future freight transportation demand in order to assist with the prioritisation of transportation investments, the development of transport policy and the growth of the logistics service provider industry. In 2011, economic demand for commodities in South Africa's competitive surface-freight transport market amounted to 622 million tons and is predicted to increase to 1,834 million tons by 2041, which is a compound annual growth rate of 3.67%. Fifty percent of corridor freight constitutes break bulk; intermodal solutions are therefore critical in South Africa. Scenario analysis indicates that 80% of corridor break-bulk tons can be serviced by four intermodal facilities – in Gauteng, Durban, Cape Town and Port Elizabeth. This would allow for the development of an investment planning hierarchy, enable industry targeting (through commodity visibility), ensure capacity development ahead of demand and lower the cost of logistics in South Africa.
... advance of allowance. (a) Allowance. Step 2+3 and Step 3 grant agreements will include an allowance for facilities planning and design of the project and Step 7 agreements will include an allowance for facility... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Allowance and advance of allowance. 35...
Full Text Available The world has changed a lot in the past years. The rapid advances in technology and the changing communication channels have changed the way people work and, for many, where they work from. The Internet and mobile technology, the two most dynamic technological forces in modern information and communications technology (ICT), are converging into one ubiquitous mobile Internet service, which will change our way of both doing business and dealing with our daily routine activities. As the use of ICT expands globally, there is a need for further research into the cultural aspects and implications of ICT. The acceptance of Information Technology (IT) has become a fundamental part of the research plan for most organizations (Igbaria 1993). In IT research, numerous theories are used to understand users' adoption of new technologies. Various models have been developed, including the Technology Acceptance Model, the Theory of Reasoned Action, the Theory of Planned Behavior, and, recently, the Unified Theory of Acceptance and Use of Technology (UTAUT). Each of these models has sought to identify the factors which influence a citizen's intention or actual use of information technology. Drawing on the UTAUT model and Flow Theory, this research composes a new hybrid theoretical framework to identify the factors affecting the acceptance and use of Mobile Internet, as an ICT application, in a consumer context. The proposed model incorporates eight constructs: Performance Expectancy, Effort Expectancy, Facilitating Conditions, Social Influences, Perceived Value, Perceived Playfulness, Attention Focus, and Behavioral Intention. Data collected online from 238 respondents in Saudi Arabia were tested against the research model using the structural equation modeling approach. The proposed model was mostly supported by the empirical data. The findings of this study provide several crucial implications for ICT and, in particular, mobile Internet service practitioners and researchers
... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Allowable costs. 85.41 Section 85.41... Use/Acceptance of Funds § 85.41 Allowable costs. (a) Allowable grant costs are limited to those costs... applicable Federal cost principles in 43 CFR 12.60(b). Purchase of informational signs, program signs, and...
Full Text Available In the literature, positive investment cash flow sensitivity is attributed to either asymmetric-information-induced financing constraints or the agency costs of free cash flow. Using data from a sample of 68 manufacturing firms listed on the South African JSE, this paper contributes to the literature by investigating the source of investment cash flow sensitivity. We have found that asymmetric information explains the positive investment cash flow sensitivity better than agency costs. Furthermore, asymmetric information has been observed to be more pronounced in low-dividend-paying firms and small firms. Despite South Africa's having a developed financial system by international standards, small firms are seen to be financially constrained. We attribute the absence of agency-cost-driven investment cash flow sensitivity to the good corporate governance of South African listed firms. Thus the paper provides further evidence in support of the proposition in the literature that the source of investment cash flow sensitivity may depend on the institutional setting of a country, such as its corporate governance.
Overgaard, Anders; Kallesøe, Carsten Skovmose; Bendtsen, Jan Dimon
... access to data, the aim is to create a data-driven model for control. Due to the large amount of available data, an input-selection method called "Partial Mutual Information" (PMI) is applied. This article introduces a method for including flow-variable delays in PMI. Data from an office building in Bjerringbro are used for the analysis. It is shown that "Mutual Information" and a "Generalized Regression Neural Network" are both improved by using the flow-variable delay compared with using constant delays.
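The core mutual-information step behind PMI-style input selection can be sketched as follows. This is a simplified illustration on synthetic data with a plain histogram MI estimator and a fixed (not flow-variable) delay search; it is not the authors' method.

```python
import numpy as np

# Pick the input lag that maximizes mutual information with the output.
def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
u = rng.standard_normal(5000)                    # synthetic input signal
true_delay = 3
y = np.roll(u, true_delay) + 0.1 * rng.standard_normal(u.size)  # delayed response

candidate_delays = range(0, 8)
mi = [mutual_information(np.roll(u, d), y) for d in candidate_delays]
best_delay = int(np.argmax(mi))
```

Because the input is white noise, MI is near zero at every misaligned lag and peaks sharply at the true delay of 3 samples; the article's contribution is to let this delay vary with the measured flow rather than stay constant.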
Miller, Robyn L; Vergara, Victor M; Calhoun, Vince D
Neuroscientists and clinical researchers are awash in data from an ever-growing number of imaging and other bio-behavioral modalities. This flow of brain imaging data, taken under resting and various task conditions, combines with available cognitive measures, behavioral information, genetic data plus other potentially salient biomedical and environmental information to create a rich but diffuse data landscape. The conditions being studied with brain imaging data are often extremely complex and it is common for researchers to employ more than one imaging, behavioral or biological data modality (e.g., genetics) in their investigations. While the field has advanced significantly in its approach to multimodal data, the vast majority of studies still ignore joint information among two or more features or modalities. We propose an intuitive framework based on conditional probabilities for understanding information exchange between features in what we are calling a feature meta-space; that is, a space consisting of many individual feature spaces. Features can have any dimension and can be drawn from any data source or modality. No a priori assumptions are made about the functional form (e.g., linear, polynomial, exponential) of captured inter-feature relationships. We demonstrate the framework's ability to identify relationships between disparate features of varying dimensionality by applying it to a large multi-site, multi-modal clinical dataset, balanced between schizophrenia patients and controls. In our application it exposes both expected (previously observed) relationships, and novel relationships rarely investigated by clinical researchers. To the best of our knowledge there is presently no comparably efficient way to capture relationships of indeterminate functional form between features of arbitrary dimension and type. We are introducing this method as an initial foray into a space that remains relatively underpopulated. The framework we propose is
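The conditional-probability view of inter-feature relationships can be sketched with a small contingency-table estimate. The data and binning below are synthetic and illustrative, not the paper's clinical dataset; the point is that no functional form is assumed.

```python
import numpy as np

# Discretize two features and estimate P(b-bin | a-bin) from counts.
rng = np.random.default_rng(2)
a = rng.uniform(-1.0, 1.0, 10000)                 # feature A
b = a ** 2 + 0.05 * rng.standard_normal(a.size)   # nonlinearly related feature B

bins = 8
counts, a_edges, b_edges = np.histogram2d(a, b, bins=bins)
# Each row is the conditional distribution of B given A's bin.
p_b_given_a = counts / counts.sum(axis=1, keepdims=True)
```

A linear-correlation screen would largely miss the A-B dependence here (the relationship is symmetric in A), but the conditional table captures it: rows for extreme A-bins concentrate on high-B bins while rows near A = 0 concentrate on low-B bins.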
Sinnock, S.; Hartman, H.A.
Prototypes of information management tools have been developed that can help communicate the technical basis for nuclear waste disposal to a broad audience of program scientists and engineers, project managers, and informed observers from stakeholder organizations. These tools include system engineering concepts, parameter networks expressed as influence diagrams, associated model hierarchies, and a relational database. These tools are used to express relationships among data-collection parameters, model input parameters, model output parameters, systems requirements, physical elements of a system description, and functional analysis of the contribution of physical elements and their associated parameters in satisfying the system requirements. By organizing parameters, models, physical elements, functions, and requirements in a visually reviewable network and a relational database the severe communication challenges facing participants in the nuclear waste dialog can be addressed. The network identifies the influences that data collected in the field have on measures of repository suitability, providing a visual, traceable map that clarifies the role of data and models in supporting conclusions about repository suitability. The map allows conclusions to be traced directly to the underlying parameters and models. Uncertainty in these underlying elements can be exposed to open review in the context of the effects uncertainty has on judgements about suitability. A parameter network provides a stage upon which an informed social dialog about the technical merits of a nuclear waste repository can be conducted. The basis for such dialog must be that stage, if decisions about repository suitability are to be based on a repository's ability to meet requirements embodied in laws and regulations governing disposal of nuclear wastes
Robert A Gatenby
Full Text Available Normal cell function requires timely and accurate transmission of information from receptors on the cell membrane (CM) to the nucleus. Movement of messenger proteins in the cytoplasm is thought to be dependent on random walk. However, Brownian motion will disperse messenger proteins throughout the cytosol, resulting in slow and highly variable transit times. We propose that a critical component of information transfer is an intracellular electric field generated by the distribution of charge on the nuclear membrane (NM). While the latter has been demonstrated experimentally for decades, the role of the consequent electric field has been assumed to be minimal due to a Debye length of about 1 nanometer that results from screening by intracellular Cl- and K+. We propose that inclusion of these inorganic ions in the Debye-Huckel equation is incorrect because nuclear pores allow transit through the membrane at a rate far faster than the time to thermodynamic equilibrium. In our model, only the charged, mobile messenger proteins contribute to the Debye length. Using this revised model and published data, we estimate the NM possesses a Debye-Huckel length of a few microns and find this is consistent with recent measurement using intracellular nano-voltmeters. We demonstrate the field will accelerate isolated messenger proteins toward the nucleus through Coulomb interactions with negative charges added by phosphorylation. We calculate transit times as short as 0.01 sec. When large numbers of phosphorylated messenger proteins are generated by increasing concentrations of extracellular ligands, we demonstrate they generate a self-screening environment that regionally attenuates the cytoplasmic field, slowing movement but permitting greater cross talk among pathways. Preliminary experimental results with phosphorylated RAF are consistent with model predictions. This work demonstrates that previously unrecognized Coulomb interactions between phosphorylated messenger
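The screening argument turns on the textbook Debye-Huckel length (standard form, not quoted from the paper):

```latex
\lambda_D = \sqrt{\frac{\varepsilon\, k_B T}{\sum_i n_i q_i^2}}
```

where $\varepsilon$ is the permittivity, $k_B T$ the thermal energy, and the sum runs over the number densities $n_i$ and charges $q_i$ of the mobile charged species. Dropping the high-concentration inorganic ions (Cl-, K+) from the sum, as the authors argue one should, shrinks the denominator by orders of magnitude, which is what lets $\lambda_D$ grow from roughly a nanometer to the micron scale they estimate.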
Frank, Philipp; Schreiter, Joerg; Haefner, Sebastian; Paschew, Georgi; Voigt, Andreas; Richter, Andreas
Microfluidics is a great enabling technology for biology, biotechnology, chemistry and general life sciences. Despite many promising predictions of its progress, microfluidics has not reached its full potential yet. To unleash this potential, we propose the use of intrinsically active hydrogels, which work as sensors and actuators at the same time, in microfluidic channel networks. These materials transfer a chemical input signal, such as a substance concentration, into a mechanical output. This way chemical information is processed and analyzed on the spot without the need for an external control unit. Inspired by the development of electronics, our approach focuses on the development of single transistor-like components, which have the potential to be used in an integrated circuit technology. Here, we present a membrane-isolated chemical volume phase transition transistor (MIS-CVPT). The device is characterized in terms of the flow rate from source to drain, depending on the chemical concentration in the control channel, the source-drain pressure drop and the operating temperature. PMID:27571209
D. Ju. Chaly
Full Text Available Software-defined networks (SDN) are a novel networking paradigm that has become an enabler technology for many modern applications such as network virtualization, policy-based access control and many others. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature. In this connection there is an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant are leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model in which reasoning is possible about confidentiality, and we can check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended in order to reason about networks with secure and insecure links, which can arise, for example, in wireless environments. The article is published in the authors' wording.
Full Text Available Using the scalp time-varying network method, the present study is the first to investigate, at the network level, the temporal influence of the reference on N170, a negative event-related potential (ERP) component appearing at about 170 ms that is elicited by facial recognition. Two kinds of scalp electroencephalogram (EEG) references, namely AR (average of all recording channels) and the reference electrode standardization technique (REST), were comparatively investigated via the time-varying processing of N170. Results showed that the latency and amplitude of N170 were significantly different between REST and AR, with the former being earlier and smaller. In particular, the information flow from right temporal-parietal P8 to left P7 in the time-varying network was earlier in REST than in AR, and this phenomenon was reproduced by simulation, in which the performance of REST was closer to the true case at the source level. These findings indicate that the reference plays a crucial role in ERP data interpretation, and, importantly, the newly developed approximate zero-reference REST would be a superior choice for precise evaluation of the scalp spatio-temporal changes relating to various cognitive events.
Bilek, Edda; Stößel, Gabriela; Schäfer, Axel; Clement, Laura; Ruf, Matthias; Robnik, Lydia; Neukel, Corinne; Tost, Heike; Kirsch, Peter; Meyer-Lindenberg, Andreas
Although borderline personality disorder (BPD), one of the most common, burdensome, and costly psychiatric conditions, is characterized by repeated interpersonal conflict and unstable relationships, the neurobiological mechanism of social interactive deficits remains poorly understood. To apply recent advancements in the investigation of 2-person human social interaction to investigate interaction difficulties among people with BPD. Cross-brain information flow in BPD was examined from May 25, 2012, to December 4, 2015, in pairs of participants studied in 2 linked functional magnetic resonance imaging scanners in a university setting. Participants performed a joint attention task. Each pair included a healthy control individual (HC) and either a patient currently fulfilling DSM-IV criteria for BPD (cBPD) (n = 23), a patient in remission for 2 years or more (rBPD) (n = 17), or a second HC (n = 20). Groups were matched for age and educational level. A measure of cross-brain neural coupling was computed following previously published work to indicate synchronized flow between right temporoparietal junction networks (previously shown to host neural coupling abilities in health). This measure is derived from an independent component analysis contrasting the time courses of components between pairs of truly interacting participants compared with bootstrapped control pairs. In the sample including 23 women with cBPD (mean [SD] age, 26.8 [5.7] years), 17 women with rBPD (mean [SD] age, 28.5 [4.3] years), and 80 HCs (mean [SD] age, 24.0 [3.4] years) investigated as dyads, neural coupling was found to be associated with disorder state (η2 = 0.17; P = .007): while HC-HC pairs showed synchronized neural responses, cBPD-HC pairs exhibited significantly lower neural coupling just above permutation-based data levels (η2 = 0.16; P = .009). No difference was found between neural coupling in rBPD-HC and HC-HC pairs. The neural coupling in patients was
Proceedings of the ASIST Annual Meeting, 2001
Topics of Poster Presentations include: electronic preprints; intranets; poster session abstracts; metadata; information retrieval; watermark images; video games; distributed information retrieval; subject domain knowledge; data mining; information theory; course development; historians' use of pictorial images; information retrieval software;…
OS level, Flume has even been shown to be information flow secure through abstractions such as processes, pipes, file systems etc., while seL4 ... Andronick, D. Cock, P. Derrin, D. Elkaduwe, K. Engelhardt, R. Kolanski, M. Norrish, T. Sewell, H. Tuch, and S. Winwood. seL4: formal verification of an
Full Text Available People increasingly can and want to obtain and generate health information themselves. With the increasing do-it-yourself sentiment comes also the desire to be more involved in one's health care decisions. Patient-driven health care and health research models are emerging; terms such as participatory medicine and quantified self are increasingly visible. Given the health consumer's desire to be more involved in health data generation and health care decision-making processes, the authors submit that it is important to be health policy literate: to understand how health policies are developed, what themes are discussed among health policy researchers and policy makers, and how one's demands would be discussed within health policy discourses. The public increasingly obtains its knowledge through the internet by searching web browsers for keywords. The question is whether the "health consumer" to come has knowledge of the key terms defining key health policy discourses, which would enable them to perform targeted searches for health policy literature relevant to their situation. The authors found that key health policy terms are virtually absent from printed and online news media, which raises the question of how the "health consumer" might learn about the key health policy terms needed for web-based searches that would allow the "health consumer" to access health policy discourses relevant to them.
Yu, Lianchun; De Mazancourt, Marine; Hess, Agathe; Ashadi, Fakhrul R; Klein, Isabelle; Mal, Hervé; Courbage, Maurice; Mangin, Laurence
Breathing involves a complex interplay between the brainstem automatic network and cortical voluntary command. How these brain regions communicate at rest or during inspiratory loading is unknown. This issue is crucial for several reasons: (i) increased respiratory loading is a major feature of several respiratory diseases, (ii) failure of the voluntary motor and cortical sensory processing drives is among the mechanisms that precede acute respiratory failure, (iii) several cerebral structures involved in responding to inspiratory loading participate in the perception of dyspnea, a distressing symptom in many diseases. We studied functional connectivity and Granger causality of the respiratory network in controls and patients with chronic obstructive pulmonary disease (COPD), at rest and during inspiratory loading. Compared with those of controls, the motor cortex area of patients exhibited decreased connectivity with their contralateral counterparts and no connectivity with the brainstem. In the patients, the information flow was reversed at rest, with the source of the network shifted from the medulla towards the motor cortex. During inspiratory loading, the system was overwhelmed and the motor cortex became the sink of the network. This major finding may help to understand why some patients with COPD are prone to acute respiratory failure. Network connectivity and causality were related to lung function and illness severity. We validated our connectivity and causality results with a mathematical model of neural network. Our findings suggest a new therapeutic strategy involving the modulation of brain activity to increase motor cortex functional connectivity and improve respiratory muscle performance in patients. Hum Brain Mapp 37:2736-2754, 2016. © 2016 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc.
Preparing a Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) compliant manuscript using the International Society for Advancement of Cytometry (ISAC) FCS file repository (FlowRepository.org).
Spidlen, Josef; Breuer, Karin; Brinkman, Ryan
FlowRepository.org is a Web-based flow cytometry data repository provided by the International Society for Advancement of Cytometry (ISAC). It supports storage, annotation, analysis, and sharing of flow cytometry datasets. A fundamental tenet of scientific research is that published results should be open to independent validation and refutation. With FlowRepository, researchers can annotate their datasets in compliance with the Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard, thus greatly facilitating third-party interpretation of their data. In this unit, we will mainly focus on the deposition, sharing, and annotation of flow cytometry data.
Kolodny, Michael A.
Today's battlefield space is extremely complex, dealing with an enemy that is neither well-defined nor well-understood. Adversaries are comprised of widely-distributed, loosely-networked groups engaging in nefarious activities. Situational understanding is needed by decision makers; understanding of adversarial capabilities and intent is essential. Information needed at any time is dependent on the mission/task at hand. Information sources potentially providing mission-relevant information are disparate and numerous; they include sensors, social networks, fusion engines, internet, etc. Management of these multi-dimensional informational sources is critical. This paper will present a new approach being undertaken to answer the challenge of enhancing battlefield understanding by optimizing the utilization of available informational sources (means) for required missions/tasks as well as determining the "goodness" of the information acquired in meeting the capabilities needed. Requirements are usually expressed in terms of a presumed technology solution (e.g., imagery). A metaphor of the "magic rabbits" was conceived to remove presumed technology solutions from requirements by claiming the "required" technology is obsolete. Instead, intelligent "magic rabbits" are used to provide needed information. The question then becomes: "WHAT INFORMATION DO YOU NEED THE RABBITS TO PROVIDE YOU?" This paper will describe a new approach called Mission-Informed Needed Information - Discoverable, Available Sensing Sources (MINI-DASS) that designs a process for building information acquisition missions and determining what the "magic rabbits" need to provide in a manner that is machine understandable. Also described is the Missions and Means Framework (MMF) model used, the process flow utilized, the approach to developing an ontology of information source means and the approach for determining the value of the information acquired.
We investigated the performance of a groundwater flow and solute transport model when different combinations of hydraulic head, seepage flux, and chloride concentration data were used in calibration of the model...
Soch, Joram; Deserno, Lorenz; Assmann, Anne; Barman, Adriana; Walter, Henrik; Richardson-Klavehn, Alan; Schott, Björn H
The default mode network (DMN), a network centered around the cortical midline, shows deactivation during most cognitive tasks and pronounced resting-state connectivity, but is actively engaged in self-reference and social cognition. It is, however, yet unclear how information reaches the DMN during social cognitive processing. Here, we addressed this question using dynamic causal modeling (DCM) of functional magnetic resonance imaging (fMRI) data acquired during self-reference (SR) and reference to others (OR). Both conditions engaged the left inferior frontal gyrus (LIFG), most likely reflecting semantic processing. Within the DMN, self-reference preferentially elicited rostral anterior cingulate and ventromedial prefrontal cortex (rACC/vmPFC) activity, whereas OR engaged posterior cingulate and precuneus (PCC/PreCun). DCM revealed that the regulation of information flow to the DMN was primarily inhibitory. Most prominently, SR elicited inhibited information flow from the LIFG to the PCC/PreCun, while OR was associated with suppression of the connectivity from the LIFG to the rACC/vmPFC. These results suggest that task-related DMN activation is enabled by inhibitory down-regulation of task-irrelevant information flow when switching from rest to stimulus-specific processing. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: firstname.lastname@example.org.
Full Text Available Abstract Background Finding the dominant direction of flow of information in densely interconnected regulatory or signaling networks is required in many applications in computational biology and neuroscience. This is achieved by first identifying and removing links which close up feedback loops in the original network and hierarchically arranging nodes in the remaining network. In mathematical language this corresponds to a problem of making a graph acyclic by removing as few links as possible and thus altering the original graph in the least possible way. The exact solution of this problem requires enumeration of all cycles and combinations of removed links, which, as an NP-hard problem, is computationally prohibitive even for modest-size networks. Results We introduce and compare two approximate numerical algorithms for solving this problem: a probabilistic one based on simulated annealing of the hierarchical layout of the network, which minimizes the number of "backward" links going from lower to higher hierarchical levels, and a deterministic, "greedy" algorithm that sequentially cuts the links that participate in the largest number of feedback cycles. We find that the annealing algorithm outperforms the deterministic one in terms of speed, memory requirement, and the actual number of removed links. To further improve the visual perception of the layout produced by the annealing algorithm, we perform an additional minimization of the length of hierarchical links while keeping the number of anti-hierarchical links at their minimum. The annealing algorithm is then tested on several examples of regulatory and signaling networks/pathways operating in human cells. Conclusion The proposed annealing algorithm is powerful enough to often produce optimal layouts of protein networks of whole organisms, consisting of around ~10^4 nodes and ~10^5 links, while the applicability of the greedy algorithm is limited to individual pathways with ~100
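The annealing approach described above can be sketched in a few lines: assign nodes a tentative hierarchical order, count the links that point "backward" against that order, and accept random swaps under a cooling schedule. This is a minimal illustration of the idea, not the authors' implementation; all names, the cooling schedule, and parameter values are our own assumptions.

```python
import math
import random

def anneal_hierarchy(nodes, edges, steps=20000, t0=2.0, seed=0):
    """Search for a linear (hierarchical) order of nodes that minimizes
    the number of 'backward' links; removing those links makes the
    graph acyclic. Cost is recomputed naively, fine for small graphs."""
    rng = random.Random(seed)
    order = list(nodes)
    rng.shuffle(order)
    pos = {u: i for i, u in enumerate(order)}
    cost = sum(1 for u, v in edges if pos[u] > pos[v])
    for s in range(steps):
        t = t0 * (1 - s / steps) + 1e-9  # linear cooling schedule
        i, j = rng.randrange(len(order)), rng.randrange(len(order))
        a, b = order[i], order[j]
        pos[a], pos[b] = pos[b], pos[a]  # tentatively swap two nodes
        new_cost = sum(1 for u, v in edges if pos[u] > pos[v])
        # Metropolis rule: always accept improvements, sometimes accept worse.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            order[i], order[j] = b, a
            cost = new_cost
        else:
            pos[a], pos[b] = pos[b], pos[a]  # revert the swap
    return order, cost

# One feedback loop (1 -> 2 -> 3 -> 1): exactly one link must be removed.
order, n_backward = anneal_hierarchy(range(4), [(0, 1), (1, 2), (2, 3), (3, 1)])
```

In a layout found this way, the remaining forward links define the hierarchy, and the `n_backward` anti-hierarchical links are the ones to cut.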
In the first part of this work we study a family of deterministic models for highway traffic flow which generalize cellular automaton rule 184. This family is parameterized by the speed limit m and another parameter k that represents the degree of 'anticipatory driving'. We compare two driving strategies with identical maximum throughput: 'conservative' driving with a high speed limit and 'anticipatory' driving with a low speed limit. These two strategies are evaluated in terms of accident probability. We also discuss fundamental diagrams of generalized traffic rules and examine limitations of the maximum achievable throughput. Possible modifications of the model are considered. For rule 184, we present exact calculations of the order parameter in a transition from the moving phase to the jammed phase using the method of preimage counting, and use this result to construct a solution to the density classification problem. In the second part we propose a probabilistic cellular automaton model for the spread of innovations, rumors, news, etc., in a social system. We start from simple deterministic models, for which exact expressions for the density of adopters are derived. For a more realistic model, based on probabilistic cellular automata, we study the influence of the range of interaction R on the shape of the adoption curve. When the probability of adoption is proportional to the local density of adopters, and individuals can drop the innovation with some probability p, the system exhibits a second order phase transition. The critical line separating the region of parameter space in which the asymptotic density of adopters is positive from the region where it is zero converges toward the mean-field line when the range of the interaction increases. In the region between the R=1 critical line and the mean-field line, the asymptotic density of adopters depends on R, becoming zero if R is too small (smaller than some critical value). This result demonstrates the importance of connectivity in
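Rule 184, the base model of the family above, is easy to reproduce: a car advances one cell to the right per time step iff the cell ahead is empty. A minimal sketch of one synchronous update on a ring (our own illustration, not the authors' code):

```python
def step_rule184(cells):
    """One synchronous update of CA rule 184 on a ring of 0/1 cells:
    a car (1) moves one cell to the right iff that cell is empty (0)."""
    n = len(cells)
    nxt = [0] * n
    for i in range(n):
        left, here, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        # A cell becomes occupied if a car arrives from the left,
        # or stays occupied if the car ahead blocks it.
        nxt[i] = 1 if (left == 1 and here == 0) or (here == 1 and right == 1) else 0
    return nxt

# At low density every car advances each step (free flow):
state = step_rule184([1, 0, 0, 1, 0, 0, 0, 0])  # -> [0, 1, 0, 0, 1, 0, 0, 0]
```

Because rule 184 is number-conserving, the car count is invariant under updates, which is what makes it a sensible minimal traffic model.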
Ngo, Minh Tri
In today’s information-based society, guaranteeing information security plays an important role in all aspects of life: governments, military, companies, financial information systems, web-based services etc. With the existence of Internet, Google, and shared-information networks, it is easier than
Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A
Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called "Collective Influence (CI)" has been put forward for collective influence maximization. In contrast to heuristic methods evaluating nodes' significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI applies to the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow in various social and scientific platforms including American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from data to construct "virtual" information spreading processes. Our results demonstrate that the set of spreaders selected by CI can induce a larger scale of information propagation. Moreover, local measures such as the number of connections or citations are not necessarily the deterministic factors of nodes' importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community.
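The CI score itself is straightforward to compute: as defined in the collective influence literature, CI_l(i) = (k_i - 1) times the sum of (k_j - 1) over the nodes j on the frontier of the ball of radius l around i. A small sketch under that definition (the adjacency-list input format is our assumption):

```python
from collections import deque

def collective_influence(adj, node, ell=2):
    """CI_l(i) = (k_i - 1) * sum over frontier nodes j at distance
    exactly l from i of (k_j - 1), computed by breadth-first search.
    adj: dict mapping each node to a list of its neighbours."""
    dist = {node: 0}
    frontier = []
    q = deque([node])
    while q:
        u = q.popleft()
        if dist[u] == ell:
            frontier.append(u)  # on the ball's surface: do not expand further
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    k = lambda u: len(adj[u])
    return (k(node) - 1) * sum(k(j) - 1 for j in frontier)

# Path graph 0-1-2-3-4: the centre node 2 has CI_1 = 1 * (1 + 1) = 2.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
```

Nodes of degree 1 always score zero, which is one way CI departs from naive degree-based rankings mentioned in the abstract.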
Małyska, Aleksandra; Maciąg, Kamil; Twardowski, Tomasz
The issue of GMOs constantly arouses strong emotions in public discourse. At the same time, the opinions of people particularly interested in this issue, such as researchers or potential users of the technology (e.g. farmers), are rarely subjected to analysis. Moreover, the lack of knowledge about the flow of information "from the laboratory to the consumer" hinders implementation of any changes in this field. By using triangulation (combining quantitative and qualitative research and the use of various research tools) we explored the attitudes of Polish scientists, agricultural advisers and farmers (large scale agricultural producers) to the use of GMOs in the economy. On the basis of the performed research we diagnosed the effectiveness of the information flow about transgenic organisms among these groups. Copyright © 2013 Elsevier B.V. All rights reserved.
Ngo, Minh Tri
In today's information-based society, guaranteeing information security plays an important role in all aspects of life: communication between citizens and governments, military, companies, financial information systems, web-based services etc. With the increasing popularity of computer systems with
St Clair, James J H; Burns, Zackory T; Bettaney, Elaine M; Morrissey, Michael B; Otis, Brian; Ryder, Thomas B; Fleischer, Robert C; James, Richard; Rutz, Christian
Social-network dynamics have profound consequences for biological processes such as information flow, but are notoriously difficult to measure in the wild. We used novel transceiver technology to chart association patterns across 19 days in a wild population of the New Caledonian crow--a tool-using species that may socially learn, and culturally accumulate, tool-related information. To examine the causes and consequences of changing network topology, we manipulated the environmental availability of the crows' preferred tool-extracted prey, and simulated, in silico, the diffusion of information across field-recorded time-ordered networks. Here we show that network structure responds quickly to environmental change and that novel information can potentially spread rapidly within multi-family communities, especially when tool-use opportunities are plentiful. At the same time, we report surprisingly limited social contact between neighbouring crow communities. Such scale dependence in information-flow dynamics is likely to influence the evolution and maintenance of material cultures.
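The in-silico diffusion simulations mentioned above can be illustrated with a minimal susceptible-infected spread over a time-ordered contact list, in which information can only travel along contacts in temporal order. This sketch is our own illustration under the simplest possible transmission assumption (every contact transmits), not the authors' simulation code:

```python
def si_spread(events, seeds):
    """Susceptible-Infected diffusion over a time-ordered contact list.
    events: iterable of (time, u, v) contacts; information passes (in
    either direction) whenever one endpoint is already informed."""
    informed = set(seeds)
    for _, u, v in sorted(events):  # respect temporal order of contacts
        if u in informed or v in informed:
            informed.update((u, v))
    return informed

# The c-d contact happens BEFORE c is informed, so d never learns:
events = [(1, 'a', 'b'), (2, 'b', 'c'), (0, 'c', 'd')]
reached = si_spread(events, {'a'})  # -> {'a', 'b', 'c'}
```

The example shows the key property of time-ordered networks: reachability is not symmetric in time, so the same contact list spreads information differently depending on when the seed is active.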
... planning organizations, and the business community. The Commodity Flow Survey is co-sponsored by the Bureau of Transportation Statistics, Research and Innovative Technology Administration, Department of..., industry, and mode of transportation. The Census Bureau will publish these shipment characteristics for the...
Full Text Available Flood damage is not determined by a rise in water level alone; the depth and velocity of the flow together determine the level of flood risk at each point. This is evident in flooded plains of shallow depth where flow speeds reach 2 meters per second and damage is extensive. Because the energy equation governing the flow contains both velocity and flow depth, it is an appropriate basis for the analysis in this study. Various methods have been proposed to increase the accuracy of flood zoning for different return periods and the associated risks along river corridors; some of these methods consider factors such as the analysis of past flooding in the affected area, hydrological factors, and the hydraulic elements affecting flood zoning (such as flow velocity). This paper investigates flood zoning based on flow energy in flood-affected areas. The flood risk based on flow energy in each river section is also compared, by means of the proposed hazard-interval graphs, with other flood zoning studies in this field. The FORDO river, part of the river system near the city of QOM KAHAK, was selected as the case study. The river is mountainous, young, and stable upstream and mature downstream, and is exposed to flood damage in different seasons. The method proposed in this study can improve the accuracy of flood risk identification in flood-affected areas, and it facilitates identifying the parts of the river bed affected by severe flooding to support decision-making for river management.
As it becomes common for Internet users to use hashtags when posting and searching information on social media, it is important to understand who builds a hashtag network and how information is circulated within the network. This article focused on unlocking the potential of the #AlphaGo hashtag network by addressing the following questions. First, the current study examined whether traditional opinion leadership (i.e., the influentials hypothesis) or grassroot participation by the public (i.e., the interpersonal hypothesis) drove dissemination of information in the hashtag network. Second, several unique patterns of information distribution by key users were identified. Finally, the association between attributes of key users who exerted great influence on information distribution (i.e., the number of followers and follows) and their central status in the network was tested. To answer the proffered research questions, a social network analysis was conducted using a large-scale hashtag network data set from Twitter (n = 21,870). The results showed that the leading actors in the network were actively receiving information from their followers rather than serving as intermediaries between the original information sources and the public. Moreover, the leading actors played several roles (i.e., conversation starters, influencers, and active engagers) in the network. Furthermore, the numbers of their follows and followers were significantly associated with their central status in the hashtag network. Based on the results, the current research explained how the information was exchanged in the hashtag network by proposing the reciprocal model of information flow.
Holba, C.; McGee, M.; Thompson, P.
On March 24, 1989, the supertanker Exxon Valdez struck a submerged rock pinnacle at Bligh Reef, puncturing eight of its storage tanks. Within hours, 11 million gallons of crude oil were dumped into the waters of Prince William Sound. The cleanup, damage assessment, and restoration activities undertaken for this environmentally complex area presented multifaceted challenges to public and private organizations and various professional disciplines. One of these challenges was obtaining and disseminating prespill, spill, and postspill information for both the private and public sector. The Oil Spill Public Information Center (OSPIC) was created for this purpose by the US Department of Justice on behalf of the federal trustees. Its management has since been assumed by the restoration team, an arm of the state-federal Exxon Valdez Oil Spill Trustee Council. On October 8, 1991, a settlement agreement was approved in United States District Court, which required Exxon to pay $1 billion in criminal restitution and civil damages to the United States and the state of Alaska. The settlement terms specify that the Trustee Council shall establish procedures providing for meaningful public participation in the injury assessment and restoration process. Consistent with that mandate, the OSPIC is responsible for providing a repository for all material related to the Exxon Valdez oil spill. The OSPIC is a specialized library open to the public. Its function is to collect, organize, and make accessible materials generated by state and federal agencies and the private sector as a result of the cleanup, damage assessment, and restoration activities of the spill. The OSPIC staff is also identifying and collecting baseline studies in the Prince William Sound and Gulf of Alaska areas, as well as materials on cold water marine spills. The OSPIC serves a variety of patrons, including industry, the oil spill response community, state and federal agencies, scientists, etc
Dallman, J.C.; Kirchner, W.L.
In this study, flow conditions in the upper plenum of a PWR during the reflood stage of a loss-of-coolant accident (LOCA) are simulated using water sprays and a draft-induced wind tunnel. The de-entrainment efficiencies of isolated structures are presented for a variety of air-water droplet cross flow conditions. Since droplet splashing and/or bouncing from the draining liquid film is not accounted for in classical inertial impaction theory, there is substantial disagreement between measurement and the theory. The de-entrainment efficiencies of isolated tubes are extrapolated to those of tubes in a multiple tube array, and a predictive relation is presented for the overall de-entrainment efficiency of multiple tube arrays
Dodson, Leslie Lynn
This dissertation describes the design, implementation and evaluation of a gender-inclusive information system linking rural women in Agni Hiya, Morocco and water project managers from the Association Dar Si-Hmad. This research was motivated by an interest in exploring the linkages between information and communication technologies (ICT), climate…
Salmistraro, Matteo; Zamarin, Marco; Rakêt, Lars Lau
Distributed Video Coding (DVC) is a video coding paradigm allowing a shift of complexity from the encoder to the decoder. Depth maps are images enabling the calculation of the distance of an object from the camera, which can be used in multiview coding in order to generate virtual views, but also...
Information management challenges facing the petroleum and natural gas industry are discussed in conjunction with the increasing difficulty of accessing information because of the sheer volume of it, plus the fact that most data systems are proprietary 'closed' systems. In this context, reference is made to a newly developed software system named PetroDesk, developed by Merak Petroleum. PetroDesk is a geographical information browser used for integration and analysis of public, proprietary and personal data under a common interface. The software can be used to plot land position, chart productivity of wells, and produce graphs of decline rates, reserves and production. The software, which was originally designed for engineering data, also has been found useful in determining costs, revenue projections and other information needed to obtain a real-time net present worth of a company, and also in identifying business opportunities. 2 figs
Stramaglia, Sebastiano; Angelini, Leonardo; Wu, Guorong; Cortes, Jesus M; Faes, Luca; Marinazzo, Daniele
We develop a framework for the analysis of synergy and redundancy in the pattern of information flow between subsystems of a complex network. The presence of redundancy and/or synergy in multivariate time series data makes it difficult to estimate the net flow of information from each driver variable to a given target. We show that, by adopting an unnormalized definition of Granger causality, one may put in evidence redundant multiplets of variables influencing the target by maximizing the total Granger causality to a given target over all the possible partitions of the set of driving variables. Consequently, we introduce a pairwise index of synergy which, differently from previous definitions of synergy, is zero when two independent sources additively influence the future state of the system. We report the application of the proposed approach to resting state functional magnetic resonance imaging data from the Human Connectome Project, showing that redundant pairs of regions arise mainly due to spatial contiguity and interhemispheric symmetry, while synergy occurs mainly between nonhomologous pairs of regions in opposite hemispheres. Redundancy and synergy, in healthy resting brains, display characteristic patterns, revealed by the proposed approach. The pairwise synergy index introduced here maps the informational character of the system at hand into a weighted complex network; the same approach can be applied to other complex systems whose normal state corresponds to a balance between redundant and synergetic circuits.
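A one-lag bivariate version of unnormalized Granger causality can be sketched as the plain difference (rather than the customary log-ratio) between the residual variances of the restricted and full autoregressive models. The lag order, variable names, and demo process below are our own simplifications, not the authors' multivariate estimator:

```python
import numpy as np

def unnormalized_gc(x, y):
    """Unnormalized Granger causality x -> y with one lag: the drop in
    residual variance of y(t) ~ y(t-1) when x(t-1) is added as a
    regressor, taken as a plain difference instead of a log-ratio."""
    yt, y1, x1 = y[1:], y[:-1], x[:-1]

    def resid_var(design):
        beta, *_ = np.linalg.lstsq(design, yt, rcond=None)
        return np.var(yt - design @ beta)

    ones = np.ones_like(y1)
    restricted = resid_var(np.column_stack([ones, y1]))
    full = resid_var(np.column_stack([ones, y1, x1]))
    return restricted - full  # in-sample, this is always >= 0

# Demo: x drives y with one lag; y does not drive x.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
```

Because adding a regressor can only reduce the in-sample residual variance, the difference is non-negative by construction, which is one convenience of the unnormalized form when summing causalities over partitions of drivers.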
We propose an extension of the asynchronous π-calculus in which a variety of security properties may be captured using types. These are an extension of the input/output types for the π-calculus in which I/O capabilities are assigned specific security levels. The main innovation is a uniform typing system that, by varying slightly the allowed set of types, captures different notions of security. We first define a typing system that ensures that processes running at security level σ...
Walderhaug, Ståle; Meland, Per Håkon; Mikalsen, Marius; Sagen, Terje; Brevik, John Ivar
Documentation of medical treatment and observation of patients during evacuation from the point of injury to definitive treatment is important both for optimizing patient treatment and managing the evacuation process. The current practice in military medical field documentation uses paper forms and voice communication. There are many shortcomings associated with this approach, especially with respect to information capture and sharing processes. Current research addresses the use of new technology for civilian ambulance-to-hospital communication. The research work presented in this article addresses information capture and sharing in extreme military conditions by evaluating a targeted computerized information system called EvacSys during a military exercise in northern Norway in December 2003. EvacSys was designed and implemented in close cooperation with military medical personnel in both Norway and the USA. The system was evaluated and compared to the traditional paper-based documentation method during a military exercise. The on-site evaluation was conducted in a military medical platoon in the Norwegian Armed Forces, using questionnaires, semi-structured interviews, observation and video recording to capture the users' system acceptance. A prototype software system running on a commercial off-the-shelf hardware platform was successfully developed. The evaluation of this system shows that the usability of digital information capturing and sharing is perceived to be at least as good as the traditional paper-based method. The medics found the new digital method to be more viable than the old one. No technical problems were encountered. Our research shows that it is feasible to utilize digital information systems for medical documentation in extreme outdoor environments. The usability concern is of utmost importance, and more research should be put into the design and alignment with existing workflow. Successful digitalization of information at the point of care
Borges, F. S.; Lameu, E. L.; Iarosz, K. C.; Protachevicz, P. R.; Caldas, I. L.; Viana, R. L.; Macau, E. E. N.; Batista, A. M.; Baptista, M. S.
The characterization of neuronal connectivity is one of the most important matters in neuroscience. In this work, we show that a recently proposed informational quantity, the causal mutual information, employed with an appropriate methodology, can be used not only to correctly infer the direction of the underlying physical synapses, but also to identify their excitatory or inhibitory nature, considering easy to handle and measure bivariate time series. The success of our approach relies on a surprising property found in neuronal networks by which nonadjacent neurons do "understand" each other (positive mutual information), however, this exchange of information is not capable of causing effect (zero transfer entropy). Remarkably, inhibitory connections, responsible for enhancing synchronization, transfer more information than excitatory connections, known to enhance entropy in the network. We also demonstrate that our methodology can be used to correctly infer directionality of synapses even in the presence of dynamic and observational Gaussian noise, and is also successful in providing the effective directionality of intermodular connectivity, when only mean fields can be measured.
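The distinction drawn above between exchanging information and causing effect can be illustrated with plug-in (histogram) estimators on symbolic time series: mutual information captures shared information, while transfer entropy captures directed, time-lagged influence. This is a generic sketch on binary series, not the authors' causal-mutual-information estimator:

```python
import math
import random
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (histogram) Shannon entropy in bits."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def transfer_entropy(x, y):
    """One-lag transfer entropy x -> y on symbolic series:
    TE = H(y_t | y_{t-1}) - H(y_t | y_{t-1}, x_{t-1})."""
    yt, y1, x1 = y[1:], y[:-1], x[:-1]
    h_y_past = plugin_entropy(list(zip(yt, y1))) - plugin_entropy(y1)
    h_y_both = plugin_entropy(list(zip(yt, y1, x1))) - plugin_entropy(list(zip(y1, x1)))
    return h_y_past - h_y_both

# Demo: y copies x with a one-step delay, so effect flows x -> y only,
# even though the two series share plenty of mutual information.
rng = random.Random(1)
x = [rng.randint(0, 1) for _ in range(1000)]
y = [0] + x[:-1]
```

In this demo TE(x -> y) approaches 1 bit while TE(y -> x) stays near zero, mirroring the abstract's point that high mutual information between two units need not imply a causal effect in both directions.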
Responses of focus group members from the region around a Nuclear Facility provide the data for this qualitative study concerning citizen perceptions of available site information. Analyses of three of the focus group discussion questions and the answers they elicited showed a dominant perception among participants of insufficient easily available information about the site. These respondents also indicated that most of them obtain site information through mass media and hearsay, that many lack trust in the information they have and would trust only an independent entity to provide accurate information. A new area in communication studies, variously called environmental risk communication, risk communication and health risk communication, continues to evolve among those working in various allied disciplines, some far removed from communication. As science attempts to solve environmental problems caused by technological advances, this field acquires numerous practitioners. Some of these risk communication experts may, however, be overlooking basic and necessary components of effective communication, because their expertise is in another discipline. One result of this can be communication breakdown in which those involved assume that meaning is shared, when in fact the opposite is true. This paper seeks to clarify a necessary ingredient of effective interpersonal risk communication, using data obtained from citizens living around one of the nation's nuclear facilities as an example
Full Text Available This article examines the role of the micro-blogging system Twitter in the mobilization of the Tunisian protest movement that led to the downfall of President Ben Ali. Through a semio-pragmatic approach to the discourses generated and shared online, it brings out the interactions constructed between the actors of informational mobilization and the expressive logics they deployed within a repressive political and media context. Analysis of the discursive « strategies » of the actors involved on the platform shows the symbolic construction of the social movement through a medium that provided informational support for militant actions undertaken in the street.
their audiences and, in the process of doing so, manage to make a profit. The Value of Information One interesting aspect of information is that it has...different nature of the attributes above in a business context. The “corporate war” between Coca-Cola and Pepsi in the 1980s was largely one of product...differentiation (Ramsey, 1987). Both Coca-Cola and Pepsi tried to increase their shares of the “cola” soft drink market by launching new differentiated
Hennessy, Matthew; Riely, James
We propose an extension of the asynchronous π-calculus in which a variety of security properties may be captured using types. These are an extension of the input/output types for the π-calculus in which I/O capabilities are assigned specific security levels. The main innovation is a uniform typing system that, by varying slightly the allowed set of types, captures different notions of security. We first define a typing system that ensures that processes running at security level σ cannot acces...
Inasaka, Fujio; Nariai, Hideki
Valuable experimental knowledge of the flow boiling characteristics of the helical-coil type once-through steam generator was converted into an intelligent information database program. The program was created as a Windows application using Visual Basic. The main functions of the program are as follows: (1) steady state flow boiling analysis of any helical-coil type once-through steam generator, (2) analysis and comparison with the experimental data, (3) reference and graph display of the steady state experimental data, (4) reference of the flow instability experimental data and display of the instability threshold correlated by each parameter, (5) summary of the experimental apparatus, and (6) menu bar functions such as help and print. In the steady state analysis, the region lengths of subcooled boiling, saturated boiling, and super-heating, and the temperature and pressure distributions etc. for the secondary water are calculated. Steady state analysis results agreed well with the experimental data, with the exception of the pressure drop at high mass velocity. The program will be useful for the design of not only the future integrated type marine water reactor but also the small sized water reactor with a helical-coil type steam generator
This dissertation analyzes the geography of information in the 21st century where BigData, social networks, user generated production of content and geography combine to create new and complex patterns of space, context and sociability. Both online and offline, social networks are creating a space that simultaneously unifies individuals and…
Feng, Shihui; Hossain, Liaquat; Crawford, John W; Bossomaier, Terry
Social media provides us with a new platform on which to explore how the public responds to disasters and, of particular importance, how they respond to the emergence of infectious diseases such as Ebola. Provided it is appropriately informed, social media offers a potentially powerful means of supporting both early detection and effective containment of communicable diseases, which is essential for improving disaster medicine and public health preparedness. The 2014 West African Ebola outbreak is a particularly relevant contemporary case study on account of the large number of annual arrivals from Africa, including Chinese employees engaged in projects in Africa. Weibo (Weibo Corp, Beijing, China) is China's most popular social media platform, with more than 2 billion users and over 300 million daily posts, and offers great opportunity to monitor early detection and promotion of public health awareness. We present a proof-of-concept study of a subset of Weibo posts during the outbreak demonstrating potential and identifying priorities for improving the efficacy and accuracy of information dissemination. We quantify the evolution of the social network topology within Weibo relating to the efficacy of information sharing. We show how relatively few nodes in the network can have a dominant influence over both the quality and quantity of the information shared. These findings make an important contribution to disaster medicine and public health preparedness from theoretical and methodological perspectives for dealing with epidemics. (Disaster Med Public Health Preparedness. 2018;12:26-37).
Ha, Chang Hoon; Kim, Jong Hyun; Seong, Poong Hyun [KAIST, Taejon (Korea, Republic of)]
In the main control room (MCR) of a nuclear power plant (NPP), there are many dynamic information sources for the MCR operator's situation awareness. As the human-machine interface in the MCR becomes more advanced, the operator's information acquisition, information gathering and decision-making are becoming important for maintaining the effective and safe operation of NPPs. Diagnostic tasks in complex and huge systems like NPPs are among the most difficult and mentally demanding for operators. This research investigates the relation between the operator's mental workload and information flow in accident diagnosis tasks. The amount of information flow is quantified using an information flow model and Conant's model, an information-theoretic approach. As mental workload measures, eye blink rate, blink duration, fixation time, number of fixations and gaze direction are recorded during accident diagnosis tasks. Subjective methods such as the NASA Task Load Index (NASA-TLX) and the Modified Cooper-Harper (MCH) scale are also used in the experiment. It is shown that the operator's mental workload is significantly related to the information flow of the diagnosis task. This makes it possible to predict mental workload from the quantity of information flow in a system.
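The entropy-based quantification described above rests on Shannon-style "transmission" between observed variables. A minimal sketch, with invented alarm/action sequences standing in for the plant variables (Conant's full partition-law decomposition is not reproduced here):

```python
import math
from collections import Counter

def entropy(seq):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

def transmission(xs, ys):
    """Transmission T(X;Y) = H(X) + H(Y) - H(X,Y), the basic building
    block of Conant-style information-flow analysis.  The toy sequences
    below are invented; the paper's variables (indicators, alarms,
    operator actions) are not reproduced here."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Hypothetical paired observations of an alarm state and an operator action.
alarms  = ["hi", "hi", "lo", "lo", "hi", "lo", "hi", "lo"]
actions = ["a",  "a",  "b",  "b",  "a",  "b",  "a",  "b"]
t = transmission(alarms, actions)
```

When the actions track the alarms perfectly, as in this toy data, T equals the full alarm entropy H(X); weakly related sequences give values closer to zero.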
Full Text Available This paper investigates the potential use of information systems (IS) for enhancing the supply chains of organisations positioned in the intellectual property (IP) sector. Exploratory research has been conducted through the lens of a patent and trade mark agent who advises on a range of IP issues. The research highlights the opportunities offered by IS (including online technologies) for generally improving the provision of business services, e.g. automating supply chain processes. More specifically, it investigates the potential IS have for integrating information flows and providing timely, in-depth and better-presented information, and the options for online filing. It also explores the capabilities for improving interactions with clients and enhancing relationships with key stakeholders in the supply chain, e.g. government agencies, overseas patent agents and lawyers. The paper additionally outlines key challenges that need to be addressed when using IS within the IP sector, such as identity management, security and authentication. The key findings of the research will be of value to researchers and practitioners in the IP field, but many of the issues and challenges will also be applicable to other sectors.
Mobile information technology (IT) seems an ideal innovation to promote effectiveness of the construction process, particularly at the construction site; research has over the last 15 years focused on solutions, potentials and barriers in this field. This paper aims at the duality between research and industry for an updated and forward-looking comprehension, and view of tendencies, of the roles and potentials of mobile IT at the construction site, including the potential for further research. Qualitative and interpretive methodology inspired by information systems and the sociology of research and construction is used; mobility is classified as remote, local or micro. Furthermore, an extensive literature study is employed, along with both market screening for systems and case studies of companies adopting as well as rejecting the technology, within both research communities and software manufacturers…
Alwahaishi, Saleh; Snásel, Václav
The world has changed a lot in the past years. The rapid advances in technology and the changing of the communication channels have changed the way people work and, for many, where do they work from. The Internet and mobile technology, the two most dynamic technological forces in modern information and communications technology (ICT) are converging into one ubiquitous mobile Internet service, which will change our way of both doing business and dealing with our daily routine activities. As th...
and market for these items. Even for commercial items, there is a lack of PLT benchmarking information, which contributes to buyers' perception...an item is not on an outline agreement (long-term contract) ◦ Micro purchases ◦ Other purchases c. Supplier selection process ◦ Market research process...to find the relationship between Q*/σ and the corresponding reorder point R, fitting a logarithmic function R = α − δ log(Q*/σ). She arrived at
Diaz-Doce, D.; Bee, E. J.; Bell, P. D.; Marchant, A. P.; Reay, S.; Richardson, S. L.; Shelley, W. A.
Over half of the population of the UK own a smartphone, and about the same number of people use social media such as Twitter. For the British Geological Survey (BGS) this means millions of potential reporters of real-time events and in-the-field data capturers, creating a new source of scientific information that could help to better understand and predict natural processes. BGS first started collecting citizen data, using crowd-sourcing, through websites and smartphone apps focused on gathering geology-related information (e.g. mySoil and myVolcano). These tools ask volunteers to follow a guided form where they can upload data related to geology and geological events, including location, description, measurements, photos, videos, or even instructions on sending physical samples. This information is used to augment existing data collections. Social media provides a different channel for gathering useful scientific information from the public. BGS is starting to explore this route with the release of GeoSocial-Aurora, a web mapping tool that searches for tweets related to aurora sightings and locates them as markers on a map. Users are actively encouraged to contribute by sending tweets about aurora sightings in a specific format, which contains the #BGSaurora hashtag, the location of the sighting, and any comments or pictures. The tool harvests these tweets through the Twitter REST API and places them on the map, enabling the user to generate clusters and heatmaps. GeoSocial-Aurora provides scientists with a potential tool for gathering useful data for scientific analysis. It collects actual aurora sighting locations, enabling users to check where the aurora is taking place in real time. This may, in time, help scientists to improve future predictions of when and where auroras are visible.
Blok Marek; Kaczmarek Sylwester; Młynarczuk Magdalena; Narloch Marcin
In this paper, the architecture of the software designed for management of position and identification data of floating and flying objects in maritime areas controlled by the Polish Border Guard is presented. The software was designed for managing information stored in a distributed system, with two variants: one for a mobile device installed on a vessel, an airplane or a car, and the second for a central server. The details of implementation of all functionalities of the MapServer in bo...
Stokes, D F
Over the past three decades, the long-term care community has seen continual increases in the complexity and sophistication of management information systems. These changes have been brought about by the ever-increasing demands on owners and managers to provide accurate and timely data to both regulators and financial investors. The evolution of these systems has increased rapidly in recent years as the nation attempts to reinvent the funding mechanisms for long-term care.
... prescription skin cream for the "face, neck, hands, arms, or any area not covered by clothing may come into... the clothing or outergarment due to a second appliance or medication." This language will clarify that a second clothing allowance may be paid when a second appliance and/or medication increases the...
Lewandowsky, S.; Brown, G. D.; Cook, J.
Improved communication of scientific findings requires knowledge not only of how people process information, but also of how such information spreads through society and how people's opinions are shaped by those of others. Recent advances in cognitive science have yielded mathematical modeling techniques that permit the detailed analysis of individuals' cognition as well as the behaviour of communities in the aggregate. We present two case studies that highlight the insights that can be derived from mathematical models of cognition: We show how rational processing of information (i.e., Bayesian hypothesis revision) can nonetheless give rise to seemingly 'irrational' belief updating, as for example when acceptance of human-caused global warming decreases among conservatives in response to evidence for human-caused global warming. We also show, in an agent-based simulation, how social norms can lead to polarization of societies. The model assumes that agents located within a social network observe the behaviour of neighbours and infer from it the social distribution of particular attitudes (e.g. towards climate change). Agents are assumed to dislike behaviours that are extreme within their neighbourhood (social extremeness aversion), and hence have a tendency to conform. However, agents are also assumed to prefer choices that are consistent with their own true beliefs (authenticity preference). Expression of attitudes reflects a compromise between these opposing principles. The model sheds light on the role of perceived rather than actual social consensus in attitudes to climate change. This is particularly relevant given the widespread perception, among those who reject climate science, that the percentage of the public that shares their beliefs is much higher than it actually is.
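The conformity-versus-authenticity mechanism described above can be sketched as a toy agent-based simulation; the ring topology, the weights and the synchronous update rule below are illustrative assumptions, not the authors' model:

```python
import random

def expressed_attitudes(true_beliefs, w_conform=0.6, steps=50):
    """Iterate expressed attitudes on a ring network.  Each agent's
    expression is a compromise between the mean expression of its two
    neighbours (social extremeness aversion) and its own true belief
    (authenticity preference).  Weights and topology are hypothetical."""
    n = len(true_beliefs)
    expr = list(true_beliefs)
    for _ in range(steps):
        # Synchronous update: the comprehension reads the old `expr`.
        expr = [w_conform * (expr[(i - 1) % n] + expr[(i + 1) % n]) / 2.0
                + (1.0 - w_conform) * true_beliefs[i]
                for i in range(n)]
    return expr

random.seed(1)
beliefs = [random.uniform(-1.0, 1.0) for _ in range(20)]
out = expressed_attitudes(beliefs)
```

Because every update is a convex combination, expressed attitudes stay within the range of the true beliefs while being pulled toward the local consensus, so a neighbourhood can look far more uniform than the underlying beliefs are.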
Umar, Z; Wan Mohd Akib, W A A; Ahmad, A
Flash floods are among the most common environmental hazards worldwide. This phenomenon usually occurs due to intense and prolonged rainfall on saturated ground. When there is a rapid rise in water level and high flow velocities in the stream, the channel overflows and the result is a flash flood. Flash floods normally produce a dangerous wall of roaring water carrying rocks, mud and other debris. On Tuesday, July 24, 2012 at 18:00, a flash flood (debris flow) struck the Kuranji River, whereby 19 urban villages in seven (7) sub-districts in the city of Padang were affected by this flood disaster. The loss was provisionally estimated at 40 billion US dollars by the West Sumatra Provincial Government, owing to extensive damage to built infrastructure. This includes 878 damaged houses, 15 mosques, 12 damaged irrigation works, 6 bridges, 2 schools and 1 health post. Widely used methods for landslide studies are Geographic Information System (GIS) and remote sensing techniques. The landslide information extracted from remotely sensed products is mainly related to the morphology, vegetation and hydrologic conditions of a slope, while GIS is used to create a database and to manage, display and analyze data such as thematic maps of land use/land cover, the normalized difference vegetation index (NDVI), rainfall data and soil texture. This paper highlights the analysis of the condition of the Kuranji River watershed, which experienced flash floods, using Landsat ETM 7 remote sensing imagery from 2009 and 2012 and a Geographic Information System (GIS). Furthermore, the data were analyzed to determine whether this flash flood occurred due to extreme rain or to the collapse of existing natural dams upstream of the Kuranji River.
In this paper we give a necessary condition for a geometrical surface to allow Abelian fractional statistics. In particular, we show that such statistics are possible only for two-dimensional oriented surfaces of genus zero, namely the sphere S^2, the plane R^2 and the cylindrical surface R^1 × S^1, and in general the connected sum of n planes R^2 # R^2 # ... # R^2. (Author)
Gernot G Supp
Full Text Available The increase of induced gamma-band responses (iGBRs; oscillations >30 Hz) elicited by familiar (meaningful) objects is well established in electroencephalogram (EEG) research. This frequency-specific change at distinct locations is thought to indicate the dynamic formation of local neuronal assemblies during the activation of cortical object representations. Since a power increase is, analytically, a property of a single location, phase synchrony was introduced to investigate the formation of large-scale networks between spatially distant brain sites. However, classical phase synchrony reveals symmetric, pair-wise correlations and is not suited to uncovering the directionality of interactions. Here, we investigated the neural mechanism of visual object processing by means of directional coupling analysis, going beyond single recording sites to assess the directionality of oscillatory interactions between brain areas directly. This study is the first to identify the directionality of oscillatory brain interactions in source space during human object recognition, and it suggests that familiar, but not unfamiliar, objects engage widespread reciprocal information flow. Directionality of cortical information flow was calculated with an established Granger-causality coupling measure (partial directed coherence; PDC) using autoregressive modeling. To enable comparison with previous coupling studies lacking directional information, phase-locking analysis was applied using wavelet-based signal decompositions. Both autoregressive modeling and wavelet analysis revealed an augmentation of iGBRs during the presentation of familiar objects relative to unfamiliar controls, which was localized to inferior-temporal, superior-parietal and frontal brain areas by means of distributed source reconstruction. The multivariate analysis of PDC evaluated each possible direction of brain interaction and revealed widespread reciprocal information transfer during familiar
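Partial directed coherence can be computed directly from the coefficient matrices of a fitted multivariate autoregressive (MVAR) model. A minimal sketch, assuming the standard column-normalized definition of PDC and an invented two-channel AR(1) system in which channel 1 drives channel 2:

```python
import cmath
import math

def pdc(A_coeffs, f, i, j):
    """Partial directed coherence from channel j to channel i at
    normalized frequency f, given MVAR coefficient matrices A_r
    (a list of n x n nested lists).  This follows the standard
    column-normalized definition; the AR(1) system below is a
    made-up illustration, not data from the study."""
    n = len(A_coeffs[0])
    # Abar(f) = I - sum_r A_r * exp(-2*pi*1j*f*r)
    Abar = [[(1.0 if p == q else 0.0) for q in range(n)] for p in range(n)]
    for r, A in enumerate(A_coeffs, start=1):
        z = cmath.exp(-2j * math.pi * f * r)
        for p in range(n):
            for q in range(n):
                Abar[p][q] -= A[p][q] * z
    denom = math.sqrt(sum(abs(Abar[k][j]) ** 2 for k in range(n)))
    return abs(Abar[i][j]) / denom

# Channel 1 drives channel 2 (coefficient 0.4), but not vice versa.
A1 = [[0.5, 0.0],
      [0.4, 0.3]]
p_12 = pdc([A1], 0.1, 0, 1)  # influence of channel 2 on channel 1
p_21 = pdc([A1], 0.1, 1, 0)  # influence of channel 1 on channel 2
```

With the cross-coefficient from channel 2 to channel 1 set to zero, the estimated influence in that direction vanishes while the reverse direction stays positive; this asymmetry is exactly what symmetric phase-locking measures cannot express.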
Misinformation and public misunderstanding have given emissions trading a bad reputation in the public marketplace, says William F. Malec, executive vice president of the Tennessee Valley Authority (TVA), in Knoxville, Tennessee. Media coverage of a May 1992 emissions-allowance trade between TVA and Wisconsin Power and Light "focused on the agreement's pollution-trading aspects, not its overall potential economic and environmental benefits," Malec says. Such negative portrayal of TVA's transaction sparked severe public criticism and charges that emissions trading gives utilities the right to pollute. "The fact is that TVA sought the emissions-trading agreement as a means to reduce overall emissions in the most cost-effective way," Malec explains. Emissions trading allows a company with emission levels lower than clean-air standards to earn "credits." These credits then may be purchased by a company with emission levels that exceed federal standards. Under this arrangement, the environment is protected and companies that buy credits save money because they do not have to purchase expensive emissions-control devices or reduce their production levels. Malec says TVA decided to enter into the emissions-allowance market, not only to cut costs, but also to publicize the existence and benefits of emissions trading. However, TVA's experience proves that "people will not accept what they do not understand," concludes Malec, "especially when complex environmental issues are involved."
Full Text Available We developed a model of the input circuitry of the FD1 cell, an identified motion-sensitive interneuron in the blowfly's visual system. The model circuit successfully reproduces the FD1 cell's most conspicuous property: its larger responses to objects than to spatially extended patterns. The model circuit also mimics the time-dependent responses of FD1 to dynamically complex naturalistic stimuli, shaped by the blowfly's saccadic flight and gaze strategy: the FD1 responses are enhanced when, as a consequence of self-motion, a nearby object crosses the receptive field during intersaccadic intervals. Moreover, the model predicts that these object-induced responses are superimposed by pronounced pattern-dependent fluctuations during movements on virtual test flights in a three-dimensional environment with systematic modifications of the environmental patterns. Hence, the FD1 cell is predicted not to detect only objects defined unambiguously by the spatial layout of the environment, but to be sensitive also to objects distinguished by textural features. These ambiguous detection abilities suggest an encoding of information about objects, irrespective of the features by which the objects are defined, by a population of cells, with the FD1 cell presumably playing a prominent role in such an ensemble.
Sulis, William H
Synchronization has a long history in physics where it refers to the phase matching of two identical oscillators. This notion has been extensively studied in physics as well as in biology, where it has been applied to such widely varying phenomena as the flashing of fireflies and firing of neurons in the brain. Human behavior, however, may be recurrent but it is not oscillatory even though many physiological systems do exhibit oscillatory tendencies. Moreover, much of human behaviour is collaborative and cooperative, where the individual behaviours may be distinct yet contemporaneous (if not simultaneous) and taken collectively express some functionality. In the context of behaviour, the important aspect is the repeated co-occurrence in time of behaviours that facilitate the propagation of information or of functionality, regardless of whether or not these behaviours are similar or identical. An example of this weaker notion of synchronization is transient induced global response synchronization (TIGoRS). Previous work has shown that TIGoRS is a ubiquitous phenomenon among complex systems, enabling them to stably parse environmental transients into salient units to which they stably respond. This leads to the notion of Sulis machines, which emergently generate a primitive linguistic structure through their dynamics. This article reviews the notion of TIGoRS and its expression in several complex systems models including tempered neural networks, driven cellular automata and cocktail party automata. The emergent linguistics of Sulis machines are discussed. A new class of complex systems model, the dispositional cellular automaton is introduced. A new metric for TIGoRS, the excess synchronization, is introduced and applied to the study of TIGoRS in dispositional cellular automata. It is shown that these automata exhibit a nonlinear synchronization response to certain perturbing transients.
Kevin M. Bradley
Full Text Available Synthetic biologists wishing to self-assemble large DNA (L-DNA) constructs from small DNA fragments made by automated synthesis need fragments that hybridize predictably. Such predictability is difficult to obtain with oligonucleotides built from just the four standard nucleotides. Natural DNA's peculiar combination of strong and weak G:C and A:T pairs, the context-dependence of the strengths of those pairs, unimolecular strand folding that competes with desired interstrand hybridization, and non-Watson-Crick interactions available to standard DNA all contribute to this unpredictability. In principle, adding extra nucleotides to the genetic alphabet can improve the predictability and reliability of autonomous DNA self-assembly, simply by increasing the information density of oligonucleotide sequences. These extra nucleotides are now available as parts of artificially expanded genetic information systems (AEGIS), and tools are now available to generate entirely standard DNA from AEGIS DNA during PCR amplification. Here, we describe the OligArch (for "oligonucleotide architecting") software, an application that permits synthetic biologists to engineer optimally self-assembling DNA constructs from both six- and eight-letter AEGIS alphabets. This software has been used to design oligonucleotides that self-assemble to form complete genes from 20 or more single-stranded synthetic oligonucleotides. OligArch is therefore a key element of a scalable and integrated infrastructure for the rapid and designed engineering of biology.
Zhou Yunlong; Chen Fei; Sun Bin
Based on the property that the wavelet packet transform can decompose an image at different scales, a flow regime identification method based on image wavelet packet information entropy features and a genetic neural network was proposed. Gas-liquid two-phase flow images were captured by a digital high-speed video system in a horizontal pipe. The information entropy features of the transform coefficients were extracted using image processing techniques and multi-resolution analysis. The genetic neural network was trained using these eigenvectors, reduced by principal component analysis, as flow regime samples, and intelligent flow regime identification was realized. The test results showed that the image wavelet packet information entropy features could excellently reflect the differences between seven typical flow regimes, and that the genetic neural network, combining the merits of the genetic algorithm and the BP algorithm, converged quickly while avoiding local minima. The recognition rate of the network could reach up to about 100%, and a new and effective method was thus presented for on-line flow regime identification. (authors)
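The entropy feature used above can be sketched in one dimension. The example below is a hypothetical illustration rather than the paper's method: it uses a Haar wavelet packet decomposition (the paper works on images and does not specify wavelet or depth here) and takes the Shannon entropy of the energy distribution over the terminal nodes:

```python
import math

def haar_step(x):
    """One level of the Haar transform: (approximation, detail)."""
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def wavelet_packet_entropy(signal, levels=3):
    """Shannon entropy of the energy distribution over terminal
    wavelet-packet nodes.  Minimal 1D Haar sketch of the feature
    described in the abstract; wavelet choice and depth are assumptions."""
    nodes = [list(signal)]
    for _ in range(levels):
        nxt = []
        for node in nodes:
            a, d = haar_step(node)
            nxt.extend([a, d])
        nodes = nxt
    energies = [sum(c * c for c in node) for node in nodes]
    total = sum(energies) or 1.0
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)

sig = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
H = wavelet_packet_entropy(sig, levels=3)
```

A narrowband signal concentrates its energy in a few packet nodes (low entropy), while broadband or chaotic signals spread it out (higher entropy), which is why such features separate flow regimes.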
Zhou Yunlong; Zhang Xueqing; Gao Yunpeng; Cheng Yue
For studying the flow regimes of gas/liquid two-phase flow in a vertical upward pipe, the conductance fluctuation information of four typical flow regimes was collected by a measuring system with self-made multiple conductivity probes. Owing to the non-stationarity of conductance fluctuation signals of gas-liquid two-phase flow, a flow regime identification method based on wavelet packet multi-scale information entropy and a Hidden Markov Model (HMM) was put forward. First, the collected conductance fluctuation signals were decomposed into eight different frequency band signals. Second, the wavelet packet multi-scale information entropies of the different frequency band signals were used as the input characteristic vectors of the previously trained all-states HMMs. Finally, the regime identification of the gas-liquid two-phase flow could be performed. The study showed that the HMM-based identification method was superior to one using a BP neural network, and the results proved that the method is efficient and feasible. (authors)
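The HMM-based identification step can be sketched with the standard forward algorithm: keep one HMM per flow regime and assign a new feature sequence to the model with the highest likelihood. All model parameters below are invented for illustration, not taken from the paper:

```python
import math

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm.  `obs` holds symbol indices;
    `start`, `trans`, `emit` are plain nested lists."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                 for s in range(n)]
    return math.log(sum(alpha))

# Two toy "flow regime" models over a 2-symbol feature alphabet
# (hypothetical stand-ins for quantized entropy feature vectors).
bubbly = ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.8, 0.2], [0.2, 0.8]])
slug   = ([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]], [[0.2, 0.8], [0.8, 0.2]])

seq = [0, 0, 0, 1, 0, 0]
scores = {name: forward_loglik(seq, *m)
          for name, m in [("bubbly", bubbly), ("slug", slug)]}
best = max(scores, key=scores.get)
```

With these toy parameters, the mostly-0 sequence scores higher under the sticky "bubbly" model, so classification amounts to comparing per-model log-likelihoods.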
Marie Barbiero; Célia Rousseau; Charalambos Papaxanthis; Olivier White
Whether the central nervous system is capable of switching between contexts critically depends on experimental details. Motor control studies regularly adopt robotic devices to perturb the dynamics of a certain task. Other approaches investigate motor control by altering the gravitoinertial context itself, as in parabolic flights and human centrifuges. In contrast to conventional robotic experiments, where only the hand is perturbed, these gravitoinertial or immersive settings coherently plunge p...
Corona, R.; Montaldo, N.
Mediterranean ecosystems are typically heterogeneous, with contrasting plant functional types (PFT; woody vegetation and grass) that compete for water use. Due to the complexity of these ecosystems, there is still uncertainty in estimates of evapotranspiration (ET). Micrometeorological measurements (e.g. those based on the eddy covariance method, EC) are widely used for ET estimation, but in heterogeneous systems one of the main assumptions (surface homogeneity) is not preserved and the method may become less robust. In this sense, the coupled use of sap flow sensors for tree transpiration estimates, surface temperature sensors and remote sensing information for land surface characterization makes it possible to estimate the ET components and the energy balances of the three main land surface components (woody vegetation, grass and bare soil), overcoming the uncertainties of the EC method. The experimental site of Orroli, in Sardinia (Italy), is a typical Mediterranean heterogeneous ecosystem, monitored by the University of Cagliari since 2003. With the intent to perform an intensive field campaign for ET estimation, we verified the potential of coupling the eddy covariance (EC) method, infrared sensors and thermal dissipation methods (i.e. the sap flow technique) for tree transpiration estimates. As a first step, 3 commercial sap flux sensors were installed in a wild olive clump, where the skin temperature of one tree in the clump was monitored with an infrared transducer. Then, another 54 handmade sensors were installed in 14 clumps in the EC footprint. Measurements of diameter were recorded in all the clumps, and the sapwood depth was derived from measurements in several trees. The field ET estimate from the 4 commercial sensors was obtained assuming 4 different relationships between the monitored sap flux and the diameter of the species in the footprint. For the 54 handmade sensors, instead, a scaling procedure was applied based on the allometric relationships between sapwood area, diameter and
Cheng, Liang; Jiang, Yue; Ju, Hong; Sun, Jie; Peng, Jiajie; Zhou, Meng; Hu, Yang
Since the establishment of the first biomedical ontology, the Gene Ontology (GO), the number of biomedical ontologies has increased dramatically. Nowadays over 300 ontologies have been built, including the extensively used Disease Ontology (DO) and Human Phenotype Ontology (HPO). Because of its advantage in identifying novel relationships between terms, calculating similarity between ontology terms is one of the major tasks in this research area. Though similarities between terms within each ontology have been studied with in silico methods, term similarities across different ontologies have not been investigated as deeply. The latest method took advantage of a gene functional interaction network (GFIN) to explore such inter-ontology term similarities. However, it used only gene interactions and failed to make full use of the connectivity among the gene nodes of the network. In addition, all existing methods were designed specifically for GO, and their performance on the wider ontology community remains unknown. We propose a method, InfAcrOnt, to infer similarities between terms across ontologies utilizing the entire GFIN. InfAcrOnt builds a term-gene-gene network comprising ontology annotations and the GFIN, and acquires similarities between terms across ontologies by modeling the information flow within the network with a random walk. In our benchmark experiments on sub-ontologies of GO, InfAcrOnt achieves a high average area under the receiver operating characteristic curve (AUC) (0.9322 and 0.9309) and low standard deviations (1.8746e-6 and 3.0977e-6) on both human and yeast benchmark datasets, exhibiting superior performance. Meanwhile, comparisons of InfAcrOnt results with prior knowledge on pair-wise DO-HPO terms and pair-wise DO-GO terms show high correlations. The experimental results show that InfAcrOnt significantly improves the performance of inferring similarities between terms across ontologies on the benchmark set.
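The random-walk modeling of information flow can be sketched generically. The routine below is a plain random walk with restart on an undirected graph, not the InfAcrOnt implementation, and the tiny term-gene network is invented for illustration:

```python
def random_walk_with_restart(adj, seed, restart=0.3, iters=100):
    """Power-iterate a random walk with restart on an undirected graph
    given as an adjacency dict {node: [neighbours]}.  Returns the
    stationary visiting-probability distribution seen from `seed`.
    The restart rate and iteration count are illustrative choices."""
    nodes = list(adj)
    p = {v: (1.0 if v == seed else 0.0) for v in nodes}
    for _ in range(iters):
        nxt = {v: (restart if v == seed else 0.0) for v in nodes}
        for v in nodes:
            deg = len(adj[v])
            if deg == 0:
                nxt[seed] += (1 - restart) * p[v]  # dangling node: teleport
                continue
            share = (1 - restart) * p[v] / deg
            for u in adj[v]:
                nxt[u] += share
        p = nxt
    return p

# Hypothetical term-gene-gene network: two ontology terms annotated
# to overlapping gene sets, plus one gene-gene interaction edge.
graph = {
    "termA": ["g1", "g2"],
    "termB": ["g2", "g3"],
    "g1": ["termA", "g2"],
    "g2": ["termA", "termB", "g1"],
    "g3": ["termB"],
}
p = random_walk_with_restart(graph, "termA")
```

The steady-state probabilities rank nodes by proximity to the seed term; a cross-ontology term similarity can then be read off as, for example, the probability mass one term's walk places on the other term.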
The fear of nuclear energy, and more particularly of radioactive waste, is analyzed in its sociological context. Everybody agrees on the need for information; information is available, but there is a problem with its dissemination. Reactions of the public are analyzed, and journalists, scientists and teachers have a role to play.
Allan, M C
To place the fundamentals of clinical drug safety surveillance in a conceptual framework that will facilitate understanding and application of adverse drug event data to protect the health of the public and support a market for pharmaceutical manufacturers' products. Part I of this series provides a background for the discussion of drug safety by defining the basic terms and showing the flow of safety information through a pharmaceutical company. The customers for adverse drug event data are identified to provide a basis for providing quality service. The development of a drug product is briefly reviewed to show the evolution of safety data. Drug development and safety are defined by federal regulations. These regulations are developed by the FDA with information from pharmaceutical manufacturers. The intent of the regulations and the accompanying guidelines is described. An illustration from the news media is cited to show an alternative, positive approach to handling an adverse event report. This review uses primary sources from the federal laws (regulations), commentaries, and summaries. Very complex topics are briefly summarized in the text and additional readings are presented in an appendix. Secondary sources, ranging from newspaper articles to judicial summaries, illustrate the interpretation of adverse drug events and opportunities for drug safety surveillance intervention. The reference materials used were articles theoretically or practically applicable in the day-to-day practice of drug safety surveillance. The role of clinical drug safety surveillance in product monitoring and drug development is described. The process of drug safety surveillance is defined by the Food and Drug Administration regulations, product labeling, product knowledge, and database management. Database management is subdivided into the functions of receipt, retention, retrieval, and review of adverse event reports. Emphasis is placed on the dynamic interaction of the components.
Full Text Available Report by Dr. Egon Lichetenberger to the Governing Council of the Faculty on the specialization course in Pathological Anatomy sponsored by the Kellogg Foundation (Department of Pathology).
Witherspoon, P.A.; Wang, J.S.Y.; Iwai, K.; Gale, J.E.
The validity of the cubic law for laminar flow of fluids through open fractures consisting of parallel planar plates has been established by others over a wide range of conditions with apertures ranging down to a minimum of 0.2 μm. The law may be given in simplified form by Q/Δh = C(2b)³, where Q is the flow rate, Δh is the difference in hydraulic head, C is a constant that depends on the flow geometry and fluid properties, and 2b is the fracture aperture. The validity of this law for flow in a closed fracture where the surfaces are in contact and the aperture is being decreased under stress has been investigated at room temperature using homogeneous samples of granite, basalt, and marble. Tension fractures were artificially induced and the laboratory setup used radial as well as straight flow geometries. Apertures ranged from 250 μm down to 4 μm. The cubic law was found to be valid whether the fracture surfaces were held open or were being closed under stress, and the results are not dependent on rock type. Permeability was uniquely defined by fracture aperture and was independent of the stress history used in these investigations. The effects of deviations from the ideal parallel plate concept only cause an apparent reduction in flow and may be incorporated into the cubic law by replacing C by C/f. The factor f varied from 1.04 to 1.65 in these investigations. The model of a fracture that is being closed under normal stress is visualized as being controlled by the strength of the asperities that are in contact. These contact areas are able to withstand significant stresses while maintaining space for fluids to continue to flow as the fracture aperture decreases. The controlling factor is the magnitude of the aperture and since flow depends on (2b)³, a slight change in aperture evidently can easily dominate any other change in the geometry of the flow field
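The simplified cubic law above can be sketched numerically. The following is an illustrative Python sketch only; C, f, and the aperture values are hypothetical, not taken from the experiments:

```python
# Illustrative sketch (hypothetical constants): the cubic law for fracture
# flow, Q / Δh = C * (2b)**3, with deviations from the parallel-plate ideal
# absorbed into a roughness factor f (C replaced by C / f).

def cubic_law_flow(delta_h, aperture_2b, C=1.0, f=1.0):
    """Flow rate Q for head difference delta_h and fracture aperture 2b."""
    return delta_h * (C / f) * aperture_2b ** 3

# Halving the aperture reduces flow by a factor of 8 (= 2**3), which is why
# a slight aperture change dominates other changes in the flow geometry.
q_open = cubic_law_flow(delta_h=1.0, aperture_2b=250e-6)
q_closed = cubic_law_flow(delta_h=1.0, aperture_2b=125e-6)
print(q_open / q_closed)  # ratio ≈ 8
```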
Nakagawa, Hiroko; Yuno, Tomoji; Itho, Kiichi
Recently, a specific detection method for bacteria, a flow cytometry method using nucleic acid staining, was developed as a function of an automated urine formed-element analyzer for routine urine testing. Here, we performed a basic study of this bacteria analysis method. In addition, we compared it with urine sediment analysis, urine Gram staining, and quantitative urine culture, the conventional methods performed up to now. As a result, the bacteria analysis with the flow cytometry method using nucleic acid staining was excellent in reproducibility and more sensitive than microscopic urinary sediment analysis. Based on ROC curve analysis with the urine culture method as the standard, a cut-off level of 120/microL was defined, with sensitivity = 85.7% and specificity = 88.2%. In the scattergram analysis, accompanied by the urine culture method, for 90% of rod-positive samples, 80% of dots appeared in the area within 30 degrees of the X axis. In addition, one case even indicated that analysis of bacteria by flow cytometry and time-series analysis of the scattergram might be helpful for tracing the progress of the causative bacteria, so this information is expected to be clinically significant. Reporting bacteria information with the nucleic acid staining flow cytometry method is expected to contribute to rapid diagnosis and treatment of urinary tract infections. Besides, its contribution to screening examinations in microbiology and clinical chemistry will deliver a more efficient solution for urine analysis.
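The cut-off evaluation against culture as the reference standard reduces to a confusion-matrix calculation. The counts below are hypothetical, chosen only so the derived figures land near the abstract's reported sensitivity and specificity:

```python
# Hypothetical sketch: sensitivity and specificity of a bacteria-count
# cutoff (e.g. 120/microL) against urine culture as the gold standard.
# The counts are made up for illustration.

def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # true positives among culture-positive
    specificity = tn / (tn + fp)   # true negatives among culture-negative
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=90, fn=15, tn=150, fp=20)
print(round(sens, 3), round(spec, 3))  # 0.857 0.882
```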
... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.440 Section 154.440 Shipping COAST... Tank Type A § 154.440 Allowable stress. (a) The allowable stresses for an independent tank type A must... Commandant (CG-522). (b) A greater allowable stress than required in paragraph (a)(1) of this section may be...
... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.421 Section 154.421 Shipping COAST... § 154.421 Allowable stress. The allowable stress for the integral tank structure must meet the American Bureau of Shipping's allowable stress for the vessel's hull published in “Rules for Building and Classing...
... 34 Education 2 2010-07-01 2010-07-01 false Allowable costs. 304.21 Section 304.21 Education... Grantee § 304.21 Allowable costs. In addition to the allowable costs established in the Education... allowable expenditures by projects funded under the program: (a) Cost of attendance, as defined in Title IV...
... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Allowable costs. 215.27 Section 215.27... § 215.27 Allowable costs. For each kind of recipient, there is a set of Federal principles for determining allowable costs. Allowability of costs shall be determined in accordance with the cost principles...
... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Allowable costs. 80.15 Section 80.15... WILDLIFE RESTORATION AND DINGELL-JOHNSON SPORT FISH RESTORATION ACTS § 80.15 Allowable costs. (a) What are allowable costs? Allowable costs are costs that are necessary and reasonable for accomplishment of approved...
... 49 Transportation 4 2010-10-01 2010-10-01 false Allowable costs. 266.11 Section 266.11... TRANSPORTATION ACT § 266.11 Allowable costs. Allowable costs include only the following costs which are properly allocable to the work performed: Planning and program operation costs which are allowed under Federal...
Norri-Sederholm, Teija; Paakkonen, Heikki; Kurola, Jouni; Saranto, Kaija
In prehospital emergency medical services, one of the key factors in the successful delivery of appropriate care is the efficient management and supervision of the area's emergency medical services units. Paramedic field supervisors have an important role in this task. One of the key factors in the daily work of paramedic field supervisors is ensuring that they have enough of the right type of information when co-operating with other authorities and making decisions. However, a gap in information sharing still exists especially due to information overload. The aim of this study was to find out what type of critical information paramedic field supervisors need during multi-authority missions in order to manage their emergency medical services area successfully. The study also investigated both the flow of information, and interactions with the paramedic field supervisors and the differences that occur depending on the incident type. Ten paramedic field supervisors from four Finnish rescue departments participated in the study in January-March 2012. The data were collected using semi-structured interviews based on three progressive real-life scenarios and a questionnaire. Data were analysed using deductive content analysis. Data management and analysis were performed using Atlas.ti 7 software. Five critical information categories were formulated: Incident data, Mission status, Area status, Safety at work, and Tactics. Each category's importance varied depending on the incident and on whether it was about information needed or information delivered by the paramedic field supervisors. The main communication equipment used to receive information was the authority radio network (TETRA). However, when delivering information, mobile phones and TETRA were of equal importance. Paramedic field supervisors needed more information relating to area status. Paramedic field supervisors communicate actively with EMS units and other authorities such as Emergency Medical Dispatch
Randall, Allan D.; Freehafer, Douglas A.
A variety of watershed properties available in 2015 from geographic information systems were tested in regression equations to estimate two commonly used statistical indices of the low flow of streams, namely the lowest flows averaged over 7 consecutive days that have a 1 in 10 and a 1 in 2 chance of not being exceeded in any given year (7-day, 10-year and 7-day, 2-year low flows). The equations were based on streamflow measurements in 51 watersheds in the Lower Hudson River Basin of New York during the years 1958–1978, when the number of streamflow measurement sites on unregulated streams was substantially greater than in subsequent years. These low-flow indices are chiefly a function of the area of surficial sand and gravel in the watershed; more precisely, 7-day, 10-year and 7-day, 2-year low flows both increase in proportion to the area of sand and gravel deposited by glacial meltwater, whereas 7-day, 2-year low flows also increase in proportion to the area of postglacial alluvium. Both low-flow statistics are also functions of mean annual runoff (a measure of net water input to the watershed from precipitation) and area of swamps and poorly drained soils in or adjacent to surficial sand and gravel (where groundwater recharge is unlikely and riparian water loss to evapotranspiration is substantial). Small but significant refinements in estimation accuracy resulted from the inclusion of two indices of stream geometry, channel slope and length, in the regression equations. Most of the regression analysis was undertaken with the ordinary least squares method, but four equations were replicated by using weighted least squares to provide a more realistic appraisal of the precision of low-flow estimates. The most accurate estimation equations tested in this study explain nearly 84 and 87 percent of the variation in 7-day, 10-year and 7-day, 2-year low flows, respectively, with standard errors of 0.032 and 0.050 cubic feet per second per square mile. The equations
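The fitting step behind such regression equations can be sketched minimally. The study used multiple watershed properties and, in part, weighted least squares; this single-predictor ordinary-least-squares sketch with made-up data illustrates only the basic fit:

```python
# Illustrative sketch (hypothetical data): OLS fit of a low-flow index
# (e.g. the 7-day, 10-year low flow) against one watershed property,
# the area of glacial sand and gravel.

def ols_fit(x, y):
    """Return (slope, intercept) minimizing squared error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical watersheds: sand-and-gravel area (mi^2) vs 7Q10 (ft^3/s)
sand_gravel = [2.0, 5.0, 8.0, 12.0]
low_flow_7q10 = [0.5, 1.1, 1.7, 2.5]
slope, intercept = ols_fit(sand_gravel, low_flow_7q10)
print(round(slope, 3), round(intercept, 3))  # 0.2 0.1
```

Weighted least squares, as used for four of the study's equations, would scale each squared residual by a per-site weight before minimizing.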
Patient Participation at Health Care Conferences: Engaged Patients Increase Information Flow, Expand Propagation, and Deepen Engagement in the Conversation of Tweets Compared to Physicians or Researchers.
Utengen, Audun; Rouholiman, Dara; Gamble, Jamison G; Grajales, Francisco Jose; Pradhan, Nisha; Staley, Alicia C; Bernstein, Liza; Young, Sean D; Clauson, Kevin A; Chu, Larry F
Health care conferences present a unique opportunity to network, spark innovation, and disseminate novel information to a large audience, but the dissemination of information typically stays within very specific networks. Social network analysis can be adopted to understand the flow of information between virtual social communities and the role of patients within the network. The purpose of this study is to examine the impact engaged patients bring to health care conference social media information flow and how they expand dissemination and distribution of tweets compared to other health care conference stakeholders such as physicians and researchers. From January 2014 through December 2016, 7,644,549 tweets were analyzed from 1672 health care conferences with at least 1000 tweets that had registered in Symplur's Health Care Hashtag Project from 2014 to 2016. The tweet content was analyzed to create a list of the top 100 influencers by mention from each conference, who were then subsequently categorized by stakeholder group. Multivariate linear regression models were created using stepwise function building to identify factors explaining variability as predictor variables for the model in which conference tweets were taken as the dependent variable. Inclusion of engaged patients in health care conference social media was low compared to that of physicians and has not significantly changed over the last 3 years. When engaged patient voices are included in health care conferences, they greatly increase information flow as measured by total tweet volume (beta=301.6) compared to physicians (beta=137.3) and social media impressions created (beta=1,700,000) compared to physicians (beta=270,000). Social network analysis of hubs and authorities revealed that patients had statistically significantly higher hub scores (mean 8.26×10⁻⁴, SD 2.96×10⁻⁴) compared to other stakeholder groups' Twitter accounts (mean 7.19×10⁻⁴, SD 3.81×10⁻⁴; t(273.84)=4.302) in social media of health care
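The hub and authority scores referred to above come from HITS-style analysis: hubs point to good authorities, and authorities are pointed to by good hubs. A minimal pure-Python sketch on a made-up mention graph (all account names are hypothetical):

```python
# Minimal HITS sketch on a toy mention graph. An account that mentions many
# well-mentioned accounts earns a high hub score, mirroring the engaged-patient
# pattern the study describes. Edge list and names are invented.

def hits(edges, iterations=50):
    nodes = {n for e in edges for n in e}
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iterations):
        # authority: sum of hub scores of accounts mentioning this one
        auth = {n: sum(hub[s] for s, d in edges if d == n) for n in nodes}
        # hub: sum of authority scores of accounts this one mentions
        hub = {n: sum(auth[d] for s, d in edges if s == n) for n in nodes}
        asum, hsum = sum(auth.values()), sum(hub.values())
        auth = {n: v / asum for n, v in auth.items()}   # normalize
        hub = {n: v / hsum for n, v in hub.items()}
    return hub, auth

edges = [("patient", "md1"), ("patient", "md2"), ("patient", "researcher"),
         ("md1", "md2"), ("researcher", "md2")]
hub, auth = hits(edges)
print(max(hub, key=hub.get))    # patient
print(max(auth, key=auth.get))  # md2
```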
Pool, Sandra; Viviroli, Daniel; Seibert, Jan
Applications of runoff models usually rely on long and continuous runoff time series for model calibration. However, many catchments around the world are ungauged and estimating runoff for these catchments is challenging. One approach is to perform a few runoff measurements in a previously fully ungauged catchment and to constrain a runoff model by these measurements. In this study we investigated the value of such individual runoff measurements when taken at strategic points in time for applying a bucket-type runoff model (HBV) in ungauged catchments. Based on the assumption that a limited number of runoff measurements can be taken, we sought the optimal sampling strategy (i.e. when to measure the streamflow) to obtain the most informative data for constraining the runoff model. We used twenty gauged catchments across the eastern US, made the assumption that these catchments were ungauged, and applied different runoff sampling strategies. All tested strategies consisted of twelve runoff measurements within one year and ranged from simply using monthly flow maxima to a more complex selection of observation times. In each case the twelve runoff measurements were used to select 100 best parameter sets using a Monte Carlo calibration approach. Runoff simulations using these 'informed' parameter sets were then evaluated for an independent validation period in terms of the Nash-Sutcliffe efficiency of the hydrograph and the mean absolute relative error of the flow-duration curve. Model performance measures were normalized by relating them to an upper and a lower benchmark representing a well-informed and an uninformed model calibration. The hydrographs were best simulated with strategies including high runoff magnitudes as opposed to the flow-duration curves that were generally better estimated with strategies that captured low and mean flows. The choice of a sampling strategy covering the full range of runoff magnitudes enabled hydrograph and flow-duration curve
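The Nash-Sutcliffe efficiency used above to evaluate simulated hydrographs can be computed directly: NSE = 1 - SSE / variance of the observations, where 1 is a perfect fit and 0 means the simulation is no better than predicting the observed mean. The data below are made up:

```python
# Sketch (hypothetical flows): Nash-Sutcliffe efficiency of a simulated
# hydrograph against observations. In a Monte Carlo calibration, parameter
# sets with the highest NSE against the measured flows would be retained.

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / variance

obs = [1.0, 3.0, 5.0, 4.0, 2.0]
sim = [1.2, 2.8, 4.9, 4.2, 2.1]
print(round(nash_sutcliffe(obs, sim), 3))  # 0.986
```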
Listerud, J.; Altas, S.W.; Axel, L.
One of the unique characteristics of magnetic resonance imaging (MRI) is its depiction of flow, even without the administration of intravascular contrast agents. Flow-related phenomena were recognized early in the development of nuclear magnetic resonance, well before imaging techniques were even devised. The appearance of flowing fluid (e.g., blood or CSF) is important to understand for several reasons. First, its signal intensity is quite variable, even in normal physiologic states, so the possibility of misinterpreting normal findings as representing pathologic conditions, such as vascular thrombosis, is reduced if one has a solid conceptualization of the physical basis of flow effects. Second, signal emanating from flowing spins often generates significant artifacts that can obscure anatomy and degrade images, thereby reducing the radiologist's ability to interpret images and identify lesions. The understanding of flow artifacts allows one to recognize them and implement maneuvers to eliminate or compensate for such effects. Third, it has become apparent that the signal information from flow can be exploited to provide previously unavailable physiologic information
Information is a valuable resource; anybody that has ... complements the human memory thereby allowing the flow of .... components of managing these important ..... 7th ed. London: Gower. Pugh, M.J. (1994). Providing reference services for.
U.S. Environmental Protection Agency — The Allowances Query Wizard is part of a suite of Clean Air Markets-related tools that are accessible at http://camddataandmaps.epa.gov/gdm/index.cfm. The Allowances...
U.S. Environmental Protection Agency — The Allowance Holdings and Transfers Data Inventory contains measured data on holdings and transactions of allowances under the NOx Budget Trading Program (NBP), a...
Pelc, N.J.; Spritzer, C.E.; Lee, J.N.
A rapid, phase-contrast MR imaging method of imaging flow has been implemented. The method, called VIGRE (velocity imaging with gradient-recalled echoes), consists of two interleaved, narrow-flip-angle, gradient-recalled acquisitions. One is flow compensated, while the second has a specified flow encoding (both peak velocity and direction) that causes signals to contain additional phase in proportion to velocity in the specified direction. Complex image data from the first acquisition are used as a phase reference for the second, yielding immunity from phase accumulation due to causes other than motion. Images with pixel values equal to MΔΘ, where M is the magnitude of the flow-compensated image and ΔΘ is the phase difference at the pixel, are produced. The magnitude weighting provides additional vessel contrast, suppresses background noise, maintains the flow direction information, and still allows quantitative data to be retrieved. The method has been validated with phantoms and is undergoing initial clinical evaluation. Early results are extremely encouraging
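The magnitude-weighted phase difference MΔΘ can be illustrated for a single pixel with complex arithmetic; all values below are hypothetical:

```python
# Toy sketch of the phase-reference idea behind VIGRE: the phase of
# z_encoded * conj(z_compensated) isolates the velocity-induced phase,
# cancelling any background phase common to both acquisitions.
import cmath

def velocity_weighted_pixel(z_compensated, z_encoded):
    """Return M * ΔΘ: magnitude-weighted phase difference for one pixel."""
    magnitude = abs(z_compensated)
    delta_theta = cmath.phase(z_encoded * z_compensated.conjugate())
    return magnitude * delta_theta

# Shared background phase (0.5 rad) cancels; only flow phase (0.3 rad) remains.
background, flow_phase, M = 0.5, 0.3, 2.0
z1 = M * cmath.exp(1j * background)
z2 = M * cmath.exp(1j * (background + flow_phase))
print(round(velocity_weighted_pixel(z1, z2), 3))  # 0.6  (= M * ΔΘ)
```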
... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...
... cost principles applicable to the entity incurring the costs. Thus, allowability of costs incurred by... Governments.” The allowability of costs incurred by non-profit organizations is determined in accordance with... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...
... to the entity incurring the costs. Thus, allowability of costs incurred by State, local or federally..., “Cost Principles for State and Local Governments.” The allowability of costs incurred by non-profit...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...
... cost principles applicable to the entity incurring the costs. Thus, allowability of costs incurred by... at 2 CFR part 225. The allowability of costs incurred by non-profit organizations is determined in... at 2 CFR part 230. The allowability of costs incurred by institutions of higher education is...
... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...
... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...
... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.428 Section 154.428 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS FOR... § 154.428 Allowable stress. The membrane tank and the supporting insulation must have allowable stresses...
... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.447 Section 154.447 Shipping COAST... Tank Type B § 154.447 Allowable stress. (a) An independent tank type B designed from bodies of revolution must have allowable stresses 3 determined by the following formulae: 3 See Appendix B for stress...
... 42 Public Health 3 2010-10-01 2010-10-01 false Allowable costs. 417.802 Section 417.802 Public... PLANS Health Care Prepayment Plans § 417.802 Allowable costs. (a) General rule. The costs that are considered allowable for HCPP reimbursement are the same as those for reasonable cost HMOs and CMPs specified...
... 45 Public Welfare 3 2010-10-01 2010-10-01 false Allowable costs. 1180.56 Section 1180.56 Public... by a Grantee General Administrative Responsibilities § 1180.56 Allowable costs. (a) Determination of costs allowable under a grant is made in accordance with government-wide cost principles in applicable...
... costs. An institution's share of allowable costs may be in cash or in the form of services. The... 34 Education 3 2010-07-01 2010-07-01 false Allowable costs. 675.33 Section 675.33 Education... costs. (a)(1) Allowable and unallowable costs. Except as provided in paragraph (a)(2) of this section...
Sue E. Jackson
Studies that apply indigenous ecological knowledge to contemporary resource management problems are increasing globally; however, few of these studies have contributed to environmental water management. We interviewed three indigenous landowning groups in a tropical Australian catchment subject to increasing water resource development pressure and trialed tools to integrate indigenous and scientific knowledge of the biology and ecology of freshwater fish to assess their water requirements. The differences, similarities, and complementarities between the knowledge of fish held by indigenous people and scientists are discussed in the context of the changing socioeconomic circumstances experienced by indigenous communities of north Australia. In addition to eliciting indigenous knowledge that confirmed field fish survey results, the approach generated knowledge that was new to both science and indigenous participants. Indigenous knowledge influenced (1) the conceptual models developed by scientists to understand the flow ecology and (2) the structure of risk assessment tools designed to understand the vulnerability of particular fish to low-flow scenarios.
Martha, Cornelius T; Hoogendoorn, Jan-Carel; Irth, Hubertus; Niessen, Wilfried M A
Current development in catalyst discovery includes combinatorial synthesis methods for the rapid generation of compound libraries combined with high-throughput performance-screening methods to determine the associated activities. Of these novel methodologies, mass spectrometry (MS) based flow chemistry methods are especially attractive due to the ability to combine sensitive detection of the formed reaction product with identification of introduced catalyst complexes. Recently, such a mass spectrometry based continuous-flow reaction detection system was utilized to screen silver-adducted ferrocenyl bidentate catalyst complexes for activity in a multicomponent synthesis of a substituted 2-imidazoline. Here, we determine the merits of different ionization approaches by studying the combination of sensitive detection of product formation in the continuous-flow system with the ability to simultaneous characterize the introduced [ferrocenyl bidentate+Ag](+) catalyst complexes. To this end, we study the ionization characteristics of electrospray ionization (ESI), atmospheric-pressure chemical ionization (APCI), no-discharge APCI, dual ESI/APCI, and dual APCI/no-discharge APCI. Finally, we investigated the application potential of the different ionization approaches by the investigation of ferrocenyl bidentate catalyst complex responses in different solvents. Copyright © 2011 Elsevier B.V. All rights reserved.
Moss, Thomas; Ihlefeld, Curtis; Slack, Barry
thermal equilibrium with the test flow of GN2. The temperature drop of each branch from its "no flow" stable temperature peak to its stable "with flow" temperature will allow the operator to determine whether a minimum level of flow exists. An alternative operation has the operator turning on the software only long enough to record the ambient temperature of the tubing before turning on the heaters and initiating GN2 flow. The stable temperature of the heated tubing with GN2 flow is then compared with the ambient tubing temperature to determine if flow is present in each branch. To help quantify the level of flow in the manifolds, each branch will be bench calibrated to establish its thermal properties using the flow detection system and different flow rates. These calibration values can then be incorporated into the software application to provide more detailed flow rate information.
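The flow-detection decision described above reduces to a temperature-drop comparison per branch. This sketch uses made-up readings and a made-up threshold; the real system would use the bench-calibrated thermal properties of each branch:

```python
# Hypothetical sketch: flag GN2 flow in a heated branch when the drop from
# its stable "no flow" temperature to its "with flow" temperature exceeds
# a calibrated minimum. Threshold and readings are invented values.

def branch_has_flow(no_flow_temp, with_flow_temp, min_drop=5.0):
    """Flow is present if the heated tube cooled by at least min_drop (deg C)."""
    return (no_flow_temp - with_flow_temp) >= min_drop

readings = {"branch_a": (80.0, 62.0), "branch_b": (80.0, 78.5)}
for name, (t_no_flow, t_flow) in readings.items():
    print(name, branch_has_flow(t_no_flow, t_flow))
# branch_a: True (18 deg drop); branch_b: False (1.5 deg drop)
```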
Ricks, Wendell; Corker, Kevin
Primary Flight Display (PFD) information management and cockpit display of information management research is presented in viewgraph form. The information management problem in the cockpit, information management burdens, the key characteristics of an information manager, the interface management system handling the flow of information and the dialogs between the system and the pilot, and overall system architecture are covered.
Liboriussen, L.; Landkildehus, F.; Meerhoff, M.
design details, operating characteristics, and background information on a currently operating experimental flow-through mesocosm system that allows investigation of the interactions between simulated climate warming and eutrophication and their impacts on biological structure and ecosystem processes...
Mills, Freya; Petterson, Susan; Norman, Guy
Public health benefits are often a key political driver of urban sanitation investment in developing countries, however, pathogen flows are rarely taken systematically into account in sanitation investment choices. While several tools and approaches on sanitation and health risks have recently been developed, this research identified gaps in their ability to predict faecal pathogen flows, to relate exposure risks to the existing sanitation services, and to compare expected impacts of improvements. This paper outlines a conceptual approach that links faecal waste discharge patterns with potential pathogen exposure pathways to quantitatively compare urban sanitation improvement options. An illustrative application of the approach is presented, using a spreadsheet-based model to compare the relative effect on disability-adjusted life years of six sanitation improvement options for a hypothetical urban situation. The approach includes consideration of the persistence or removal of different pathogen classes in different environments; recognition of multiple interconnected sludge and effluent pathways, and of multiple potential sites for exposure; and use of quantitative microbial risk assessment to support prediction of relative health risks for each option. This research provides a step forward in applying current knowledge to better consider public health, alongside environmental and other objectives, in urban sanitation decision making. Further empirical research in specific locations is now required to refine the approach and address data gaps. PMID:29360775
... 42 Public Health 1 2010-10-01 2010-10-01 false Benefits: Stipends; dependency allowances; travel...; dependency allowances; travel allowances; vacation. Individuals awarded regular fellowships shall be entitled...) Stipend. (b) Dependency allowances. (c) When authorized in advance, separate allowances for travel. Such...
... 42 Public Health 1 2010-10-01 2010-10-01 false Payments: Stipends; dependency allowances; travel... FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.9 Payments: Stipends; dependency allowances; travel allowances. Payments for stipends, dependency allowances, and the travel allowances...
... allowable. The amount of compensation allowable is limited to the actual net reduction or loss of earnings or profits suffered. Calculations for net reductions or losses must clearly reflect adjustments for... available; (d) Any saved overhead or normal expenses not incurred as a result of the incident; and (e) State...
... to the type of entity incurring the cost as follows: (1) For-profit organizations. Allowability of costs incurred by for-profit organizations and those nonprofit organizations listed in Attachment C to... specifically authorized in the award document. (2) Other types of organizations. Allowability of costs incurred...
... accounting standards that comply with cost principles acceptable to the Federal agency. [53 FR 8069, 8087... LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 97.22 Allowable costs. (a... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...
... Procedures, or uniform cost accounting standards that comply with cost principles acceptable to the Federal... AGREEMENTS TO STATE AND LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 135.22 Allowable... principles. For each kind of organization, there is a set of Federal principles for determining allowable...
... Procedures or uniform cost accounting standards that comply with cost principles acceptable to ED. (b) The... OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial... principles for determining allowable costs. Allowability of costs are determined in accordance with the cost...
... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... STATE AND LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 13.22 Allowable costs. (a... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...
... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... TRIBAL GOVERNMENTS Post-Award Requirements Financial Administration § 85.22 Allowable costs. (a... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...
... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... GOVERNMENTS Post-Award Requirements Financial Administration § 1207.22 Allowable costs. (a) Limitation on use... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...
... accounting standards that comply with cost principles acceptable to the Federal agency. ... Post-Award Requirements Financial Administration § 33.22 Allowable costs. (a) Limitation on use of... allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization...
... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable projects. 631.84 Section 631.84... THE JOB TRAINING PARTNERSHIP ACT Disaster Relief Employment Assistance § 631.84 Allowable projects...) Shall be used exclusively to provide employment on projects that provide food, clothing, shelter and...
... accounting standards that comply with cost principles acceptable to the Federal agency. ... the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in...
... GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, OTHER NON-PROFIT, AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 14.27 Allowable costs. For each kind of... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...
... GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 2543.27 Allowable costs. For each kind... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...
... AND AGREEMENTS (INCLUDING SUBAWARDS) WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 70.27 Allowable costs. (a... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...
... ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 49.27 Allowable...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...
... AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, OTHER NON-PROFIT ORGANIZATIONS, AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 435.27 Allowable costs. For each kind... Organizations.” (c) Allowability of costs incurred by institutions of higher education is determined in...
... ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 30.27 Allowable...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...
Beverly, James E.; Xue, Lan; Lee, Chung-Shing
Reports on the use of the Internet and World Wide Web as a virtual technology market (VTM) for information and technology transfer. The project focuses on creating awareness of technology demand (problems) and linking it to technology supply (solutions) in the field of particle technology and multiphase processes in the chemical industry. Benefits…
Vášková, M.; Mejstříková, E.; Kalina, T.; Martinková, Patrícia; Omelka, M.; Trka, J.; Starý, J.; Hrušák, O.
Vol. 19, No. 5 (2005), pp. 876-878 ISSN 0887-6924 Source of funding: V - other public sources Keywords: transfer * genomics * information * cytometry * expression * discriminates * subtypes * acute * lymphoblastic * leukemia Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 6.612, year: 2005
Curcic-Blake, Branislava; Swart, Marte; Aleman, Andre
Everyday language is replete with descriptions of emotional events that people have experienced and wish to share with others. Such descriptions presumably rely on pairings of affective words and visual information (such as events and pictures) that have been learnt throughout one's development. To
Yang, Qinghua; Yang, Fan; Zhou, Chun
Purpose: The purpose of this paper is to investigate how the information about haze, a term used in China to describe the air pollution problem, is portrayed on Chinese social media by different types of organizations using the theoretical framework of the health belief model (HBM). Design/methodology/approach: A content analysis was conducted…
Murry, D.A.; Nan, G.D.; Harrington, B.M.
In recent years interest rates have fluctuated from exceptionally high levels in the early 1980s to their current levels, the lowest in two decades. Observers and analysts generally have assumed that allowed returns by regulatory commissions follow the movement of interest rates; indeed some analysts use a risk premium method to estimate the cost of common equity, assuming a constant and linear relationship between interest rates and the cost of common equity. That suggests we could expect a relatively stable relationship between interest rates and allowed returns, as well. However, a simple comparison of allowed returns and interest rates shows that this is not the case in recent years. The relationship between market interest rates and the returns allowed by commissions varies and is obviously a great deal more complicated. Empirically, there appears to be only a narrow range where market interest rates significantly affect the allowed returns on common stock set by state commissions, at least for electric and combination utilities. If rates are at historically low levels, allowed returns based largely on market rates will hasten subsequent rate filings, and commissions appear to look beyond the low rate levels. Conversely, it appears that regulators do not let historically high market rates determine allowed returns either. At either high or low interest levels, caution seems to be the policy
Rosenzweig, K.M.; Villarreal, J.A.
The Clean Air Act Amendments of 1990 (CAAA) established a sulfur dioxide emission allowance system to be implemented by the US Environmental Protection Agency (EPA). Under the two-phase implementation of the program, electric utilities responsible for approximately 70 percent of SO₂ emissions in the United States will be issued emission allowances, each representing authorization to emit one ton of sulfur dioxide during a specified calendar year or a later year. Allowances will be issued to utilities with electric-generating units affected by the CAAA limits, as well as to certain entities which may choose to opt in to the program. Each utility or other emission source must hold a number of allowances at least equal to its total SO₂ emissions during any given year. Unused allowances may be sold, traded, or held in inventory for use against SO₂ emissions in future years. Anyone can buy and hold allowances, including affected utilities, non-utility companies, SO₂ allowance brokers and dealers, environmental groups, and individuals. During Phase I of the program, allowances equivalent to approximately 6.4 million tons of SO₂ emissions will be allocated annually to a group of 110 large, high-SO₂-emitting power plants. In Phase II, virtually all power-generating utilities (representing approximately 99.4 percent of total US utility emissions) will be subject to the program. The number of allowances issued will increase to approximately 8.9 million a year, with certain special allocations raising the actual number issued to 9.48 million between 2000 and 2009, and 8.95 million yearly thereafter. Thus, the CAAA goal of annual emissions of 9 million tons should be achieved by 2010, when virtually all US emission sources will be participating in the program
This paper provides a short history of family allowances and documents the fact that Keynes supported family allowances as early as the 1920s, continuing through the 1930s and early 1940s. Keynes saw this policy as a way to help households raise their children and also as a way to increase consumption without reducing business investment. The paper goes on to argue that a policy of family allowances is consistent with Keynesian economics. Finally, the paper uses the Luxembourg Income Study to...
Using a large sample of firms listed on the Korea Stock Exchange over 1998–2007, this study investigates whether and how trading by foreign and domestic institutional investors improves the extent to which firm-specific information is incorporated into stock prices, captured by stock price synchronicity. We find, first, that stock price synchronicity decreases significantly with the intensity of trading by foreign investors and domestic institutional investors. Second, trading by foreign investors facilitates the incorporation of firm-specific information into stock prices to a greater extent than trading by aggregate domestic institutions. Third, among domestic institutions with differing investment horizons, short-term investing institutions, such as securities and investment trust companies, play a more important role in incorporating firm-specific information into stock prices via their trading activities, compared with long-term investing institutions, such as banks and insurance companies. Finally, we provide evidence suggesting that trading by foreign and domestic short-term institutions reduces the extent of accrual mispricing. Our results are robust to a variety of sensitivity checks.
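Stock price synchronicity is commonly operationalized as the logistic transform of the R² from a market-model regression of firm returns on market returns; lower values indicate that more firm-specific information is impounded in prices. A minimal sketch of that standard measure (the study's exact specification is not given here, and all function and variable names are illustrative):

```python
import numpy as np

def synchronicity(stock_returns, market_returns):
    """Stock price synchronicity as the logistic transform of the R-squared
    from a market-model regression: SYNCH = ln(R^2 / (1 - R^2)).
    Lower values indicate more firm-specific information in prices."""
    x = np.column_stack([np.ones_like(market_returns), market_returns])
    beta, *_ = np.linalg.lstsq(x, stock_returns, rcond=None)
    resid = stock_returns - x @ beta
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((stock_returns - stock_returns.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return np.log(r2 / (1.0 - r2))

rng = np.random.default_rng(0)
mkt = rng.normal(0, 0.01, 250)                      # one year of daily market returns
stock = 0.8 * mkt + rng.normal(0, 0.01, 250)        # partly market-driven stock returns
print(synchronicity(stock, mkt))
```

A stock whose returns are mostly idiosyncratic noise yields a lower (more negative) synchronicity than one tracking the market closely.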
Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion, with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). It is the key result of our analysis that the absolute EMD responses, i.e. the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
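The correlation-type EMD at the core of such models can be sketched in a few lines: each detector multiplies the delayed signal of one photoreceptor with the undelayed signal of its neighbour, and subtracts the mirror-symmetric half-detector. A minimal numpy illustration (a one-frame delay stands in for the temporal low-pass filter; the grating stimulus and parameters are illustrative, not the paper's):

```python
import numpy as np

def emd_response(frames, dt_shift=1, dx_shift=1):
    """Correlation-type (Hassenstein-Reichardt) elementary motion detector
    applied to a (time, space) luminance array. Each EMD correlates the
    delayed signal of one photoreceptor with the undelayed signal of its
    neighbour, and subtracts the mirror-symmetric half-detector."""
    delayed = np.roll(frames, dt_shift, axis=0)            # temporal delay (low-pass stand-in)
    left, right = frames[:, :-dx_shift], frames[:, dx_shift:]
    d_left, d_right = delayed[:, :-dx_shift], delayed[:, dx_shift:]
    # opponent subtraction of the two half-detectors; sign encodes direction
    resp = d_left * right - left * d_right
    return resp[dt_shift:]                                 # drop wrapped-around first samples

# a sine grating drifting rightward across 32 photoreceptors for 64 time steps
t = np.arange(64)[:, None]
x = np.arange(32)[None, :]
grating = np.sin(2 * np.pi * (x - t) / 8.0)
resp = emd_response(grating)
print(resp.mean())  # positive mean -> rightward motion detected
```

Reversing the drift direction flips the sign of the mean response, which is the directional opponency the abstract relies on.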
Habibnejad Korayem, M.; Ghariblu, H.
This paper develops a computational technique for finding the maximum allowable load of a mobile manipulator during a given trajectory. The maximum allowable load that can be achieved by a mobile manipulator during a given trajectory is limited by a number of factors; the dynamic properties of the mobile base and mounted manipulator, their actuator limitations, and the additional constraints applied to resolve the redundancy are probably the most important. To resolve the extra D.O.F. introduced by the base mobility, additional constraint functions are proposed directly in the task space of the mobile manipulator. Finally, in two numerical examples involving a two-link planar manipulator mounted on a differentially driven mobile base, application of the method to determining the maximum allowable load is verified. The simulation results demonstrate that the maximum allowable load on a desired trajectory does not have a unique value and depends directly on the additional constraint functions applied to resolve the motion redundancy
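Under pure actuator-torque limits, the idea of a trajectory-dependent maximum allowable load can be sketched as follows: at each trajectory point, the torque margin left after moving the unloaded arm, divided by the extra torque each kilogram of payload demands, bounds the load; the trajectory-wide allowable load is the minimum of those bounds. This is a simplified, hypothetical sketch of the torque-limit component only (it ignores the base dynamics and redundancy-resolution constraints the paper treats; all profiles are illustrative):

```python
import numpy as np

def max_allowable_load(tau_nominal, tau_per_kg, tau_max):
    """At each trajectory point the largest end-effector load (kg) the
    actuator can carry is (tau_max - |tau_nominal|) / tau_per_kg; the
    allowable load for the whole trajectory is the minimum over points."""
    margins = (tau_max - np.abs(tau_nominal)) / tau_per_kg
    return margins.min()

# hypothetical joint-torque profile along a trajectory (N*m) and limits
t = np.linspace(0, 1, 50)
tau_nominal = 20 * np.sin(np.pi * t)          # torque needed to move the unloaded arm
tau_per_kg = 2.0 + 0.5 * np.cos(np.pi * t)    # extra torque per kg of payload, pose-dependent
print(max_allowable_load(tau_nominal, tau_per_kg, tau_max=30.0))
```

Because both the nominal torque and the per-kilogram sensitivity vary along the path, the binding point is trajectory-dependent, mirroring the paper's observation that the maximum allowable load is not a unique value.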
...) MARINE POLLUTION FINANCIAL RESPONSIBILITY AND COMPENSATION OIL SPILL LIABILITY TRUST FUND; CLAIMS... allowable under paragraph (a) of this section must be reduced by— (1) All compensation made available to the... under § 136.235. Government Revenues ...
Fahad Khalil; Jacques Lawarree; Sungho Yun
Rewards to prevent supervisors from accepting bribes create incentives for extortion. This raises the question whether a supervisor who can engage in bribery and extortion can still be useful in providing incentives. By highlighting the role of teamwork in forging information, we present a notion of soft information that makes supervision valuable. We show that a fear of inducing extortion may make it optimal to allow bribery, but extortion is never tolerated. Even though both increase incen...
Changes of neural oscillations at a variety of physiological rhythms are effectively associated with cognitive performance. The present study investigated whether the directional indices of neural information flow (NIF) could be used to symbolize the synaptic plasticity impairment in the hippocampal CA3-CA1 network in a rat model of melamine. Male Wistar rats were employed, with melamine administered at a dose of 300 mg/kg/day for 4 weeks. Behavior was measured by the Morris water maze (MWM) test. Local field potentials (LFPs) were recorded before long-term potentiation (LTP) induction. Generalized partial directed coherence (gPDC) and phase-amplitude coupling conditional mutual information (PAC_CMI) were used to measure the unidirectional indices in both theta and low gamma (LG, ~30-50 Hz) oscillations. Our results showed that melamine induced cognition deficits consistent with the reduced LTP in the CA1 area. Phase locking values (PLVs) showed that the synchronization between CA3 and CA1 in both theta and LG rhythms was reduced by melamine. In both theta and LG rhythms, unidirectional indices were significantly decreased in melamine-treated rats while a similar variation trend was observed in LTP reduction, implying that the effects of melamine on cognitive impairment were possibly mediated via profound alterations of NIF on the CA3-CA1 pathway in the hippocampus. The results suggested that LFP activities at these rhythms were most likely involved in determining the alterations of information flow in the hippocampal CA3-CA1 network, which might be associated with the alteration of synaptic transmission to some extent.
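The phase locking value used in such studies is a standard synchronization index: the magnitude of the time-averaged unit phasor of the phase difference between two signals. A minimal sketch with synthetic phases (in practice the phases would be extracted from the CA3 and CA1 LFPs, e.g. by band-pass filtering and a Hilbert transform; all names and parameters here are illustrative):

```python
import numpy as np

def phase_locking_value(phase_a, phase_b):
    """Phase locking value between two phase time series:
    PLV = | mean over t of exp(i * (phase_a - phase_b)) |.
    1 = perfectly constant phase lag, 0 = no phase relationship."""
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

rng = np.random.default_rng(1)
t = np.linspace(0, 2, 2000)
theta = 2 * np.pi * 6 * t                                # 6 Hz "theta" phase in CA3
locked = theta + 0.3 + 0.05 * rng.normal(size=t.size)    # CA1 tracks CA3 with a fixed lag
unlocked = 2 * np.pi * rng.uniform(size=t.size)          # no consistent phase relationship
print(phase_locking_value(theta, locked))    # close to 1
print(phase_locking_value(theta, unlocked))  # close to 0
```

A melamine-induced drop in CA3-CA1 synchronization, as reported in the abstract, would show up as a reduction of this index in the theta and low-gamma bands.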
Estimated trends in emission allowance (EA) values have been of interest to all those affected by the Clean Air Act Amendments of 1990 since it became law in 1990. The authors published estimates of the values of EAs in December 1991, and revised their estimate in November 1992. The summary trends of the 1992 estimate are shown here. General estimates such as these are no longer useful. Everyone directly involved in complying with the Act or in buying and selling allowances has developed their own outlook on EA values. Many recent trades have been publicized. The prices from the first auction are also well known. Therefore this article is concerned only with what might happen in the long run. Once Phase 2 compliance is essentially complete and emissions roughly match Emission Allowance allocations of some 9.8 million tons annually, what pressures will there be on prices? What will be the direction of values after Phase 2 is in balance?
The SO₂ tradable allowance program has been introduced into an electric industry undergoing dramatic changes. Entry of nonutilities into the industry and the emergence of stranded costs are two major changes that are shown to have an impact on the market for allowances and the industry's incentives to switch to cleaner fuels. The degree of impact depends on the extent to which consumers bypass traditional utilities and buy from entrants, and on public utility commission policies regarding the recovery of stranded costs. In turn, the amount of stranded costs depends on fuel switching. The results follow from simulations of a two-utility model that illustrate the qualitative effects of changing policies
Bai, Wei; Yang, Hui; Yu, Ao; Xiao, Hongyun; He, Linkuan; Feng, Lei; Zhang, Jie
The leakage of confidential information is one of the important issues in the network security area. Elastic Optical Networks (EON), a promising technology in the optical transport network, are under threat from eavesdropping attacks. There is a great demand to support confidential information service (CIS) and to design efficient security strategies against eavesdropping attacks. In this paper, we propose a solution to cope with eavesdropping attacks in routing and spectrum allocation. Firstly, we introduce probability theory to describe the eavesdropping issue and achieve awareness of eavesdropping attacks. Then we propose an eavesdropping-aware routing and spectrum allocation (ES-RSA) algorithm to guarantee information security. To further improve security and network performance, we employ multi-flow virtual concatenation (MFVC) and propose an eavesdropping-aware MFVC-based secure routing and spectrum allocation (MES-RSA) algorithm. The presented simulation results show that the two proposed RSA algorithms can both achieve greater security against eavesdropping attacks and that MES-RSA can also improve network performance efficiently.
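The probabilistic view of eavesdropping can be made concrete with a small routing sketch: if each link e is tapped independently with probability p_e, the end-to-end interception probability is 1 − ∏(1 − p_e), and taking −ln(1 − p_e) as the link weight turns "find the safest route" into an ordinary shortest-path problem. This is a hypothetical illustration of the probabilistic idea only, not the paper's ES-RSA or MES-RSA algorithms (which also handle spectrum allocation):

```python
import heapq
import math

def safest_path(edges, src, dst):
    """Route minimizing end-to-end eavesdropping probability 1 - prod(1 - p_e).
    With edge weight -ln(1 - p_e) this is a shortest-path problem (Dijkstra)."""
    graph = {}
    for u, v, p in edges:
        w = -math.log(1.0 - p)
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, math.inf):
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(heap, (d + w, v))
    path, node = [dst], dst
    while node != src:            # walk predecessors back to the source
        node = prev[node]
        path.append(node)
    return path[::-1], 1.0 - math.exp(-dist[dst])

edges = [("A", "B", 0.05), ("B", "D", 0.05),   # riskier hops
         ("A", "C", 0.01), ("C", "D", 0.01)]   # better-protected route
path, p_tap = safest_path(edges, "A", "D")
print(path, round(p_tap, 4))
```

Here the router prefers the better-protected A-C-D route even though both routes have the same hop count.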
Emily Silver Huff
Privately owned woodlands are an important source of timber and ecosystem services in North America and worldwide. Impacts of management on these ecosystems and timber supply from these woodlands are difficult to estimate because complex behavioral theory informs the owner's management decisions. The decision-making environment consists of exogenous market factors, internal cognitive processes, and social interactions with fellow landowners, foresters, and other rural community members. This study seeks to understand how social interactions, information flow, and peer-to-peer networks influence timber harvesting behavior using an agent-based model. This theoretical model includes forested polygons in various states of 'harvest readiness' and three types of agents: forest landowners, foresters, and peer leaders (individuals trained in conservation who use peer-to-peer networking). Agent rules, interactions, and characteristics were parameterized with values from existing literature and an empirical survey of forest landowner attitudes, intentions, and demographics. The model demonstrates that as trust in foresters and peer leaders increases, the percentage of the forest that is harvested sustainably increases. Furthermore, peer leaders can serve to increase landowner trust in foresters. Model output and equations will inform forest policy and extension/outreach efforts. The model also serves as an important testing ground for new theories of landowner decision making and behavior.
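The trust-harvest mechanism described above can be caricatured in a few lines of agent-based code: each landowner carries a trust level, chance encounters with peer leaders nudge trust upward, and owners above a trust threshold harvest sustainably. This is a deliberately minimal, hypothetical sketch (thresholds, rates, and update rules are invented for illustration and are not the paper's parameterization):

```python
import random

def simulate(n_owners=500, initial_trust=0.3, n_steps=20,
             peer_leader_frac=0.05, seed=7):
    """Minimal agent-based sketch: landowners hold a trust level in
    foresters; meeting a peer leader nudges trust up; owners whose trust
    exceeds a threshold harvest sustainably. Returns the sustainable share."""
    rng = random.Random(seed)
    trust = [min(1.0, max(0.0, rng.gauss(initial_trust, 0.1)))
             for _ in range(n_owners)]
    for _ in range(n_steps):
        for i in range(n_owners):
            if rng.random() < peer_leader_frac:      # chance encounter with a peer leader
                trust[i] = min(1.0, trust[i] + 0.1)  # trust in foresters increases
    return sum(t > 0.5 for t in trust) / n_owners    # share harvesting sustainably

low = simulate(initial_trust=0.2)
high = simulate(initial_trust=0.6)
print(low, high)
```

Even this toy version reproduces the qualitative finding: raising baseline trust raises the sustainably harvested share.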
Chinnov, Evgeny A.; Guzanov, Vladimir V.; Cheverda, Vyacheslav; Markovich, Dmitry M.; Kabov, Oleg A.
An experimental study of two-phase flow in a short rectangular horizontal channel with a height of 440 μm has been performed. Characteristics of liquid motion inside the channel have been registered and measured by the Laser Induced Fluorescence technique. The new information has allowed a more precise determination of the characteristics of the churn regime and of the boundaries between different regimes of two-phase flow. It was shown that the formation of some two-phase flow regimes and the transitions between them are determined by instability of the flow in the lateral parts of the channel.
... Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Post-award Requirements Financial and Program Management § 226.27 Allowable costs. For each kind... organizations is determined in accordance with the provisions of OMB Circular A-122, “Cost Principles for Non...
... to that circular 48 CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting... Financial Administration § 143.22 Allowable costs. (a) Limitation on use of funds. Grant funds may be used... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of...
... accounting standards that comply with cost principles acceptable to the Federal agency. ... Requirements Financial Administration § 43.22 Allowable costs. (a) Limitation on use of funds. Grant funds may... the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a...
... to that circular 48 CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting... Financial Administration § 1470.22 Allowable costs. (a) Limitation on use of funds. Grant funds may be used... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of...
... accounting standards that comply with cost principles acceptable to the Federal agency. ... Requirements Financial Administration § 31.22 Allowable costs. (a) Limitation on use of funds. Grant funds may... the grantee or sub-grantee. (b) Applicable cost principles. For each kind of organization, there is a...
... CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting standards that comply... COOPERATIVE AGREEMENTS TO STATE AND LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 80.22... kind of organization, there is a set of Federal principles for determining allowable costs. For the...
... to that circular 48 CFR Part 31. Contract Cost Principles and Procedures, or uniform cost accounting... Financial Administration § 92.22 Allowable costs. (a) Limitation on use of funds. Grant funds may be used... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of...
... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... Regulations of the Department of Agriculture (Continued) OFFICE OF THE CHIEF FINANCIAL OFFICER, DEPARTMENT OF... GOVERNMENTS Post-Award Requirements Financial Administration § 3016.22 Allowable costs. (a) Limitation on use...
... part for labor, weatherization materials, and related matters for a renewable energy system, shall not... beginning in calendar year 2010 and the $3,000 average for renewable energy systems will be adjusted... 10 Energy 3 2010-01-01 2010-01-01 false Allowable expenditures. 440.18 Section 440.18 Energy...
... typical “provider” costs, and costs (such as marketing, enrollment, membership, and operation of the HMO... principles applicable to provider costs, as set forth in § 417.536. (2) The allowability of other costs is determined in accordance with principles set forth in §§ 417.538 through 417.550. (3) Costs for covered...
... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Allowable compensation. 295.21 Section 295.21 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT... no-cost crisis counseling services available in the community. FEMA will not reimburse for treatment...
... rehabilitation facility or sheltered workshop; independent instructor; institutional non-farm cooperative: Full...) VOCATIONAL REHABILITATION AND EDUCATION Vocational Rehabilitation and Employment Under 38 U.S.C. Chapter 31... rehabilitation program under 38 U.S.C. Chapter 31 will receive a monthly subsistence allowance at the rates in...
... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... COST PRINCIPLES FOR ASSISTANCE PROGRAMS Uniform Administrative Requirements for Grants and Cooperative... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...
... FOR AWARDS AND SUBAWARDS TO INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, OTHER NONPROFIT ORGANIZATIONS, AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 74.27... Organizations” and paragraph (b) of this section. The allowability of costs incurred by institutions of higher...
... Relations DEPARTMENT OF STATE CIVIL RIGHTS GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 145...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...
... INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 518.27 Allowable costs. For each kind of recipient, there is a set of... by institutions of higher education is determined in accordance with the provisions of OMB Circular A...
...) MARINE POLLUTION FINANCIAL RESPONSIBILITY AND COMPENSATION OIL SPILL LIABILITY TRUST FUND; CLAIMS PROCEDURES; DESIGNATION OF SOURCE; AND ADVERTISEMENT Procedures for Particular Claims § 136.217 Compensation... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Compensation allowable. 136.217...
...) MARINE POLLUTION FINANCIAL RESPONSIBILITY AND COMPENSATION OIL SPILL LIABILITY TRUST FUND; CLAIMS PROCEDURES; DESIGNATION OF SOURCE; AND ADVERTISEMENT Procedures for Particular Claims § 136.205 Compensation... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Compensation allowable. 136.205...
... otherwise indicated below, direct and indirect costs shall be charged in accordance with 41 CFR 29-70 and 41... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable costs. 632.37 Section 632.37 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR INDIAN AND NATIVE AMERICAN...
Hane, G.J.; Lewis, P.M.; Hutchinson, R.A.; Rubinger, B.; Willis, A.
The purpose of this study is to explore the status of R and D in Japan and the ability of US researchers to keep abreast of Japanese technical advances. US researchers familiar with R and D activities in Japan were interviewed in ten fields that are relevant to the more efficient use of energy: amorphous metals, biotechnology, ceramics, combustion, electrochemical energy storage, heat engines, heat transfer, high-temperature sensors, thermal and chemical energy storage, and tribology. The researchers were questioned about their perceptions of the strengths of R and D in Japan, comparative aspects of US work, and the quality of available information sources describing R and D in Japan. Of the ten fields, the researchers expressed a strong perception that significant R and D is under way in amorphous metals, biotechnology, and ceramics, and that the US competitive position in these technologies will be significantly challenged. Researchers also identified alternative emphases in Japanese R and D programs in these areas that provide Japan with stronger technical capabilities. For example, in biotechnology, researchers noted the significant Japanese emphasis on industrial-scale bioprocess engineering, which contrasts with a more meager effort in the US. In tribology, researchers also noted the strength of the chemical tribology research in Japan and commented on the effective mix of chemical and mechanical tribology research. This approach contrasts with the emphasis on mechanical tribology in the US.
Yoon, S.; Williams, J. R.; Juanes, R.; Kang, P. K.
Managed aquifer recharge (MAR) is becoming an important solution for ensuring sustainable water resources and mitigating saline water intrusion in coastal aquifers. Accurate estimates of hydrogeological parameters in subsurface flow and solute transport models are critical for making predictions and managing aquifer systems. In the presence of a density difference between the injected freshwater and ambient saline groundwater, the pressure field is coupled to the spatial distribution of salinity, and therefore experiences transient changes. The variable-density effects can be quantified by a mixed convection ratio between two characteristic types of convection: free convection due to density contrast, and forced convection due to a hydraulic gradient. We analyze the variable-density effects on the value of information of pressure and concentration data for saline aquifer characterization. An ensemble Kalman filter is used to estimate permeability fields by assimilating the data, and the performance of the estimation is analyzed in terms of the accuracy and the uncertainty of estimated permeability fields and the predictability of arrival times of breakthrough curves in a realistic push-pull setting. This study demonstrates that: 1. Injecting fluids with the velocity that balances the two characteristic convections maximizes the value of data for saline aquifer characterization; 2. The variable-density effects on the value of data for the inverse estimation decrease as the permeability heterogeneity increases; 3. The advantage of joint inversion of pressure and concentration data decreases as the coupling effects between flow and transport increase.
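The ensemble Kalman filter analysis step that underlies such data assimilation can be sketched compactly: the cross-covariance between the state ensemble and the predicted data defines a Kalman gain that pulls each member toward (perturbed) observations. A minimal linear-Gaussian sketch (a real MAR application would replace the linear observation operator with a variable-density flow and transport simulator; all names and numbers are illustrative):

```python
import numpy as np

def enkf_update(ensemble, observations, obs_operator, obs_err_std, rng):
    """One ensemble Kalman filter analysis step (perturbed-observations form).
    ensemble: (n_state, n_members) prior samples of e.g. log-permeability;
    obs_operator: matrix H mapping state to predicted data (linearized here
    for brevity)."""
    n_obs, n_ens = len(observations), ensemble.shape[1]
    pred = obs_operator @ ensemble                        # predicted data per member
    x_anom = ensemble - ensemble.mean(axis=1, keepdims=True)
    d_anom = pred - pred.mean(axis=1, keepdims=True)
    cxy = x_anom @ d_anom.T / (n_ens - 1)                 # state-data covariance
    cyy = d_anom @ d_anom.T / (n_ens - 1) + obs_err_std ** 2 * np.eye(n_obs)
    gain = cxy @ np.linalg.inv(cyy)                       # Kalman gain
    perturbed = observations[:, None] + obs_err_std * rng.normal(size=(n_obs, n_ens))
    return ensemble + gain @ (perturbed - pred)

rng = np.random.default_rng(2)
truth = np.array([1.0, -0.5, 0.3])                        # "true" state to recover
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 1.0]])          # two meters over three unknowns
obs = H @ truth
prior = rng.normal(0.0, 1.0, size=(3, 200))               # 200-member prior ensemble
posterior = enkf_update(prior, obs, H, 0.05, rng)
print(np.abs(posterior.mean(axis=1) - truth))             # per-component posterior-mean error
```

The update shrinks both the data misfit and the overall state error; components the data cannot distinguish (here the split between the second and third unknowns) stay uncertain, which is the kind of identifiability issue the value-of-information analysis probes.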
For application to industrial heating of large pools by immersed heat exchangers, the so-called maximum allowable (or "critical") heat flux is studied for unconfined tube bundles aligned horizontally in a pool without forced flow. In general, we are considering boiling after the pool reaches its saturation temperature rather than sub-cooled pool boiling, which should occur during early stages of transient operation. A combination of literature review and simple approximate analysis has been used. To date our main conclusion is that estimates of q''_chf are highly uncertain for this configuration
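For the flat-plate pool-boiling baseline against which bundle results are usually compared, Zuber's hydrodynamic correlation gives a closed-form estimate of the critical heat flux. A sketch of that standard correlation (the abstract's point is precisely that such estimates become highly uncertain for unconfined horizontal tube bundles; the property values below are approximate):

```python
import math

def zuber_chf(h_fg, rho_l, rho_g, sigma, g=9.81):
    """Zuber's flat-plate pool-boiling critical-heat-flux correlation:
        q''_chf = 0.131 * h_fg * rho_g**0.5 * (sigma * g * (rho_l - rho_g))**0.25
    All quantities in SI units; returns W/m^2."""
    return 0.131 * h_fg * math.sqrt(rho_g) * (sigma * g * (rho_l - rho_g)) ** 0.25

# Saturated water at atmospheric pressure (approximate property values)
q_chf = zuber_chf(h_fg=2.257e6,   # latent heat of vaporization, J/kg
                  rho_l=958.0,    # saturated liquid density, kg/m^3
                  rho_g=0.60,     # saturated vapour density, kg/m^3
                  sigma=0.059)    # surface tension, N/m
print(q_chf)  # roughly 1.1e6 W/m^2
```

Bundle geometry, confinement, and tube spacing can shift the true value well away from this flat-plate figure, which is the uncertainty the abstract emphasizes.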
Jeddi, Fatemeh Rangraz; Akbari, Hossein; Rasouli, Somayeh
Tele-homecare methods can be used to provide home care for the elderly, if information management is provided. The aim of this study was to compare the places and methods of the data collection and media that use Tele-homecare for the elderly in selected countries in 2015. A comparative-applied library study was conducted in 2015. The study population comprised five countries: Canada, Australia, England, Denmark, and Taiwan. The data collection tool was a checklist based on the objectives of the study. Persian and English papers from 1998 to 2014, related to the Electronic Health Record, home care, and the elderly, were extracted from authentic journals and reference books as well as academic and research websites. Data were collected by reviewing the papers. After collecting data, comparative tables were prepared and the weak and strong points of each case were investigated and analyzed in the selected countries. Clinical, laboratory, imaging and pharmaceutical data were obtained from hospitals, physicians' offices, clinics, pharmacies and long-term healthcare centers. Mobile and tablet-based technologies and personal digital assistants were used to collect data. Data were published via Internet, online and offline databanks, and data exchange and dissemination via registries and national databases. Managed care methods were telehealth management systems and point of service. For continuity of care, it is necessary to consider managed care and equipment with regard to obtaining data in various forms from various sources, sharing data with registries and national databanks as well as the Electronic Health Record. With regard to the emergence of wearable technology and its use in home care, it is suggested to study the integration of its data with Electronic Health Records.
Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.
Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex must begin with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and lead to financial risks for power supplying organizations. In addition, measurement errors may be connected with intentional distortion of measurements for reducing payment for using energy resources on the consumer's side, which leads to commercial losses of energy resources. The article presents a universal mathematical method for verifying the validity of measurement information in networks for transporting energy resources, such as electricity and heat, petroleum, gas, etc., based on state estimation theory. The energy resource transportation network is represented by a graph whose nodes correspond to producers and consumers, and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is connected with obtaining the calculated analogs of energy resources for all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, will fully satisfy the suitability condition for all state equations describing the energy resource transportation network. The state equations written in terms of calculated estimates will be free from residuals. The difference between a measurement and its calculated analog (estimate) is called an estimation remainder in estimation theory. Large values of estimation remainders are an indicator of high errors in particular energy resource measurements. By using the presented method it is possible to improve the validity of energy resource measurements, to estimate the transportation network observability, to eliminate
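The estimation idea can be illustrated with a tiny linear network: redundant meters over-determine the branch flows, a weighted least-squares fit produces the calculated estimates, and an abnormally large estimation remainder points at the faulty (or tampered-with) meter. A minimal sketch under the assumption of a linear measurement model (the toy network and meter values are illustrative):

```python
import numpy as np

def wls_state_estimate(H, z, sigma):
    """Weighted-least-squares state estimation for an energy-transport
    network: z = H x + e. Returns the estimated branch flows and the
    estimation remainders (each measurement minus its calculated estimate)."""
    W = np.diag(1.0 / sigma ** 2)
    x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
    remainders = z - H @ x_hat
    return x_hat, remainders

# Toy network: producer A feeds consumers B and C over two branches.
# State x = (flow A->B, flow A->C); five redundant meters observe it.
H = np.array([[1.0, 1.0],   # injection metered at A
              [1.0, 0.0],   # line meter on A->B
              [0.0, 1.0],   # line meter on A->C
              [1.0, 0.0],   # consumption metered at B
              [0.0, 1.0]])  # consumption metered at C
true_x = np.array([60.0, 40.0])
z = H @ true_x
z[3] -= 15.0                # meter at B under-reads (fault or tampering)
sigma = np.full(5, 1.0)
x_hat, r = wls_state_estimate(H, z, sigma)
suspect = int(np.argmax(np.abs(r)))
print(x_hat, r)
print("suspect meter index:", suspect)
```

The largest remainder lands on the distorted meter, which is exactly the bad-data indication the article's method generalizes to real electricity, heat, gas, and oil networks.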
Tabatabai, Patrik; Henke, Stefanie; Sušac, Katharina; Kisanga, Oberlin M E; Baumgarten, Inge; Kynast-Wolf, Gisela; Ramroth, Heribert; Marx, Michael
Strategies to improve maternal health in low-income countries are increasingly embracing partnership approaches between public and private stakeholders in health. In Tanzania, such partnerships are a declared policy goal. However, implementation remains challenging as unfamiliarity between partners and insufficient recognition of private health providers prevail. This hinders cooperation and reflects the need to improve the evidence base of private sector contribution. To map and analyse the capacities of public and private hospitals to provide maternal health care in southern Tanzania and the population reached with these services. A hospital questionnaire was applied in all 16 hospitals (public n=10; private faith-based n=6) in 12 districts of southern Tanzania. Areas of inquiry included selected maternal health service indicators (human resources, maternity/delivery beds), provider fees for obstetric services and patient turnover (antenatal care, births). Spatial information was linked to the 2002 Population Census dataset and a geographic information system to map patient flows and socio-geographic characteristics of service recipients. The contribution of faith-based organizations (FBOs) to hospital maternal health services is substantial. FBO hospitals are primarily located in rural areas and their patient composition places a higher emphasis on rural populations. Also, maternal health service capacity was more favourable in FBO hospitals. We approximated that 19.9% of deliveries in the study area were performed in hospitals and that the proportion of c-sections was 2.7%. Mapping of patient flows demonstrated that women often travelled far to seek hospital care and showed where catchment areas of public and FBO hospitals overlap. We conclude that the important contribution of FBOs to maternal health services and capacity as well as their emphasis on serving rural populations makes them promising partners in health programming. Inclusive partnerships could increase
A final analysis of Jπ = 0⁺ → 0⁺ superallowed Fermi transitions yields |V_ud|² = 0.9500 ± 0.0007 and |V_ud|² + |V_us|² + |V_ub|² = 0.9999 ± 0.0011, with the operational vector coupling constant G_V/(ℏc)³ = (1.15052 ± 0.00021) × 10⁻⁵ GeV⁻².
Fawcett, Tina; Hvelplund, Frede; Meyer, Niels I
The chapter highlights the importance of introducing new, efficient schemes for mitigation of global warming. One such scheme is Personal Carbon Allowances (PCA), whereby individuals are allotted a tradable ration of CO2 emission per year. This chapter reviews the fundamentals of PCA and analyzes its merits and problems. The United Kingdom and Denmark have been chosen as case studies because the energy situation and the institutional setup are quite different between the two countries.
Kahle, Sue C.; Caldwell, Rodney R.; Bartolino, James R.
The U.S. Geological Survey, in cooperation with the Idaho Department of Water Resources and the Washington Department of Ecology, compiled and described geologic, hydrologic, and ground-water flow modeling information about the Spokane Valley-Rathdrum Prairie (SVRP) aquifer in northern Idaho and northeastern Washington. Descriptions of the hydrogeologic framework, water-budget components, ground- and surface-water interactions, computer flow models, and further data needs are provided. The SVRP aquifer, which covers about 370 square miles including the Rathdrum Prairie, Idaho, and the Spokane valley and Hillyard Trough, Washington, was designated a Sole Source Aquifer by the U.S. Environmental Protection Agency in 1978. Continued growth, water management issues, and potential effects on water availability and water quality in the aquifer and in the Spokane and Little Spokane Rivers have illustrated the need to better understand and manage the region's water resources. The SVRP aquifer is composed of sand, gravel, cobbles, and boulders primarily deposited by a series of catastrophic glacial outburst floods from ancient Glacial Lake Missoula. The material deposited in this high-energy environment is coarser-grained than is typical for most basin-fill deposits, resulting in an unusually productive aquifer with well yields as high as 40,000 gallons per minute. In most places, the aquifer is bounded laterally by bedrock composed of granite, metasedimentary rocks, or basalt. The lower boundary of the aquifer is largely unknown except along the margins or in shallower parts of the aquifer where wells have penetrated its entire thickness and reached bedrock or silt and clay deposits. Based on surface geophysics, the thickness of the aquifer is about 500 feet near the Washington-Idaho state line, but it is more than 600 feet within the Rathdrum Prairie and more than 700 feet in the Hillyard Trough based on drilling records. Depth to water in the aquifer is greatest in the northern
Vasilikos, Panagiotis; Nielson, Flemming; Nielson, Hanne Riis
Security goals for confidentiality and integrity are often enforced through notions of security based on information flow control, such as non-interference, which provide strong guarantees that no information is leaked; however, many cyber-physical systems intentionally leak some information in order to achieve their purposes. In this paper, we develop a formal approach to information flow for timed automata that allows intentional information leaks. The security of a timed automaton is then defined using a bisimulation relation that takes account of the non-determinism and the clocks of timed automata. Finally, we define an algorithm that traverses a timed automaton and imposes information flow constraints on it, and we prove that our algorithm is sound with respect to our security notion.
Aijälä, G; Lumley, D
Tighter discharge permits often require wastewater treatment plants to maximize the utilization of available facilities in order to meet these limits cost-effectively. Important aspects are minimizing internal disturbances and using available information in a smart way to improve plant performance. In this study, flow control throughout a large, highly automated wastewater treatment plant (WWTP) was implemented in order to reduce internal disturbances and to provide a firm foundation for more advanced process control. A modular flow control system was constructed based on existing instrumentation and soft-sensor flow models. Modules were constructed for every unit process in water treatment and integrated into a plant-wide model. The flow control system is used to automatically control recirculation flows and bypass flows at the plant. The system was also successful in making accurate flow estimations at points in the plant where it is not possible to have conventional flow meter instrumentation. The system provides fault detection for physical flow measuring devices. The modular construction allows easy adaptation for new unit processes added to the treatment plant.
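The soft-sensor flow estimation and meter fault detection described in this abstract can be sketched as a steady-state mass balance over a unit process. The function names, flow values, and 10% tolerance below are illustrative assumptions, not details from the paper:

```python
def estimate_flow(inflow, recirculation, bypass):
    """Estimate the unmetered outflow of a unit process from a
    steady-state mass balance: what enters must leave.
    All flows in m^3/h; names and values are illustrative."""
    return inflow + recirculation - bypass

def check_meter(measured, estimated, tolerance=0.1):
    """Simple fault detection: flag a physical flow meter whose
    reading deviates from the mass-balance estimate by more than
    a relative tolerance (10% is an arbitrary example)."""
    if estimated == 0:
        return measured == 0
    return abs(measured - estimated) / abs(estimated) <= tolerance

# A clarifier receiving 1200 m^3/h, with 150 m^3/h of return flow
# added and 100 m^3/h bypassed around the unit:
outflow = estimate_flow(1200.0, 150.0, 100.0)   # 1250.0 m^3/h
meter_ok = check_meter(1300.0, outflow)          # within 10% of estimate
```

Chaining such per-unit balances into a plant-wide model is what lets flows be estimated at points with no physical instrumentation.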
Bölte, Jens; Böhl, Andrea; Dobel, Christian; Zwitserlood, Pienie
In three experiments, participants named target pictures by means of German compound words (e.g., Gartenstuhl-garden chair), each accompanied by two different distractor pictures (e.g., lawn mower and swimming pool). Targets and distractor pictures were semantically related either associatively (garden chair and lawn mower) or by a shared semantic category (garden chair and wardrobe). Within each type of semantic relation, target and distractor pictures either shared morpho-phonological (word-form) information (Gartenstuhl with Gartenzwerg, garden gnome, and Gartenschlauch, garden hose) or not. A condition with two completely unrelated pictures served as baseline. Target naming was facilitated when distractor and target pictures were morpho-phonologically related. This is clear evidence for the activation of word-form information of distractor pictures. Effects were larger for associatively than for categorically related distractors and targets, which constitute evidence for lexical competition. Mere categorical relatedness, in the absence of morpho-phonological overlap, resulted in null effects (Experiments 1 and 2), and only speeded target naming when effects reflect only conceptual, but not lexical processing (Experiment 3). Given that distractor pictures activate their word forms, the data cannot be easily reconciled with discrete serial models. The results fit well with models that allow information to cascade forward from conceptual to word-form levels.
Pirnia, E.; Jakobsson, A.; Gudmundson, E.
The examination of blood flow inside the body may yield important information about vascular anomalies, such as possible indications of, for example, stenosis. Current medical ultrasound systems suffer from only allowing measurement of the blood flow velocity along the direction of irradiation, posing natural difficulties due to the complex behaviour of blood flow and the natural orientation of most blood vessels. Recently, a transversal modulation scheme was introduced to induce an oscillation along the transversal direction as well, thereby also allowing the measurement of the transversal blood flow. In this paper, we propose a novel data-adaptive blood flow estimator exploiting this modulation scheme. Using realistic Field II simulations, the proposed estimator is shown to achieve a notable performance improvement as compared to current state-of-the-art techniques.
Kumar, Surender; Managi, Shunsuke
The US Clean Air Act Amendments introduced an emissions trading system to regulate SO2 emissions. This study finds that changes in SO2 emission prices are related to innovations induced by these amendments. We find that electricity-generating plants were able to increase electricity output and reduce emissions of SO2 and NOx from 1995 to 2007 due to the introduction of the allowance trading system. However, compared to the approximately 8% per year of exogenous technological progress, the induced effect is relatively small, and the contribution of the induced effect to overall technological progress is about 1-2%. (author)
Irani Lauer Lellis
The practice of giving an allowance is used by parents in different parts of the world and can contribute to the economic education of children. This study aimed to investigate the purposes of the allowance with 32 parents of varying incomes. We used the focus group technique and the Alceste software to analyze the data. The results comprised two classes related to the use of the allowance. These classes covered the allowance's role in socialization and education, serving as an instrument of reward but sometimes encouraging bad habits in children. Parents' justifications concerning the amount of money to be given to the children, and when to stop giving an allowance, were also highlighted. Keywords: allowance; economic socialization; parenting practices.
The healthcare field contains a multitude of opportunities for science communication. Given the many stakeholders dancing together in a multidirectional tango of communication, we need to ask to what extent the deficit model applies to the health field. History dictates that healthcare professionals are the holders of all knowledge, and that patients and other stakeholders are the ones who need the scientific information communicated to them. This essay argues otherwise, in part due to the rise of shared decision-making, with patients and other stakeholders acting as partners in healthcare. The traditional deficit model in health held that: (1) doctors were experts and patients were consumers; (2) it is impossible for the public to grasp the many disciplines of knowledge in medicine; (3) if experts have trouble keeping up with medical research, then the public surely can't keep up; and (4) it is safer for healthcare professionals to communicate to the public using a deficit model. However, with the rise of partnerships with patients in healthcare decision-making, the deficit model might be weakening. Examples of public participation in healthcare decision-making include: (1) crowd-sourcing public participation in systematic reviews, (2) public participation in health policy, (3) public collaboration in health research, and (4) health consumer groups acting as producers of health information. Given these challenges to the deficit model of science communication in health, caution is needed with the increasing role of technology and social media, and how these may affect the legitimacy of healthcare information flows away from the healthcare professional. © The Author(s) 2016.
Inamura, K; Umeda, T; Harauchi, H; Kondoh, H; Hasegawa, T; Kozuka, T; Takeda, H; Inoue, M
The effectiveness of a hospital information system (HIS) and a radiological information system (RIS) was evaluated to optimize preparation for the planned full clinical operation of a picture archiving and communication system (PACS), which is now linked experimentally to the HIS and the RIS. One thousand IC (integrated circuit) cards were used for time studies and flow studies in the hospital. Measurements were performed on image examination order entry, image examination, reporting, and image delivery times. Although only small time savings were realized in each time-fraction component after HIS and RIS operation, such as in the patient movement time, examination time, and film delivery time, the total turn-around time was shortened markedly, by more than 23 hours on average. It was verified that the HIS and the RIS were beneficial in the outpatient clinics of the orthopedic department. Our method of measurement employing IC cards before and after HIS and RIS operation can be applied in other hospitals.
Breinholt, Anders; Grum, Morten; Madsen, Henrik
…to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently, the GLUE methodology has been criticised for generating prediction … rain inputs and more accurate flow observations to reduce parameter and model simulation uncertainty. © Author(s) 2013.
However, there are cases where FTL signaling is definitely prohibited … Alice now performs measurements on the joint system of particles 1 and 2 to distinguish between the … To recover the original state, Bob then has to apply the appro…
Rodriguez Lorite, M.; Martin Lopez-Suevos, C.
Activities performed in most companies are based on the flow of information between their different departments and personnel. Most of this information is on paper (delivery notes, invoices, reports, etc.); the percentage of information transmitted electronically (electronic transactions, spreadsheets, word-processor files, etc.) is usually low. Controlling and speeding up this flow of work is the aim of workflow management systems. This article presents a prototype for applying workflow management systems to a specific area: the basic life cycle of a purchase order in a nuclear power plant, which requires the involvement of various computer applications: purchase order management, warehouse management, accounting, etc. Once implemented, workflow management systems allow the execution of the different tasks included in the managed life cycles to be optimised and provide parameters to control work cycles if necessary, allowing their temporary or definitive modification. (Author)
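A managed life cycle of the kind described here is essentially a state machine with an audit trail. The sketch below is a minimal illustration; the states and transitions are hypothetical examples, not the actual purchase-order cycle from the article:

```python
# Allowed transitions of a hypothetical purchase-order life cycle.
# States with no entry (e.g. "cancelled", "paid") are terminal.
ALLOWED = {
    "created":   {"approved", "cancelled"},
    "approved":  {"ordered"},
    "ordered":   {"delivered"},
    "delivered": {"invoiced"},
    "invoiced":  {"paid"},
}

class PurchaseOrder:
    def __init__(self, number):
        self.number = number
        self.state = "created"
        self.history = ["created"]  # audit trail for workflow control

    def advance(self, new_state):
        """Move the order forward, rejecting illegal transitions."""
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

po = PurchaseOrder("PO-0001")
for step in ("approved", "ordered", "delivered", "invoiced", "paid"):
    po.advance(step)
# po.history now records every step, giving the parameters needed
# to monitor and, if necessary, modify the managed cycle.
```

The transition table is the point of flexibility: temporarily or definitively modifying a work cycle amounts to editing this table rather than the participating applications.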
... FOREIGN LANGUAGE AND AREA STUDIES OR FOREIGN LANGUAGE AND INTERNATIONAL STUDIES What Conditions Must Be... 34 Education 3 2010-07-01 2010-07-01 false What are allowable costs and limitations on allowable costs? 656.30 Section 656.30 Education Regulations of the Offices of the Department of Education...
... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Grant of essential use allowances and critical use allowances. 82.8 Section 82.8 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Albemarle Bill Clark Pest Control, Inc. Burnside Services, Inc. Cardinal Professional Products Chemtura Corp...
... GENERAL SERVICES ADMINISTRATION [GSA Bulletin FTR 10-04] Federal Travel Regulation (FTR); Relocation Allowances-- Relocation Income Tax Allowance (RITA) Tables AGENCY: Office of Governmentwide Policy... (73 FR 35952) specifying that GSA would no longer publish the RITA tables found in 41 CFR Part 301-17...
... reflection of the actual tax impact on the employee. Therefore, this proposed rule offers the one-year RITA... to estimate the additional income tax liability that you incur as a result of relocation benefits and... Allowances (Taxes); Relocation Allowances (Taxes) AGENCY: Office of Governmentwide Policy (OGP), General...
Sinclair, Michael B.; Jones, Howland D. T.
A hyperspectral imaging flow cytometer can acquire high-resolution hyperspectral images of particles, such as biological cells, flowing through a microfluidic system. The hyperspectral imaging flow cytometer can provide detailed spatial maps of multiple emitting species, cell morphology information, and state of health. An optimized system can image about 20 cells per second. The hyperspectral imaging flow cytometer enables many thousands of cells to be characterized in a single session.
Colvert, Brendan; Chen, Kevin; Kanso, Eva
Empirical evidence suggests that many aquatic organisms sense differential hydrodynamic signals. This sensory information is decoded to extract relevant flow properties. This task is challenging because it relies on local and partial measurements, whereas classical flow characterization methods depend on an external observer to reconstruct global flow fields. Here, we introduce a mathematical model in which a bioinspired sensory array measuring differences in local flow velocities characterizes the flow type and intensity. We linearize the flow field around the sensory array and express the velocity gradient tensor in terms of frame-independent parameters. We develop decoding algorithms that allow the sensory system to characterize the local flow and discuss the conditions under which this is possible. We apply this framework to the canonical problem of a circular cylinder in uniform flow, finding excellent agreement between sensed and actual properties. Our results imply that combining suitable velocity sensors with physics-based methods for decoding sensory measurements leads to a powerful approach for understanding and developing underwater sensory systems.
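The frame-independent description of a linearized local flow rests on splitting the velocity gradient tensor into a symmetric strain-rate part and an antisymmetric rotation part. The sketch below illustrates that standard decomposition and a crude strain-versus-rotation classification; it is not the authors' decoding algorithm:

```python
def decompose(grad):
    """Split a 2x2 velocity gradient tensor [[du/dx, du/dy],
    [dv/dx, dv/dy]] into its symmetric strain-rate part and the
    scalar rotation rate of its antisymmetric part."""
    (a, b), (c, d) = grad
    strain = [[a, (b + c) / 2], [(b + c) / 2, d]]  # symmetric part
    omega = (c - b) / 2                            # rotation rate
    return strain, omega

def flow_type(grad, tol=1e-9):
    """Classify the local flow by comparing the Frobenius norms of
    the strain and rotation parts (illustrative criterion only)."""
    strain, omega = decompose(grad)
    s = (strain[0][0] ** 2 + 2 * strain[0][1] ** 2 + strain[1][1] ** 2) ** 0.5
    w = (2 * omega * omega) ** 0.5
    if w > s + tol:
        return "rotation-dominated"
    if s > w + tol:
        return "strain-dominated"
    return "balanced"

# Solid-body rotation (u = -y, v = x) has no strain at all, while
# simple shear (u = y, v = 0) has equal strain and rotation parts.
rotation = [[0.0, -1.0], [1.0, 0.0]]
shear = [[0.0, 1.0], [0.0, 0.0]]
```

Because both norms are invariant under rotations of the sensor frame, a classification of this kind does not depend on how the sensory array happens to be oriented.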
In this paper we develop an algorithm for visual obstacle avoidance for an autonomous mobile robot. The input of the algorithm is an image sequence grabbed by a camera embedded on the B21r robot in motion. Optical flow information is then extracted from the image sequence for use in the navigation algorithm. The optical flow provides very important information about the robot's environment, such as the disposition of obstacles, the robot heading, the time to collision, and depth. The strategy consists in balancing the amount of left- and right-side flow to avoid obstacles; this technique allows the robot to navigate without colliding with obstacles. The robustness of the algorithm is shown through some examples.
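The balance strategy reduces to a simple control law: steer away from the image half with the larger summed optical-flow magnitude. The sketch below is a hedged illustration; the sign convention, gain, and input values are assumptions, not the paper's implementation:

```python
def balance_steering(left_flow, right_flow, gain=1.0):
    """Balance-strategy steering command from summed optical-flow
    magnitudes over the left and right image halves. A positive
    command here means 'turn left'; nearby obstacles generate more
    flow, so the robot turns away from the side with more flow.
    Sign convention and gain are illustrative assumptions."""
    total = left_flow + right_flow
    if total == 0:
        return 0.0  # no texture, no flow: keep current heading
    return gain * (right_flow - left_flow) / total

# An obstacle on the right produces more right-side flow, giving a
# positive command: steer left, away from the obstacle.
cmd = balance_steering(2.0, 6.0)  # 0.5
```

Normalizing by the total flow makes the command depend on the flow asymmetry rather than on the robot's speed, which scales both sides equally.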
Soap bubbles were used for visualizing flows. The tests carried out allowed some characteristics of flows around models in wind tunnels to be determined more precisely at mean velocities V₀ 5. The velocity of a bubble is measured by chronophotography, and the bulk envelope of the trajectories is also recorded.
Rosa, Henrique; Suslick, Saul B.; Sousa, Sergio H.G. de [Universidade Estadual de Campinas, SP (Brazil). Inst. of Geosciences; Castro, Jonas Q. [ANP - Brazilian National Petroleum Agency, Rio de Janeiro, RJ (Brazil)
This paper is focused on the elaboration of a standardization model for the existing flow of information between the Petroleum National Agency (ANP) and the concessionaire companies in the event of the discovery of any potentially commercial hydrocarbon resources inside their concession areas. The method proposed by Rosa (2003) included the analysis of a small sample of Oil and Gas Discovery Assessment Plans (PADs), elaborated by companies that operate in exploratory blocks in Brazil, under the regulatory context introduced by the Petroleum Law (Law 9478, August, 6th, 1997). The analysis of these documents made it possible to identify and target the problems originated from the lack of standardization. The results obtained facilitated the development of a model that helps the creation process of Oil and Gas Discovery Assessment Plans. It turns out that the standardization procedures suggested provide considerable advantages while speeding up several technical and regulatory steps. A software called 'ePADs' was developed to consolidate the automation of the several steps in the model for the standardization of the Oil and Gas Discovery Assessment Plans. A preliminary version has been tested with several different types of discoveries indicating a good performance by complying with all regulatory aspects and operational requirements. (author)
Underwood, S.R.; Firmin, D.N.; Klipstein, R.H.; Rees, R.S.O.; Longmore, D.B.
Velocity mapping by means of cine MR imaging allows accurate measurement of velocity and flow within the cardiovascular system. A cine display and color coding simplify interpretation. The authors have used the technique in a variety of patients to illustrate its potential. Velocity mapping in coronary artery bypass grafts in six patients provided a measure of graft function. Coronary artery velocities were measured in three subjects. Flow was measured through defects in the atrial septum, the ventricular septum, and a Gerbode defect. Velocity was reduced distal to a coarctation of the aorta and was increased at the level of a partial venous occlusion by thrombosis. In a patient with isomerism, velocity mapping in the central vessels aided interpretation. Cine MR velocity mapping combined with conventional imaging yields important functional information on the cardiovascular system.
Jones, D Gareth; Nie, Jing-Bao
Confucianism has been widely perceived as a major moral and cultural obstacle to the donation of bodies for anatomical purposes. The rationale for this is the Confucian stress on xiao (filial piety), whereby individuals' bodies are to be intact at death. In the view of many, the result is a prohibition on the donation of bodies to anatomy departments for the purpose of dissection. The role of dissection throughout the development of anatomy within a Confucian context is traced, and in contemporary China the establishment of donation programs and the appearance of memorial monuments is noted. In reassessing Confucian attitudes, the stress laid on a particular interpretation of filial piety is questioned, and an attempt is made to balance this with the Confucian emphasis on a moral duty to those outside one's immediate family. The authors argue that the fundamental Confucian norm ren (humaneness or benevolence) allows for body donation as people have a moral duty to help others. Moreover, the other central Confucian value, li (rites), offers important insights on how body donation should be performed as a communal activity, particularly the necessity of developing ethically and culturally appropriate rituals for body donation. In seeking to learn from this from a Western perspective, it is contended that in all societies the voluntary donation of bodies is a deeply human activity that is to reflect the characteristics of the community within which it takes place. This is in large part because it has educational and personal repercussions for students. Anat Sci Educ. © 2018 American Association of Anatomists.
Gephart, Melanie Hayden; Derstine, Pamela; Oyesiku, Nelson M; Grady, M Sean; Burchiel, Kim; Batjer, H Hunt; Popp, A John; Barbaro, Nicholas M
Subspecialization of physicians and regional centers concentrate the volume of certain rare cases into fewer hospitals. Consequently, the primary institution of a neurological surgery training program may not have sufficient case volume to meet the current Residency Review Committee case minimum requirements in some areas. To ensure the competency of graduating residents through a comprehensive neurosurgical education, programs may need for residents to travel to outside institutions for exposure to cases that are either less common or more regionally focused. We sought to evaluate off-site rotations to better understand the changing demographics and needs of resident education. This would also allow prospective monitoring of modifications to the neurosurgery training landscape. We completed a survey of neurosurgery program directors and query of data from the Accreditation Council of Graduate Medical Education to characterize the current use of away rotations in neurosurgical education of residents. We found that 20% of programs have mandatory away rotations, most commonly for exposure to pediatric, functional, peripheral nerve, or trauma cases. Most of these rotations are done during postgraduate year 3 to 6, lasting 1 to 15 months. Twenty-six programs have 2 to 3 participating sites and 41 have 4 to 6 sites distinct from the host program. Programs frequently offset potential financial harm to residents rotating at a distant site by support of housing and transportation costs. As medical systems experience fluctuating treatment paradigms and demographics, over time, more residency programs may adapt to meet the Accreditation Council of Graduate Medical Education case minimum requirements through the implementation of away rotations.
Runge, M.C.; Sauer, J.R.; Avery, M.L.; Blackwell, B.F.; Koneff, M.D.
Legal removal of migratory birds from the wild occurs for several reasons, including subsistence, sport harvest, damage control, and the pet trade. We argue that harvest theory provides the basis for assessing the impact of authorized take, advance a simplified rendering of harvest theory known as potential biological removal as a useful starting point for assessing take, and demonstrate this approach with a case study of depredation control of black vultures (Coragyps atratus) in Virginia, USA. Based on data from the North American Breeding Bird Survey and other sources, we estimated that the black vulture population in Virginia was 91,190 (95% credible interval = 44,520-212,100) in 2006. Using a simple population model and available estimates of life-history parameters, we estimated the intrinsic rate of growth (rmax) to be in the range 7-14%, with 10.6% a plausible point estimate. For a take program to seek an equilibrium population size on the conservative side of the yield curve, the rate of take needs to be less than that which achieves a maximum sustained yield (0.5 × rmax). Based on the point estimate for rmax and using the lower 60% credible interval for population size to account for uncertainty, these conditions would be met if the take of black vultures in Virginia in 2006 was < 3,533 birds. Based on regular monitoring data, allowable harvest should be adjusted annually to reflect changes in population size. To initiate discussion about how this assessment framework could be related to the laws and regulations that govern authorization of such take, we suggest that the Migratory Bird Treaty Act requires only that take of native migratory birds be sustainable in the long-term, that is, sustained harvest rate should be < rmax. Further, the ratio of desired harvest rate to 0.5 × rmax may be a useful metric for ascertaining the applicability of specific requirements of the National Environmental Protection Act.
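The potential-biological-removal rule used in this abstract amounts to allowable take < 0.5 × rmax × Nmin, where Nmin is a conservative (lower credible bound) population estimate. The sketch below uses the paper's point estimate rmax = 0.106; the population input and the optional recovery factor are hypothetical illustrations, not the Virginia numbers:

```python
def potential_biological_removal(r_max, n_min, recovery_factor=1.0):
    """Simplified potential biological removal: a take level that
    keeps the population on the conservative side of the yield
    curve. r_max: maximum intrinsic growth rate; n_min: conservative
    lower-bound population estimate; recovery_factor: optional extra
    safety margin in (0, 1] (an assumption here, borrowed from
    marine-mammal PBR practice, not from this paper)."""
    return 0.5 * r_max * n_min * recovery_factor

# Point estimate from the abstract: r_max = 0.106. The population
# below is a made-up illustration, not the Virginia estimate.
allowable = potential_biological_removal(0.106, 50_000)  # ~2650 birds/year
```

Because the result scales linearly with both inputs, annual re-estimation of population size, as the abstract recommends, translates directly into an updated allowable take.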
Kyoden, Tomoaki; Akiguchi, Shunsuke; Tajiri, Tomoki; Andoh, Tsugunobu; Hachiga, Tadashi
The development of a system for in vivo visualization of occluded distal blood vessels for diabetic patients is the main target of our research. We herein describe two-beam multipoint laser Doppler velocimetry (MLDV), which measures the instantaneous multipoint flow velocity and can be used to observe the blood flow velocity in peripheral blood vessels. By including a motorized stage to shift the measurement points horizontally and in the depth direction while measuring the velocity, the path of the blood vessel in the skin could be observed using blood flow velocity in three-dimensional space. The relationship of the signal power density between the blood vessel and the surrounding tissues was shown and helped us identify the position of the blood vessel. Two-beam MLDV can be used to simultaneously determine the absolute blood flow velocity distribution and identify the blood vessel position in skin.
Sensory-motor learning is commonly considered as a mapping process, whereby sensory information is transformed into the motor commands that drive actions. However, this directional mapping, from inputs to outputs, is part of a loop; sensory stimuli cause actions and vice versa. Here, we explore whether actions affect the understanding of the sensory input that they cause. Using a visuo-motor task in humans, we demonstrate two types of learning-related behavioral effects. Stimulus-dependent effects reflect stimulus-response learning, while action-dependent effects reflect a distinct learning component, allowing the brain to predict the forthcoming sensory outcome of actions. Together, the stimulus-dependent and the action-dependent learning components allow the brain to construct a complete internal representation of the sensory-motor loop.
Nietert, R.E.; Abdelk-Khalik, S.I.
An experimental investigation has been conducted to determine the heat transfer characteristics of gravity-flowing particle beds using a special heat transfer loop. Glass microspheres were allowed to flow by gravity at controlled rates through an electrically heated stainless steel tubular test section. Values of the local and average convective heat transfer coefficient as a function of the average bed velocity, particle size and heat flux were determined. Such information is necessary for the design of gravity-flowing particle-bed type fusion reactor-blankets and associated tritium recovery systems. (orig.)
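The local convective coefficient in an experiment of this kind is typically obtained from the imposed wall heat flux and the wall-to-bed temperature difference, h = q'' / (T_wall − T_bed). The sketch below illustrates that relation; the numerical values are illustrative, not data from the experiment:

```python
def local_heat_transfer_coefficient(q_flux, t_wall, t_bed):
    """Local convective coefficient h = q'' / (T_wall - T_bed).
    q_flux: imposed wall heat flux in W/m^2 (the electrically
    heated test section fixes this); t_wall, t_bed: local wall and
    bulk bed temperatures in consistent units. Values used below
    are illustrative assumptions."""
    dt = t_wall - t_bed
    if dt <= 0:
        raise ValueError("wall must be hotter than the bed")
    return q_flux / dt

# 5 kW/m^2 through a wall 25 degrees hotter than the flowing bed:
h = local_heat_transfer_coefficient(5000.0, 85.0, 60.0)  # 200.0 W/(m^2 K)
```

Repeating this at several axial positions, and averaging, is what yields the local and average coefficients as functions of bed velocity, particle size, and heat flux.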
Abrahams, J R; Hiller, N
Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther
Chang, Paul K
Interdisciplinary and Advanced Topics in Science and Engineering, Volume 3: Separation of Flow presents the problem of the separation of fluid flow. This book provides information covering the fields of basic physical processes, analyses, and experiments concerning flow separation.Organized into 12 chapters, this volume begins with an overview of the flow separation on the body surface as discusses in various classical examples. This text then examines the analytical and experimental results of the laminar boundary layer of steady, two-dimensional flows in the subsonic speed range. Other chapt
Mccormick, Patrick S [Los Alamos National Laboratory; Brownlee, Carson S [Los Alamos National Laboratory; Pegoraro, Vincent [UNIV OF UTAH; Shankar, Siddharth [UNIV OF UTAH; Hansen, Charles D [UNIV OF UTAH
Understanding fluid flow is a difficult problem and of increasing importance as computational fluid dynamics produces an abundance of simulation data. Experimental flow analysis has employed techniques such as shadowgraph and schlieren imaging for centuries, which allow empirical observation of inhomogeneous flows. Shadowgraphs provide an intuitive way of looking at small changes in flow dynamics through caustic effects, while schlieren cutoffs introduce an intensity gradation for observing large-scale directional changes in the flow. The combination of these shading effects provides an informative global analysis of overall fluid flow. Computational solutions for these methods have proven too complex until recently due to the fundamental physical interaction of light refracting through the flow field. In this paper, we introduce a novel method to simulate the refraction of light to generate synthetic shadowgraphs and schlieren images of time-varying scalar fields derived from computational fluid dynamics (CFD) data. Our method computes physically accurate schlieren and shadowgraph images at interactive rates by utilizing a combination of GPGPU programming, acceleration methods, and data-dependent probabilistic schlieren cutoffs. Results comparing this method to previous schlieren approximations are presented.
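The optical quantities behind the two techniques can be sketched without any of the GPU machinery: a knife-edge schlieren cutoff responds to the first derivative of the refractive index along the cutoff direction, while a shadowgraph responds to the second derivative. The 1-D finite-difference sketch below is only an illustration of those sensitivities, not the paper's light-refraction renderer:

```python
def schlieren_signal(n, dx=1.0):
    """First derivative of a 1-D refractive-index profile: the
    quantity a knife-edge schlieren cutoff visualizes
    (central differences on interior points)."""
    return [(n[i + 1] - n[i - 1]) / (2 * dx) for i in range(1, len(n) - 1)]

def shadowgraph_signal(n, dx=1.0):
    """Second derivative of the refractive-index profile: the
    quantity a shadowgraph's caustic intensity pattern responds to."""
    return [(n[i + 1] - 2 * n[i] + n[i - 1]) / dx ** 2
            for i in range(1, len(n) - 1)]

# A linear index ramp deflects all rays uniformly: a constant
# schlieren signal but a zero shadowgraph signal (no caustics).
ramp = [1.000, 1.001, 1.002, 1.003, 1.004]
```

This difference in sensitivity is why the two methods are complementary, as the abstract notes: schlieren highlights large-scale directional gradients, shadowgraphs highlight where gradients change.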
... other FTR Bulletins can be found at http://www.gsa.gov/ftrbulletin . The RIT allowance tables are located at http://www.gsa.gov/relocationpolicy . DATES: This notice is effective March 24, 2011. FOR... CFR part 301-17 Appendices A through D. The tables will be published at http://www.gsa.gov...
...: The GSA published FTR Amendment 2008-04, in the Federal Register on June 25, 2008 (73 FR 35952), specifying that GSA would no longer publish the RIT Allowance tables in Title 41 of the Code of Federal..., 2013. Carolyn Austin-Diggs, Principal Deputy Administrator, Office of Asset and Transportation...
Ogburn, S. E.; Calder, E. S.
Pyroclastic flows are among the most destructive volcanic phenomena. Hazard mitigation depends upon accurate forecasting of possible flow paths, often using computational models. Two main metrics have been proposed to describe the mobility of pyroclastic flows. The Heim coefficient, height dropped over run-out (H/L), exhibits an inverse relationship with flow volume. This coefficient corresponds to the coefficient of friction and informs computational models that use Coulomb friction laws. Another mobility measure states that, with constant shear stress, planimetric area is proportional to the flow volume raised to the 2/3 power (A∝V^(2/3)). This relationship is incorporated in models using constant shear stress instead of constant friction, and is used directly by some empirical models. Pyroclastic flows from Soufriere Hills Volcano, Montserrat; Unzen, Japan; Colima, Mexico; and Augustine, Alaska are well described by these metrics. However, flows in specific valleys exhibit differences in mobility. This study investigates the effect of topography on pyroclastic flow mobility, as measured by the above-mentioned mobility metrics. Valley width, depth, and cross-sectional area all influence flow mobility. Investigating the appropriateness of these mobility measures, as well as the computational models they inform, indicates the circumstances under which each model performs optimally. Knowing which conditions call for which models allows for better model selection or model weighting and, therefore, more realistic hazard predictions.
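The two mobility metrics can be sketched as follows (a toy illustration; the proportionality constant and input values are assumptions for demonstration, not taken from the study):

```python
# Toy illustration of the two pyroclastic-flow mobility metrics (constants and
# inputs are assumptions for demonstration, not values from the study).
def heim_coefficient(height_dropped_m, runout_m):
    """H/L: an effective friction coefficient; observed to decrease with volume."""
    return height_dropped_m / runout_m

def planimetric_area(volume_m3, c=20.0):
    """Constant-shear-stress scaling A = c * V**(2/3); c is flow-dependent."""
    return c * volume_m3 ** (2.0 / 3.0)

print(heim_coefficient(1000.0, 5000.0))   # 0.2
print(planimetric_area(1e6))              # ~2e5 m^2 with the assumed c
```

The first metric feeds Coulomb-friction flow models, the second constant-shear-stress and empirical area-inundation models, which is why valley-by-valley differences in mobility matter for model selection.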
O'Leary, T J
Flow cytometry (FCM) is a useful adjunct to cytologic examination, because the quantitative biochemical information it provides complements the morphologic information gained during visual examination. It aids in the interpretation of bladder washings, and is particularly useful for the assessment of lymphoid lesions, whether they originate from fine-needle aspiration, cerebrospinal fluid, or effusions. Optimal use of FCM frequently requires assessment of more than one parameter; simultaneous use of cell differentiation markers and nuclear DNA quantitation is often significantly more useful than either alone. Despite the utility of FCM, however, the potential for future development appears to be limited. Improvements in image cytometry allow reasonable assessment of ploidy and S-fraction to be made from specimens prepared on glass slides. Multiparameter measurements may also be accomplished with imaging techniques, which allow the further advantage of visual identification of cells with equivocal morphologic changes. The development of artificial intelligence methods for use with imaging technology has also significantly exceeded that of FCM. Finally, image cytometry is often more useful for samples with few cells. Other challenges are posed by immunocytochemical methods which compete with flow cytometry as tools for assessment of proliferation. Given the relatively high cost of FCM instrumentation, survival of FCM as an ancillary technique in cytopathology will require further technical refinements to offset the advantages currently associated with image cytometry and immunocytochemistry.
Claes, N.; Paige, G. B.; Parsekian, A.
Flood irrigation, which constitutes a large part of agricultural water use, accounts for a significant amount of the water that is diverted from western streams. Return flow, the portion of the water applied to irrigated areas that returns to the stream, is important for maintaining base flows in streams and the ecological function of riparian zones and wetlands hydrologically linked with streams. Predicting the timing and volumes of return flow during and after flood irrigation poses a challenge due to the heterogeneity of pedogenic and soil physical factors that influence vadose zone processes. In this study, we quantify volumes of return flow and potential pathways in the subsurface through a vadose zone flow model that is informed by both hydrological and geophysical observations in a Bayesian setting. We couple a two-dimensional vadose zone flow model, through a Bayesian Markov Chain Monte Carlo approach, with time-lapse ERT and borehole NMR datasets collected during and after flood irrigation experiments, and with soil physical lab analysis. The combination of synthetic models and field observations leads to flow path identification and allows quantification of the volumes, timing and associated uncertainties of subsurface return flow that stems from flood irrigation. Quantifying the impact of soil heterogeneity enables us to translate these results to other sites and to predict return flow under different soil physical settings. This is key for managing irrigation water resources, where the outcomes of different scenarios have to be predicted and evaluated.
Vidstrand, Patrik; Naeslund, Jens-Ove; Hartikainen, Juha; Svensson, Urban
In the earlier modelling for SFR-SAFE it was concluded that the groundwater flow would increase with time along with the shoreline displacement. Even though the numerical results differ, the same conclusion is drawn from this study. General conclusions from the present study are that: The upper boundary conditions have a significant impact on the groundwater flow in the geosphere. Whether the surface acts as a recharge or a discharge area affects the results. In general, a discharge area will experience an increase in groundwater flow under changed conditions. The presence of caging fracture zones affects the results, and, for the tested unfrozen SFR situation, the resulting effect is an increase in groundwater flow. Specific conclusions regarding the relative change of groundwater flow due to different surface conditions are that: The permafrost scenarios, along with the development from sporadic to continuous permafrost, yield increased groundwater flows in unfrozen parts of the domain. The increase is one order of magnitude or less. In the permafrost, the flow is negligible. The ice sheet scenarios yield situations with significantly increased groundwater flow. The results indicate an increase by two to three orders of magnitude. These increased values, however, apply only for short intervals, possibly only a couple of years. In the selected climate Base variant, repeating the conditions of the last glacial cycle, permafrost conditions occur after 8,000 years. In the climate variant affected by increased greenhouse warming, permafrost conditions do not occur until after more than 50,000 years. In the chosen climate variants, ice sheets reach the Forsmark area and cause significantly increased groundwater flow after ∼60,000 years or more.
杨迎辉; 李建华; 丁未; 南明莉
An efficient and reasonable information flowing pattern is the precondition and an important guarantee for the effective implementation of an air offensive campaign (AOC). To construct an information flowing pattern for an AOC, basic definitions of the information flowing pattern are given based on complex network theory, and the information flowing relationships between intelligence support, operational command and weapon control in the three stages of an AOC are analyzed. By introducing the concept and operation rules of the interval number, and taking information flow, information quality and transmission time as optimization targets, a multi-objective programming model of the information flowing pattern for an AOC based on interval numbers is constructed. Combined with the methods of linear weighting and interval-number satisfaction, the model is translated into a two-level nested single-objective optimization problem of a certain type, and a differential evolution algorithm is chosen to solve it. Finally, the rationality and feasibility of the model and method are verified through a specific example.
Houaidia, Chiraz; Idoudi, Hanen; Van den Bossche, Adrien; Saidane, Leila; Val, Thierry
In this paper, we address the problem of QoS support in a heterogeneous multi-rate wireless mesh network. We propose a new routing metric that provides information about link quality, based on PHY and MAC characteristics, including link availability, loss rate and available bandwidth. This metric makes it possible to capture inter-flow interference and avoid bottleneck formation by balancing the traffic load across links. Based on the conflict graph model and the calculation of maximal cliques, ...
Robinson, Julie A.; Tate-Brown, Judy M.
Using a commercial software CD and minimal up-mass, SNFM monitors the Payload local area network (LAN) to analyze and troubleshoot LAN data traffic. Validating LAN traffic models may allow for faster and more reliable computer networks to sustain systems and science on future space missions. Research Summary: This experiment studies the function of the computer network onboard the ISS. On-orbit packet statistics are captured and used to validate ground-based medium-rate data link models and enhance the way the local area network (LAN) is monitored. This information will allow monitoring and improvement of the data transfer capabilities of on-orbit computer networks. The Serial Network Flow Monitor (SNFM) experiment attempts to characterize the network equivalent of traffic jams on board ISS. The SNFM team is able to specifically target historical problem areas, including the SAMS (Space Acceleration Measurement System) communication issues, data transmissions from the ISS to the ground teams, and multiple users on the network at the same time. By looking at how various users interact with each other on the network, conflicts can be identified and work can begin on solutions. SNFM comprises a commercial off-the-shelf software package that monitors packet traffic through the payload Ethernet LANs (local area networks) on board ISS.
A horizontal microchannel of rectangular cross-section, with a height of 50 micrometres and a width of 40 mm, has been used to study two-phase flow. The classical patterns of two-phase flow in the channel (bubble, stratified, churn, jet, and annular) have been detected. The experimental information allows us to define the characteristics of the regimes and to determine precisely the boundaries between the patterns of the two-phase flows.
An experimental setup for studying the deformation of drops in an air flow is described. The setup includes a module for producing the drops, an air flow system and a measuring system. The drop-formation module is a vertically arranged dropper with a capillary, allowing fixed drops to be formed. The air supply system comprises an air pump connected by a conduit, through a regulating valve, to a cylindrical pipe installed coaxially with the dropper. The measuring system includes a video camera positioned to visualize the drop and a Pitot gauge, located in the outlet section of the pipe, for measuring the air flow rate. This experimental setup provides reliable and informative results on the deformation of drops in an air flow.
Carlos, W.C.; Brehm, W.F.; Larrick, A.P.; Divine, J.R.
The Multi-Function Waste Tank Facility carbon steel tanks will contain mixer pumps that circulate the waste. On the basis of flow characteristics of the system and data from the literature, an erosion allowance of 0.075 mm/y (3 mil/year) was recommended for the tank bottoms, in addition to the 0.025 mm/y (1 mil/year) general corrosion allowance
Bowles, J. A.; Gee, J. S.; Jackson, M. J.
Ash flow tuffs (ignimbrites) are common worldwide, frequently contain fine-grained magnetite hosted in the glassy matrix, and often have high-quality 40Ar/39Ar ages. This makes them attractive candidates for paleointensity studies, potentially allowing for a substantial increase in the number of well-dated paleointensity estimates. However, the timing and nature of remanence acquisition in ignimbrites are not sufficiently understood to allow confident interpretation of paleointensity data from ash flows. The remanence acquisition may be a complex function of mineralogy and thermal history. Emplacement conditions and post-emplacement processes vary considerably between and within tuffs and may potentially affect the ability to recover ancient field intensity information. To better understand the relevant magnetic recording assemblage(s) and remanence acquisition processes we have collected samples from two well-documented historical ignimbrites, the 1980 ash flows at Mt. St. Helens (MSH), Washington, and the 1912 flows from Mt. Katmai in the Valley of Ten Thousand Smokes (VTTS), Alaska. Data from these relatively small, poorly- to non-welded historical flows are compared to the more extensive and more densely welded 0.76 Ma Bishop Tuff. This sample set enables us to better understand the geologic processes that destroy or preserve paleointensity information so that samples from ancient tuffs may be selected with care. Thellier-type paleointensity experiments carried out on pumice blocks sampled from the MSH flows resulted in a paleointensity of 55.8 μT +/- 0.8 (1 standard error). This compares favorably with the actual value of 56.0 μT. Excluded specimens of poor technical quality were dominantly from sites that were either emplaced at low temperature (600°C) temperatures does not corrupt the paleointensity signal, and additional data will be presented which explores this more fully.
... 49 Transportation 4 2010-10-01 2010-10-01 false Maximum allowable stress. 230.24 Section 230.24... Allowable Stress § 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...
S.M.T. Fatemi Ghomi; N. Azad
In traditional supply chain inventory management, orders are the only information firms exchange, but information technology now allows firms to share demand and inventory data quickly and inexpensively. To have an integrated plan, a manufacturer not only needs to know demand information from its customers but also supply information from its suppliers. In this paper, information flow is incorporated in a three-echelon supply chain model. Also to decrease the risk o...
Mazalu, N.; Negut, Gh.
The purpose of this evaluation was to obtain accurate information on each channel flow, enabling a precise assessment of the reactor thermal power level and, for safety reasons, the identification of any boiling channel. To assess the channel flow parameters, computer simulations were done with the NUCIRC code and the results were checked by measurements. The complete channel flow measurements were made in the zero-power cold condition. In hot conditions, flow measurements were made using the Shut Down System 1 (SDS 1) flow devices from 0.1% F.P. up to 100% F.P. The NUCIRC predictions for CANDU channel flows, the Ultrasonic Flow Meter measurements at zero-power cold conditions and the SDS 1 channel flow measurements at different reactor power levels showed acceptable agreement. The 100% F.P. average errors for channel flow of R show that a suitable NUCIRC flow assessment can be made, so a fair prediction of the reactor power distribution is possible. NUCIRC can accurately predict the onset of boiling, can help warn of possible power instabilities at high powers, and can detect flow blockages. The thermal hydraulic analyst thus has in NUCIRC a suitable tool for accurate prediction of the thermal hydraulic parameters at different steady-state power levels, which subsequently leads to optimal CANDU reactor operation. (authors)
Douglas G. Moore
The study of collective behavior has traditionally relied on a variety of methodological tools, ranging from theoretical methods such as population or game-theoretic models to empirical ones like Monte Carlo or multi-agent simulations. An approach that is increasingly being explored is the use of information theory as a methodological framework to study the flow of information and the statistical properties of collectives of interacting agents. While a few general-purpose toolkits exist, most existing software for information-theoretic analysis of collective systems is limited in scope. We introduce Inform, an open-source framework for efficient information-theoretic analysis that exploits the computational power of a C library while simplifying its use through a variety of wrappers for common higher-level scripting languages. We focus on two such wrappers here: PyInform (Python) and rinform (R). Inform and its wrappers are cross-platform and general-purpose. They include classical information-theoretic measures, measures of information dynamics and information-based methods to study the statistical behavior of collective systems, and expose a lower-level API that allows users to construct measures of their own. We describe the architecture of the Inform framework, study its computational efficiency and use it to analyze three case studies of collective behavior: biochemical information storage in regenerating planaria, nest-site selection in the ant Temnothorax rugatulus, and collective decision making in multi-agent simulations.
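As a minimal illustration of the kind of classical measure such a framework computes (plain Python, not the PyInform API; the function names here are my own):

```python
from collections import Counter
from math import log2

def entropy(series):
    """Shannon entropy (bits) of a discrete sequence."""
    counts = Counter(series)
    n = len(series)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete sequences."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

xs = [0, 1, 0, 1, 0, 1, 0, 1]
print(entropy(xs))                 # 1.0 bit
print(mutual_information(xs, xs))  # 1.0 bit: a series fully informs itself
```

Measures of information dynamics (transfer entropy, active information storage) extend the same counting idea to time-lagged states, which is where a compiled C core such as Inform's pays off on long time series.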
Lad, N; Adebayo, D; Aroussi, A
Particle image velocimetry (PIV) is a successful flow mapping technique which can optically quantify large portions of a flow regime. This enables the method to be completely non-intrusive. The ability to be non-intrusive to any flow has allowed PIV to be used in a large range of industrial sectors for many applications. However, a fundamental disadvantage of the conventional PIV technique is that it cannot easily be used with flows which have no or limited optical access. Flows which have limited optical access for PIV measurement have been addressed using endoscopic PIV techniques. This system uses two separate probes which relay a light sheet and imaging optics to a planar position within the desired flow regime. This system is effective in medical and engineering applications. The present study has been involved in the development of a new endoscopic PIV system which integrates the illumination and imaging optics into one rigid probe. This paper focuses on the validation of the images taken from the novel single stem endoscopic PIV system. The probe is used within atomized spray flow and is compared with conventional PIV measurement and also pitot-static data. The endoscopic PIV system provides images which create localized velocity maps that are comparable with the global measurement of the conventional PIV system. The velocity information for both systems clearly show similar results for the spray characterization and are also validated using the pitot-static data
Giacalone, Giuliano; Yan, Li; Ollitrault, Jean-Yves
Higher Fourier harmonics of anisotropic flow (v4 and beyond) get large contributions induced by elliptic and triangular flow through nonlinear response. We present a general framework of nonlinear hydrodynamic response which encompasses the existing one and allows us to take into account the mutual correlation between the nonlinear couplings affecting Fourier harmonics of any order. Using Large Hadron Collider data on Pb+Pb collisions at √sNN = 2.76 TeV, we perform an application of our formalism to hexagonal flow, v6, a coefficient affected by several nonlinear contributions which are of the same order of magnitude. We obtain the first experimental measure of the coefficient χ624, which couples v6 to v2 and v4. This is achieved by putting together the information from several analyses: event-plane correlations, symmetric cumulants, and higher-order moments recently analyzed by the ALICE Collaboration. The value of χ624 extracted from data is in fair agreement with hydrodynamic calculations, although with large error bars, which would be dramatically reduced by a dedicated analysis. We argue that within our formalism the nonlinear structure of a given higher-order harmonic can be determined more accurately than the harmonic itself, and we emphasize potential applications to future measurements of v7 and v8.
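Schematically, such a nonlinear-response decomposition of hexagonal flow can be written as follows; here the V_n are complex flow vectors, V_6^L is the linear part, and the coupling symbols other than χ624 are illustrative notation rather than taken verbatim from the paper:

```latex
V_6 \;\simeq\; V_6^{L} \;+\; \chi_{622}\,(V_2)^{3} \;+\; \chi_{633}\,(V_3)^{2} \;+\; \chi_{624}\,V_2\,V_4^{L}.
```

Extracting χ624 from data then amounts to isolating the V_2 V_4^L term from the other contributions of comparable size, which is why the analysis combines event-plane correlations, symmetric cumulants, and higher-order moments.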
Carpenter, Stephen R; Brock, William A; Folke, Carl; van Nes, Egbert H; Scheffer, Marten
Variable flows of food, water, or other ecosystem services complicate planning. Management strategies that decrease variability and increase predictability may therefore be preferred. However, actions to decrease variance over short timescales (2-4 y), when applied continuously, may lead to long-term ecosystem changes with adverse consequences. We investigated the effects of managing short-term variance in three well-understood models of ecosystem services: lake eutrophication, harvest of a wild population, and yield of domestic herbivores on a rangeland. In all cases, actions to decrease variance can increase the risk of crossing critical ecosystem thresholds, resulting in less desirable ecosystem states. Managing to decrease short-term variance creates ecosystem fragility by changing the boundaries of safe operating spaces, suppressing information needed for adaptive management, cancelling signals of declining resilience, and removing pressures that may build tolerance of stress. Thus, the management of variance interacts strongly and inseparably with the management of resilience. By allowing for variation, learning, and flexibility while observing change, managers can detect opportunities and problems as they develop while sustaining the capacity to deal with them.
Nadi Helena Presser
This article reflects on the ways information is appropriated in organizations. The notion of Information Myopia is characterized by a lack of knowledge about the informational capabilities available in organizations, revealing a narrow view of the information environment. The analysis focuses on the process of renewing the software licence contracts of a large multinational group, in order to manage its organizational information technology assets. The information collected, explained and justified allowed the elaboration of an action proposal, which enabled the creation of new organizational knowledge. In its theoretical dimension, the value of information was materialized through its use, in a collective process of organizational learning.
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Submission of Hg allowance transfers... Times for Coal-Fired Electric Steam Generating Units Hg Allowance Transfers § 60.4160 Submission of Hg allowance transfers. An Hg authorized account representative seeking recordation of a Hg allowance transfer...
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Hg allowance allocations. 60.4142... Coal-Fired Electric Steam Generating Units Hg Allowance Allocations § 60.4142 Hg allowance allocations. (a)(1) The baseline heat input (in MMBtu) used with respect to Hg allowance allocations under...
... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Special allowance reserve. 73.27 Section 73.27 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) SULFUR DIOXIDE ALLOWANCE SYSTEM Allowance Allocations § 73.27 Special allowance reserve. (a...
... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Allowance tracking system accounts. 73.30 Section 73.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) SULFUR DIOXIDE ALLOWANCE SYSTEM Allowance Tracking System § 73.30 Allowance tracking system...
Staebler, T.; Meyer, L.; Schulenberg, T.; Laurien, E.
The phenomena of flow reversal in stratified flows are investigated in a horizontal channel with application to the Emergency Core Cooling System (ECCS) in Pressurized Water Reactors (PWR). In case of a Loss-of-Coolant Accident (LOCA), coolant can be injected through a secondary pipe within the feeding line of the primary circuit, the so-called hot leg, counter-currently to the steam flow. It is essential that the coolant reaches the reactor core to prevent overheating. Due to high temperatures in such accident scenarios, steam is generated in the core, which escapes from the reactor vessel through the hot leg. In case of sufficiently high steam flow rates, only a reduced amount of coolant, or even no coolant, will be delivered to the reactor core. The WENKA test facility at the Institute for Nuclear and Energy Technologies (IKET) at Forschungszentrum Karlsruhe can investigate the fluid dynamics of two-phase flows in such scenarios. Water and air flow counter-currently in a horizontal channel made of clear acrylic glass to allow full optical access. Flow rates of water and air can be varied independently within a wide range. Once flow reversal sets in, a strong hysteresis effect must be taken into account; this was quantified during the present investigations. Local experimental data are needed to expand appropriate models of flow reversal in horizontal two-phase flow and to include them in numerical codes. Investigations are carried out by means of Particle Image Velocimetry (PIV) to obtain local flow velocities without disturbing the flow. Due to the wavy character of the flow, strong reflections at the interfacial area must be taken into account. Using fluorescent particles and an optical filter allows eliminating the reflections and recording only the signals of the particles. The challenges in conducting local investigations in stratified wavy flows by applying optical measurement techniques are discussed. Results are presented and discussed allowing
Childs, Peter R N
Rotating flow is critically important across a wide range of scientific, engineering and product applications, providing design and modeling capability for diverse products such as jet engines, pumps and vacuum cleaners, as well as geophysical flows. Developed over the course of 20 years' research into rotating fluids and associated heat transfer at the University of Sussex Thermo-Fluid Mechanics Research Centre (TFMRC), Rotating Flow is an indispensable reference and resource for all those working within the gas turbine and rotating machinery industries. Traditional fluid and flow dynamics titles offer the essential background but generally include very sparse coverage of rotating flows, which is where this book comes in. Beginning with an accessible introduction to rotating flow, recognized expert Peter Childs takes you through fundamental equations, vorticity and vortices, rotating disc flow, flow around rotating cylinders and flow in rotating cavities, with an introduction to atmospheric and oceanic circul...
Left Gastric Vein Visualization with Hepatopetal Flow Information in Healthy Subjects Using Non-Contrast-Enhanced Magnetic Resonance Angiography with Balanced Steady-State Free-Precession Sequence and Time-Spatial Labeling Inversion Pulse.
Furuta, Akihiro; Isoda, Hiroyoshi; Ohno, Tsuyoshi; Ono, Ayako; Yamashita, Rikiya; Arizono, Shigeki; Kido, Aki; Sakashita, Naotaka; Togashi, Kaori
To selectively visualize the left gastric vein (LGV) with hepatopetal flow information by non-contrast-enhanced magnetic resonance angiography under a hypothesis that change in the LGV flow direction can predict the development of esophageal varices; and to optimize the acquisition protocol in healthy subjects. Respiratory-gated three-dimensional balanced steady-state free-precession scans were conducted on 31 healthy subjects using two methods (A and B) for visualizing the LGV with hepatopetal flow. In method A, two time-spatial labeling inversion pulses (Time-SLIP) were placed on the whole abdomen and the area from the gastric fornix to the upper body, excluding the LGV area. In method B, nonselective inversion recovery pulse was used and one Time-SLIP was placed on the esophagogastric junction. The detectability and consistency of LGV were evaluated using the two methods and ultrasonography (US). Left gastric veins by method A, B, and US were detected in 30 (97%), 24 (77%), and 23 (74%) subjects, respectively. LGV flow by US was hepatopetal in 22 subjects and stagnant in one subject. All hepatopetal LGVs by US coincided with the visualized vessels in both methods. One subject with non-visualized LGV in method A showed stagnant LGV by US. Hepatopetal LGV could be selectively visualized by method A in healthy subjects.
Henry, J.B. II; Radulski, D.R.; Ellingson, E.G.; Engels, J.P.
Clean Air Capital Markets, an investment bank structuring SO2 allowance transactions, has designed two allowance value models. The first forecasts an equilibrium allowance value based on coal supply and demand. The second estimates the sulfur premium of all reported coal deliveries to utilities. Both models demonstrate that the fundamental allowance value is approximately double current spot-market prices for small volumes of off-system allowances
The regulatory treatment of compliance costs and allowances will significantly affect both the utility's CAAA compliance decisions and the cost of compliance. Sections in this chapter include ratemaking treatment of allowances, utility buy-ins, the market test of compliance costs and utility incentive, FERC account classification, measuring the value of allowances, inventory methods for allowances, expense recognition of allowances, regulatory-created assets and liabilities, and application of the FERC proposal. 8 refs., 1 tab
Lavinia Elena BRÎNDESCU OLARIU
The financing decision is taken based on expectations concerning the future cash flows generated by operating activity, which should provide coverage for the debt service and allow for an increase in shareholders' wealth. Still, future cash flows are affected by risk, which makes sensitivity analysis a very important part of the decision process. The current research sets out to evaluate the sensitivity of the payment capacity to variations in payments for raw materials and consumables. The study employs 391 forecasted yearly cash-flow statements collected from 50 companies, together with detailed information concerning the hypotheses of the forecasts. The results of the study allow benchmarks for the sensitivity of the payment capacity to be established, reveal the mechanisms through which variations in payments for raw materials and consumables impact the payment capacity, and identify possible causes of such variations.
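The sensitivity notion used in the abstract above can be made concrete with a toy calculation. The sketch below is illustrative only: the function names and all figures are hypothetical, and the study's actual definition of payment capacity may differ.

```python
# Illustrative sensitivity of payment capacity to raw-material payments.
# All names and figures are hypothetical, not taken from the study.

def payment_capacity(receipts, raw_material_payments, other_payments):
    """Cash available after operating payments, before debt service."""
    return receipts - raw_material_payments - other_payments

def sensitivity(receipts, raw_materials, other, shock):
    """Relative change in payment capacity for a relative shock
    in payments for raw materials and consumables."""
    base = payment_capacity(receipts, raw_materials, other)
    shocked = payment_capacity(receipts, raw_materials * (1 + shock), other)
    return (shocked - base) / base

# A 10% rise in raw-material payments on a thin operating margin:
s = sensitivity(receipts=1000.0, raw_materials=600.0, other=300.0, shock=0.10)
print(round(s, 3))  # -0.6: capacity falls 60% for a 10% cost rise
```

The example makes the leverage effect visible: the thinner the margin over operating payments, the larger the sensitivity coefficient, which is why benchmarking it across firms is informative.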
This discussion of the ethics of the information process provides a brief review of the process of information supply and flow, primarily in science and technology; looks at various points in the flow of information; and highlights particular ethical concerns. Facets of the process discussed in more detail include ways in which some scientists…
Altafini, C.R.; Silva Ferreira, R.T. da
The analogy between free-surface water flow and compressible fluid flow, commonly called the hydraulic analogy, is analyzed and its limitations are identified. The water table is the equipment used for this simulation, which allows quantitative analysis of subsonic and supersonic flow with a low-cost apparatus. The hydraulic analogy is applied to subsonic flow around circular cylinders and supersonic flow around cones. The results are compared with available theoretical and experimental data and good agreement is achieved. (Author)
Flow Visualization describes the most widely used methods for visualizing flows. Flow visualization evaluates certain properties of a flow field directly accessible to visual perception. Organized into five chapters, this book first presents the methods that create a visible flow pattern that can be investigated by visual inspection, such as simple dye and density-sensitive visualization methods. It then deals with the application of electron beams and streaming birefringence. Optical methods for compressible flows, hydraulic analogy, and high-speed photography are discussed in other chapters.
de Ridder, Luc; Filies, Olaf; Rodriguez, Ben; Kuijken, Aart
Through application of modern supply chain concepts in combination with state-of-the-art information technology, mask manufacturing performance and customer satisfaction can be improved radically. The AutoMOPS solution emphasizes the elimination of order verification through paperless, electronically linked information sharing and exchange between the chip design, mask production, and prototype production stages.
Monitoring of flows in sewer systems is increasingly applied to calibrate urban drainage models used for long-term simulation. However, models are most often calibrated without considering the uncertainties. The generalized likelihood uncertainty estimation (GLUE) methodology is here applied to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently the GLUE methodology has been criticised for generating prediction limits without statistical coherence and consistency, and for the subjectivity in the choice of a threshold value to distinguish "behavioural" from "non-behavioural" parameter sets. In this paper we examine how well the GLUE methodology performs when the behavioural parameter sets deduced from a calibration period are applied to generate prediction bounds in validation periods. By retaining an increasing number of parameter sets we aim at obtaining consistency between the GLUE-generated 90% prediction limits and the actual containment ratio (CR) in calibration. Due to the large uncertainties related to spatio-temporal rain variability during heavy convective rain events, flow measurement errors, possible model deficiencies, as well as epistemic uncertainties, it was not possible to obtain an overall CR of more than 80%. However, the GLUE-generated prediction limits still proved rather consistent, since the overall CRs obtained in calibration corresponded well with the overall CRs obtained in validation periods for all proportions of retained parameter sets evaluated. When focusing on wet and dry weather periods separately, some inconsistencies were however found between calibration and validation, and we address some of the reasons why we should not expect the coverage of the prediction limits to be identical in calibration and validation periods in real
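The GLUE workflow the abstract describes can be sketched in a few lines: sample parameters, score them against calibration data, retain the best-scoring "behavioural" sets, and check whether the resulting 90% prediction limits actually contain the observations. Everything below is a synthetic stand-in for the sewer model (a one-parameter rainfall-runoff relation), not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy GLUE exercise on a synthetic model q = a * rain (stand-in for the
# lumped sewer model): retain behavioural parameter sets from calibration,
# form 90% prediction limits, and compute the containment ratio (CR).
a_true = 2.0
rain_cal = rng.uniform(1, 10, 50)
obs_cal = a_true * rain_cal + rng.normal(0, 1.0, 50)

# Monte Carlo sample of the parameter, scored by sum of squared errors.
a_samples = rng.uniform(0.5, 4.0, 2000)
sse = ((a_samples[:, None] * rain_cal - obs_cal) ** 2).sum(axis=1)
behavioural = a_samples[np.argsort(sse)[:200]]  # retain the best 10%

# Prediction limits in a validation period.
rain_val = rng.uniform(1, 10, 30)
obs_val = a_true * rain_val + rng.normal(0, 1.0, 30)
sims = behavioural[:, None] * rain_val            # (200, 30) ensemble
lo, hi = np.percentile(sims, [5, 95], axis=0)     # 90% prediction limits

cr = np.mean((obs_val >= lo) & (obs_val <= hi))   # containment ratio
print(f"CR = {cr:.2f}")
```

As the paper discusses, the CR need not reach the nominal 90% when measurement error and model structural error are not represented in the parameter ensemble; retaining more parameter sets widens the limits and raises the CR.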
Renelson Ribeiro Sampaio
scale-free topology, which makes the information and knowledge flows resistant to random "attacks" (e.g., the spontaneous departure of a team member) but vulnerable to planned "attacks" (e.g., the removal of well-connected individuals, or hubs). Business competitiveness requires the company's ability to define and implement competitive strategies that will allow its survival and development in the long term - in other words, its sustainability. Among these strategies, technological innovation, associated with the development of the technical-scientific capacity of the company and therefore of the individuals who work in its various processes, has acquired a position of growing importance in corporate planning and actions. Accordingly, interpersonal relations are a critical factor for innovation processes, which must be continuously improved. The main objective of this study is to contribute to the understanding of the dynamics associated with these processes, based on a case study of the procedures and information flows generated by the Capacity Management of a large telecommunications company in Brazil. The research was carried out based on qualitative and quantitative approaches. On the one hand, from the philosophical perspective of interpretation and based on qualitative research, it was possible to study the social phenomenon related to the diffusion of knowledge among members of a team responsible for the actions and technical proposals in the capacity management of the telecom company studied. On the other hand, the quantitative approach allowed the use of mathematical models to measure properties of the networks studied (e.g., cohesion, centrality, prestige, and connectivity) and to provide their topological characterization. The present study was developed based on the model proposed by Nonaka and Takeuchi (1977) in order to analyze and describe the processes of creation and dissemination of knowledge in the organization.
Information theory has previously been used to develop metrics that allow temporal patterns in soil moisture dynamics to be characterized, and to evaluate and compare the performance of soil water flow models. The objective of this study was to apply information and complexity measures to characte...
An introduction to the problem of two-phase flows is presented. Flow regimes arising in two-phase flows are described, and a classification of these regimes is given. The structures of vertical and horizontal two-phase flows and a method of their identification using regime maps are considered. The limits of application of this method are discussed. The flooding phenomenon, the phenomenon of flow direction change (flow reversal), and the interrelation of these phenomena, as well as the transitions from slug to churn flow and from churn to annular flow in vertical flows, are described. Problems of phase transitions and equilibrium are discussed. Flow regimes in tubes carrying evaporating liquid are described.
A promising method based on fast X-ray imaging has been developed to investigate the dynamics and the structure of complex two-phase flows. It has been applied in this work to cavitating flows created inside a Venturi-type test section, thereby helping to better understand the flow inside cavitation pockets. Seeding particles were injected into the flow to trace the liquid phase. Thanks to the characteristics of the beam provided by the APS synchrotron (Advanced Photon Source, USA), high-definition X-ray images of the flow containing information on both liquid and vapour simultaneously were obtained. Velocity fields of both phases were thus calculated using image cross-correlation algorithms. Local volume fractions of vapour were also obtained from local image intensities. Beforehand, however, image processing is required to separate the phases for velocity measurements. Validation methods for all applied treatments were developed; they allowed the measurement accuracy to be characterised. This experimental technique helped us gain more insight into the dynamics of cavitating flows and, in particular, demonstrated the presence of significant slip velocities between phases. (author)
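The core of the image cross-correlation step mentioned above is finding the displacement that best aligns two successive image windows. A minimal FFT-based version on synthetic data is sketched below; production PIV codes add interrogation windowing, overlap, and sub-pixel peak fitting, none of which is shown here.

```python
import numpy as np

# Minimal cross-correlation displacement estimate between two image
# windows (synthetic data; a stand-in for one PIV interrogation step).
rng = np.random.default_rng(1)

frame_a = rng.random((64, 64))
dy, dx = 3, 5                                   # true particle shift
frame_b = np.roll(np.roll(frame_a, dy, axis=0), dx, axis=1)

# FFT-based circular cross-correlation (correlation theorem).
corr = np.fft.ifft2(np.fft.fft2(frame_a).conj() * np.fft.fft2(frame_b)).real
peak = tuple(int(i) for i in np.unravel_index(np.argmax(corr), corr.shape))
print(peak)  # (3, 5): the displacement is recovered at the correlation peak
```

Repeating this per window over the liquid-phase and vapour-phase images (after phase separation) yields the two velocity fields whose difference gives the slip velocity the abstract reports.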
Cox, George B., Jr.
Liquid-flow controller allows pressure in liquid to increase steeply with flow as flow starts, then provides more-gradual nearly linear rise of pressure with flow as flow and pressure increase beyond preset breakpoint. Controller alternative version of mechanism described in "Liquid-Flow Controller Responds To Pressure" (MFS-28329) and "Liquid-Flow Controller With Preset Break Pressure" (MFS-28330). Material cut out of cone at tip of pintle. Liquid always passes from shell, albeit at low rate. When pressure in shell great enough to force orifice away from pintle, liquid flows at greater rate.
Melentev Vladimir Anatolevich
The article considers questions connected with the general concepts and principles of document flow within the boundaries of any economic entity (an enterprise, an institution, an organization). A GOST-standardized definition of document flow and a classification of types of document streams are given. The basic principles of constructing document flow are considered; following them allows an optimal structure of the document flow and pattern of document movement to be created, taking into account the interrelation of external and internal influences. Basic elements of medical document flow are then considered, and the main problems of medical document flow are specified, along with the major factors distinguishing medical document flow from that of manufacturing enterprises and other economic entities. From the consideration of these problems the conclusion is drawn that the initial stage of their solution is the standardization of medical document flow, which is also the first stage in the creation of a common information space for the medical sector.
Zemlyanaya, N. V.; Gulyakin, A. V.
Achieving uniform flow distribution in perforated manifolds is an important problem. The efficiency of water supply, sewerage and ventilation systems is determined by the hydraulics of flow with variable mass. An extensive study of the available literature showed that achieving a uniform flow distribution through all of the outlets is almost impossible. In this work we analyze the studies conducted by other authors together with our own numerical experiments performed with the software package ANSYS 16.1. The results allowed us to formulate the main causes of non-uniform flow distribution. We propose a hypothesis to explain the static pressure rise at the end of a perforated manifold.
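The static-pressure-rise effect behind the non-uniformity can be illustrated with a deliberately crude lumped model, not the authors' ANSYS simulations: as fluid discharges through successive orifices, the axial velocity drops and (assuming ideal Bernoulli regain, no friction) the static pressure recovers, so downstream orifices discharge more. Geometry, coefficients, and the ideal-regain assumption below are all illustrative.

```python
import numpy as np

# Hypothetical lumped model of a perforated manifold with ideal static
# pressure regain and no wall friction (illustrative assumptions only).
rho = 1000.0                       # water density, kg/m^3
a_pipe, a_orifice = 1e-2, 2e-4     # pipe and orifice areas, m^2
cd, n_holes = 0.62, 10             # discharge coefficient, hole count
p_inlet = 5e3                      # inlet gauge pressure, Pa

q_holes = np.full(n_holes, 1e-3)   # initial guess, m^3/s per hole
for _ in range(200):               # fixed-point iteration to consistency
    q_axial = q_holes[::-1].cumsum()[::-1]   # flow remaining at each hole
    v = q_axial / a_pipe                     # axial velocity at each hole
    # static pressure along the pipe with ideal Bernoulli regain
    p = p_inlet + 0.5 * rho * (v[0] ** 2 - v ** 2)
    q_holes = cd * a_orifice * np.sqrt(2 * p / rho)  # orifice discharge
print((q_holes / q_holes.mean()).round(3))  # discharge rises toward dead end
```

Even this frictionless sketch reproduces the qualitative result of the abstract: the outlet nearest the closed end sees the highest static pressure and therefore the largest discharge, so uniform distribution cannot be achieved without compensating orifice sizing.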
... 42 Public Health 1 2010-10-01 2010-10-01 false Allowable cost of drugs. 50.504 Section 50.504... APPLICABILITY Maximum Allowable Cost for Drugs § 50.504 Allowable cost of drugs. (a) The maximum amount which may be expended from program funds for the acquisition of any drug shall be the lowest of (1) The...
... 46 Shipping 2 2010-10-01 2010-10-01 false Corrosion allowance. 54.25-5 Section 54.25-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PRESSURE VESSELS Construction With Carbon, Alloy, and Heat Treated Steels § 54.25-5 Corrosion allowance. The corrosion allowance...
... allowable cost. As prescribed in 2131.270, insert the following clause: Accounting and Allowable Cost (OCT... cost; (ii) Incurred with proper justification and accounting support; (iii) Determined in accordance... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Accounting and allowable...
... 45 Public Welfare 4 2010-10-01 2010-10-01 false Allowance for books. 1801.43 Section 1801.43... HARRY S. TRUMAN SCHOLARSHIP PROGRAM Payments to Finalists and Scholars § 1801.43 Allowance for books. The cost allowance for a Scholar's books is $1000 per year, or such higher amount published on the...
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Recordation of Hg allowance allocations... Times for Coal-Fired Electric Steam Generating Units Hg Allowance Tracking System § 60.4153 Recordation of Hg allowance allocations. (a) By December 1, 2006, the Administrator will record in the Hg Budget...
...; and (iii) The current realizable market value, determined as of the close of the market on the last... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Calculation of allowed net... BANKRUPTCY § 190.07 Calculation of allowed net equity. Allowed net equity shall be computed as follows: (a...
... 32 National Defense 6 2010-07-01 2010-07-01 false Depreciation and maximum allowances. 842.35... LITIGATION ADMINISTRATIVE CLAIMS Personnel Claims (31 U.S.C. 3701, 3721) § 842.35 Depreciation and maximum allowances. The military services have jointly established the “Allowance List-Depreciation Guide” to...
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Fisheries § 665.127 Allowable gear and gear restrictions. (a) American Samoa coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Island Area Fisheries § 665.627 Allowable gear and gear restrictions. (a) Coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Fisheries § 665.227 Allowable gear and gear restrictions. (a) Hawaii coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...
... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Archipelago Fisheries § 665.427 Allowable gear and gear restrictions. (a) Mariana coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp...
Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander
Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only a few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
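Computing FRET efficiencies from simulated trajectories, as described above, ultimately rests on the standard Förster relation E = 1 / (1 + (r/R0)^6). The sketch below applies it to a synthetic distance list standing in for dye-to-dye distances extracted from simulation frames; the R0 value is an arbitrary example, and the paper's actual averaging over dye orientations and frames is not reproduced.

```python
import numpy as np

# FRET efficiency from donor-acceptor distances via the Foerster relation.
# Distances and the Foerster radius here are illustrative values only.
def fret_efficiency(r, r0):
    """E = 1 / (1 + (r / r0)^6) for distance r and Foerster radius r0."""
    return 1.0 / (1.0 + (r / r0) ** 6)

r0 = 5.4                                 # example Foerster radius, nm
distances = np.array([3.0, 5.4, 8.0])    # donor-acceptor distances, nm
print(fret_efficiency(distances, r0).round(3))
```

The relation makes FRET a sensitive molecular ruler near R0: at r = R0 the efficiency is exactly 0.5, while it saturates toward 1 at short distances and falls off steeply beyond R0.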
Flow is a state in which an individual merges with his or her activity. When a person is in a flow state, he or she can develop his or her abilities and be more successful in learning. The purpose of this study is to understand the flow experience in learning among undergraduate students. The study used a qualitative case-study approach. The informant was an undergraduate student who had experienced flow. Data were collected through an interview. According to the results, the subject did not experience flow in the learning process of the kind he felt in meditation; this happened because, when he learned something, he felt pressured by tasks. It is important for individuals to relax when they are learning.
Filipiuk, Piotr; Terepeta, Michal Tomasz; Nielson, Hanne Riis
to the approach taken by Monotone Frameworks and other classical analyses. We present a generic framework for static analysis based on flow algebras and program graphs. Program graphs are often used in Model Checking to model concurrent and distributed systems. The framework allows new flow algebras to be induced...
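The monotone-framework style of analysis the fragment refers to can be sketched with a tiny worklist solver: a "flow algebra" is represented here by a join (set union) and monotone transfer functions attached to program-graph edges. The three-node program and the reaching-definitions encoding below are made up for illustration and are not the paper's formalism.

```python
# Minimal worklist solver in the spirit of Monotone Frameworks over a
# program graph; the lattice is sets of (variable, def-site) pairs and
# the join is set union (a reaching-definitions style example).

def solve(nodes, edges, init, bottom):
    """edges: list of (src, dst, transfer); transfer must be monotone."""
    analysis = {n: bottom for n in nodes}
    analysis[nodes[0]] = init                    # entry node
    worklist = list(edges)
    while worklist:
        src, dst, transfer = worklist.pop()
        new = analysis[dst] | transfer(analysis[src])   # join contribution
        if new != analysis[dst]:
            analysis[dst] = new
            worklist.extend(e for e in edges if e[0] == dst)  # re-queue
    return analysis

# Program: x := 1; y := 2; x := 3  -- one definition per edge.
edges = [
    (0, 1, lambda s: {d for d in s if d[0] != "x"} | {("x", 0)}),
    (1, 2, lambda s: s | {("y", 1)}),
    (2, 3, lambda s: {d for d in s if d[0] != "x"} | {("x", 2)}),
]
result = solve([0, 1, 2, 3], edges, frozenset(), frozenset())
print(sorted(result[3]))  # [('x', 2), ('y', 1)]: x's first def is killed
```

Monotonicity of the transfer functions and finiteness of the lattice guarantee the iteration terminates at the least fixed point, which is the classical correctness argument for this solver shape.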
Stability of Parallel Flows provides information pertinent to hydrodynamical stability. This book explores the stability problems that occur in various fields, including electronics, mechanics, oceanography, administration, economics, as well as naval and aeronautical engineering. Organized into two parts encompassing 10 chapters, this book starts with an overview of the general equations of a two-dimensional incompressible flow. This text then explores the stability of a laminar boundary layer and presents the equation of the inviscid approximation. Other chapters present the general equation
Kim, Hyun Seok; Koo, Won W. [Center for Agricultural Policy and Trade Studies, Department of Agribusiness and Applied Economics, North Dakota State University, Dept 7610, P.O. Box 6050, Fargo, ND 58103-6050 (United States)
The US carbon allowance market has different characteristics and a different price determination process from the EU ETS market, since emitting installations participate voluntarily in the emission trading scheme. This paper examines factors affecting the US carbon allowance market. An autoregressive distributed lag model is used to examine the short- and long-run relationships between the US carbon allowance market and its determinant factors. In the long run, the price of coal is the main factor in the determination of carbon allowance trading. In the short run, on the other hand, changes in crude oil and natural gas prices, as well as the coal price, have significant effects on the carbon allowance market. (author)
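The autoregressive distributed lag (ARDL) setup used in the abstract above can be illustrated on synthetic data. The sketch fits an ARDL(1,1) by ordinary least squares with the allowance price regressed on its own lag and on current and lagged coal price; the series, coefficients, and the omission of oil and gas regressors are all stand-ins, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# ARDL(1,1) sketch: p_t = b0 + b1*p_{t-1} + b2*c_t + b3*c_{t-1} + e_t,
# on synthetic "allowance" (p) and "coal price" (c) series.
n = 500
c = np.cumsum(rng.normal(0, 1, n)) + 50          # random-walk coal price
p = np.empty(n)
p[0] = 10.0
for t in range(1, n):
    p[t] = 2.0 + 0.6 * p[t - 1] + 0.5 * c[t] - 0.2 * c[t - 1] \
           + rng.normal(0, 0.5)

X = np.column_stack([np.ones(n - 1), p[:-1], c[1:], c[:-1]])
beta, *_ = np.linalg.lstsq(X, p[1:], rcond=None)

# Implied long-run coal coefficient: (b2 + b3) / (1 - b1).
long_run = (beta[2] + beta[3]) / (1 - beta[1])
print(beta.round(2), round(long_run, 2))
```

The long-run multiplier formula is what lets an ARDL regression separate short-run effects (the individual lag coefficients) from the long-run relationship the paper reports for coal; here the true value is (0.5 - 0.2)/(1 - 0.6) = 0.75.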
Jang, S Mo; Mckeever, Brooke W; Mckeever, Robert; Kim, Joon Kyoung
Despite increasing warnings about inaccurate information online, little is known about how social media contribute to the widespread diffusion of unverified health information. This study addresses this issue by examining the vaccine-autism controversy. By looking into a large dataset of Twitter, Reddit posts, and online news over 20 months in the US, Canada, and the UK, our time-series analysis shows that Twitter drives news agendas, and Reddit follows news agendas regarding the vaccine-autism debate. Additionally, the results show that both Twitter and Reddit are more likely to discuss the vaccine-autism link compared to online news content.
... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Availability of consumption allowances in addition to baseline consumption allowances for class I controlled substances. 82.10 Section 82.10... STRATOSPHERIC OZONE Production and Consumption Controls § 82.10 Availability of consumption allowances in...
Baber, C; Stanton, N A; Atkinson, J; McMaster, R; Houghton, R J
The concept of common operational pictures (COPs) is explored through the application of social network analysis (SNA) and agent-based modelling to a generic search and rescue (SAR) scenario. Comparing the command structure that might arise from standard operating procedures with the sort of structure that might arise from examining information-in-common, using SNA, shows how one structure could be more amenable to 'command' with the other being more amenable to 'control' - which is potentially more suited to complex multi-agency operations. An agent-based model is developed to examine the impact of information sharing with different forms of COPs. It is shown that networks using common relevant operational pictures (which provide subsets of relevant information to groups of agents based on shared function) could result in better sharing of information and a more resilient structure than networks that use a COP. SNA and agent-based modelling are used to compare different forms of COPs for maritime SAR operations. Different forms of COP change the communications structures in the socio-technical systems in which they operate, which has implications for future design and development of a COP.
United Nations Educational, Scientific, and Cultural Organization, Paris (France).
Recognizing that communications satellites are capable of broadcasting programs for individual or community reception, and that the Universal Declaration of Human Rights proclaims that everyone has the right to receive and impart information through any media regardless of frontiers, the following guiding principles are proclaimed: (1) Satellite…
Pan, Jing Samantha; Bingham, Ned; Bingham, Geoffrey P
Rotating a scene in a frontoparallel plane (rolling) yields a change in orientation of constituent images. When using only information provided by static images to perceive a scene after orientation change, identification performance typically decreases (Rock & Heimer, 1957). However, rolling generates optic flow information that relates the discrete, static images (before and after the change) and forms an embodied memory that aids recognition. The embodied memory hypothesis predicts that upon detecting a continuous spatial transformation of image structure, or in other words, seeing the continuous rolling process and objects undergoing rolling, observers should accurately perceive objects during and after motion. Thus, in this case, orientation change should not affect performance. We tested this hypothesis in three experiments and found that (a) using combined optic flow and image structure, participants identified locations of previously perceived but currently occluded targets with great accuracy and stability (Experiment 1); (b) using combined optic flow and image structure information, participants identified hidden targets equally well with or without 30° orientation changes (Experiment 2); and (c) when the rolling was unseen, identification of hidden targets after orientation change became worse (Experiment 3). Furthermore, when rolling was unseen, although target identification was better when participants were told about the orientation change than when they were not told, performance was still worse than when there was no orientation change. Therefore, combined optic flow and image structure information, not mere knowledge about the rolling, enables accurate and stable perception despite orientation change. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Sokolov, Andrey; Webster, Rachel; Melatos, Andrew; Kieu, Tien
High-value transactions between banks in Australia are settled in the Reserve Bank Information and Transfer System (RITS) administered by the Reserve Bank of Australia. RITS operates on a real-time gross settlement (RTGS) basis and settles payments and transfers sourced from the SWIFT payment delivery system, the Austraclear securities settlement system, and the interbank transactions entered directly into RITS. In this paper, we analyse a dataset received from the Reserve Bank of Australia that includes all interbank transactions settled in RITS on an RTGS basis during five consecutive weekdays from 19 February 2007 inclusive, a week of relatively quiescent market conditions. The source, destination, and value of each transaction are known, which allows us to separate overnight loans from other transactions (nonloans) and reconstruct monetary flows between banks for every day in our sample. We conduct a novel analysis of the flow stability and examine the connection between loan and nonloan flows. Our aim is to understand the underlying causal mechanism connecting loan and nonloan flows. We find that the imbalances in the banks' exchange settlement funds resulting from the daily flows of nonloan transactions are almost exactly counterbalanced by the flows of overnight loans. The correlation coefficient between loan and nonloan imbalances is about -0.9 on most days. Some flows that persist over two consecutive days can be highly variable, but overall the flows are moderately stable in value. The nonloan network is characterised by a large fraction of persistent flows, whereas only half of the flows persist over any two consecutive days in the loan network. Moreover, we observe an unusual degree of coherence between persistent loan flow values on Tuesday and Wednesday. We probe static topological properties of the Australian interbank network and find them consistent with those observed in other countries.
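The key empirical finding above, that overnight loan flows almost exactly counterbalance nonloan settlement imbalances, amounts to a strong negative correlation between the two per-bank imbalance vectors. The toy reconstruction below uses synthetic figures (not RITS data) purely to make the counterbalancing mechanism and its correlation signature concrete.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy illustration of loan/nonloan counterbalancing: each bank's nonloan
# settlement imbalance is offset by overnight borrowing or lending, up to
# noise, so the two imbalance vectors correlate strongly negatively.
n_banks = 20
nonloan_imbalance = rng.normal(0, 100, n_banks)   # synthetic net nonloan flows
loan_imbalance = -nonloan_imbalance + rng.normal(0, 50, n_banks)

r = np.corrcoef(nonloan_imbalance, loan_imbalance)[0, 1]
print(round(r, 2))  # strongly negative, in the spirit of the ~-0.9 reported
```

In the actual RITS data this correlation reflects banks managing their exchange settlement balances: a bank drained of funds by the day's nonloan payments borrows overnight, producing the near-exact offset the paper measures.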
Gao, Zhong-Ke; Yang, Yu-Xuan; Zhai, Lu-Sheng; Dang, Wei-Dong; Yu, Jia-Liang; Jin, Ning-De
High water cut, low velocity, vertical upward oil-water two-phase flow is a typical complex system with multiscale, unstable, and non-homogeneous features. We first measure local flow information using a distributed conductance sensor and then develop a multivariate multiscale complex network (MMCN) to reveal the dispersed oil-in-water local flow behavior. Specifically, we infer complex networks at different scales from multi-channel measurements for three typical vertical oil-in-water flow patterns. We then characterize the generated multiscale complex networks in terms of a network clustering measure. The results suggest that the clustering coefficient entropy from the MMCN not only indicates the oil-in-water flow pattern transition but also enables probing of the dynamical flow behavior governing the transitions of vertical oil-water two-phase flow.
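A generic pipeline in the spirit of the MMCN approach can be sketched as follows: coarse-grain the multichannel signals at a chosen scale, link channels whose coarse-grained series correlate strongly, and summarize the resulting network by the Shannon entropy of its clustering-coefficient distribution. The 8-channel data, threshold, and binning below are synthetic illustrations, not the authors' sensor data or exact construction.

```python
import numpy as np

rng = np.random.default_rng(4)

def coarse_grain(x, scale):
    """Non-overlapping window averages along the last axis."""
    n = (x.shape[-1] // scale) * scale
    return x[..., :n].reshape(*x.shape[:-1], -1, scale).mean(axis=-1)

def clustering(adj):
    """Per-node clustering coefficients of a boolean adjacency matrix."""
    coeffs = []
    for i in range(len(adj)):
        nbrs = np.flatnonzero(adj[i])
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = adj[np.ix_(nbrs, nbrs)].sum() / 2   # edges among neighbours
        coeffs.append(links / (k * (k - 1) / 2))
    return np.array(coeffs)

def clustering_entropy(signals, scale, threshold=0.3):
    """Shannon entropy of the clustering-coefficient distribution of a
    correlation-threshold network built from coarse-grained channels."""
    corr = np.corrcoef(coarse_grain(signals, scale))
    adj = (np.abs(corr) > threshold) & ~np.eye(len(corr), dtype=bool)
    c = clustering(adj)
    hist, _ = np.histogram(c, bins=5, range=(0, 1))
    p = hist[hist > 0] / hist.sum()
    return -(p * np.log(p)).sum()

signals = rng.normal(size=(8, 1000))        # synthetic 8-channel measurement
signals[1:4] += signals[0]                  # one correlated group of channels
print(round(clustering_entropy(signals, scale=4), 3))
```

The entropy is sensitive to how clustering is distributed across nodes, which is why a measure of this kind can separate flow patterns whose local coupling structures differ.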
Gao, Zhong-Ke; Yang, Yu-Xuan; Cai, Qing; Zhang, Shan-Shan; Jin, Ning-De
Exploring the dynamical behaviors of high water cut, low velocity oil-water flows remains a contemporary and challenging problem of significant importance. This challenge stimulated us to design a high-speed cycle motivation conductance sensor to capture spatially local flow information. We systematically carry out experiments and acquire multi-channel measurements from different oil-water flow patterns. We then develop a novel multivariate weighted recurrence network for uncovering the flow behaviors from multi-channel measurements. In particular, we exploit graph energy and the weighted clustering coefficient, in combination with multivariate time-frequency analysis, to characterize the derived complex networks. The results indicate that the network measures are very sensitive to flow transitions and allow local dynamical behaviors associated with water cut and flow velocity to be uncovered. These properties render our method particularly useful for quantitatively characterizing the dynamical behaviors governing the transition and evolution of different oil-water flow patterns.
This article, acknowledging the potentially important general attractions of the allowance for corporate equity (ACE), looks at some of its more specific implications. On corporate taxes, the article considers questions about the implied revenue-neutral rate of corporation tax (and the redistribution of the tax burden); the effects on the cash flow of both government and companies; and what would become a crucially important charge on capital gains. On income tax, the article comments on the implicati...
Flow visualization techniques are reviewed, with particular attention given to those applicable to liquid helium flows. Three techniques capable of obtaining qualitative and quantitative measurements of complex 3D flow fields are discussed, including focusing schlieren, particle image velocimetry, and holocinematography (HCV). It is concluded that HCV appears to be uniquely capable of obtaining full time-varying 3D velocity field data, but is limited to the low speeds typical of liquid helium facilities. 8 refs