WorldWideScience

Sample records for maximum information content

  1. An entropy approach for evaluating the maximum information content achievable by an urban rainfall network

    Directory of Open Access Journals (Sweden)

    E. Ridolfi

    2011-07-01

    Full Text Available Hydrological models are the basis of operational flood-forecasting systems. The accuracy of these models is strongly dependent on the quality and quantity of the input information represented by rainfall height. Finer space-time rainfall resolution results in more accurate hazard forecasting. In this framework, an optimum raingauge network is essential in predicting flood events.

    This paper develops an entropy-based approach to evaluate the maximum information content achievable by a rainfall network for different sampling time intervals. The procedure is based on the determination of the coefficients of transferred and nontransferred information and on the relative isoinformation contours.

    The nontransferred information value achieved by the whole network is strictly dependent on the sampling time intervals considered. An empirical curve is defined, to assess the objective of the research: the nontransferred information value is plotted versus the associated sampling time on a semi-log scale. The curve has a linear trend.

    In this paper, the methodology is applied to the high-density raingauge network of the urban area of Rome.
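
    As a rough illustration of the kind of quantity involved (not the paper's transferred/nontransferred coefficients or its Rome data), the sketch below estimates a nontransferred-information fraction H(X|Y)/H(X) between two synthetic rain-gauge series at several aggregation windows, using plain histogram entropy estimates; the series, bin count and windows are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(6)

# Two correlated synthetic rain-gauge series (arbitrary units per time step).
n = 50_000
common = rng.gamma(0.2, 2.0, n)
gauge_x = common + rng.gamma(0.1, 1.0, n)
gauge_y = common + rng.gamma(0.1, 1.0, n)

def nontransferred_fraction(x, y, window, bins=8):
    """H(X|Y)/H(X) for series aggregated over `window` steps (histogram estimate)."""
    m = (len(x) // window) * window
    xa = x[:m].reshape(-1, window).sum(axis=1)
    ya = y[:m].reshape(-1, window).sum(axis=1)
    joint, _, _ = np.histogram2d(xa, ya, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    h_x = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    h_y = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    h_xy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return (h_xy - h_y) / h_x        # H(X|Y) = H(X,Y) - H(Y)

for window in (1, 3, 6, 12):         # coarser and coarser sampling intervals
    frac = nontransferred_fraction(gauge_x, gauge_y, window)
    print(f"aggregation window = {window:2d} steps: nontransferred fraction = {frac:.3f}")
```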

  2. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
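
    The objective described in this abstract (classification error plus a complexity penalty, offset by a mutual-information reward) can be sketched numerically. The toy below is not the authors' algorithm: it uses a plug-in mutual information estimate on thresholded responses and a simple random local search instead of the paper's entropy-based model and gradient descent, and the data and the weights lam and gamma are illustrative only.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)

# Toy two-class data: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(+1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

def objective(w, lam=0.1, gamma=1.0):
    """Classification error + complexity penalty - gamma * mutual information."""
    response = (X @ w[:2] + w[2] > 0).astype(int)   # thresholded linear response
    error = np.mean(response != y)                  # empirical classification error
    complexity = lam * np.dot(w, w)                 # L2 penalty on the classifier
    mi = mutual_info_score(y, response)             # plug-in MI estimate (nats)
    return error + complexity - gamma * mi

# Random local search in place of the paper's gradient descent (illustrative only).
w = rng.normal(size=3)
best = objective(w)
for _ in range(2000):
    cand = w + rng.normal(scale=0.1, size=3)
    val = objective(cand)
    if val < best:
        w, best = cand, val

print("final objective:", round(best, 4), "weights:", np.round(w, 3))
```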

  3. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yi; Zhao, Shiguang; Gao, Xin

    2014-01-01

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.

  4. A mini-exhibition with maximum content

    CERN Multimedia

    Laëtitia Pedroso

    2011-01-01

    The University of Budapest has been hosting a CERN mini-exhibition since 8 May. While smaller than the main travelling exhibition it has a number of major advantages: its compact design alleviates transport difficulties and makes it easier to find suitable venues in the Member States. Its content can be updated almost instantaneously and it will become even more interactive and high-tech as time goes by.   The exhibition on display in Budapest. The purpose of CERN's new mini-exhibition is to be more interactive and easier to install. Due to its size, the main travelling exhibition cannot be moved around quickly, which is why it stays in the same country for 4 to 6 months. But this means a long waiting list for the other Member States. To solve this problem, the Education Group has designed a new exhibition, which is smaller and thus easier to install. Smaller maybe, but no less rich in content, as the new exhibition conveys exactly the same messages as its larger counterpart. However, in the slimm...

  5. AUDIT INFORMATION CONTENT

    OpenAIRE

    Ioan Rus

    2012-01-01

    The audit of computer systems shows at least two features that make the audit work not includable in other audit processes such as internal audit and financial audit. These two particularities refer to the specific software used in information systems auditing and real levels of information systems audit. This paper presents the specific levels of a system of auditing and specific techniques available for their implementation in practice. In the end the author suggests proposals for improving spec...

  6. Sequential pattern recognition by maximum conditional informativity

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří

    2014-01-01

    Roč. 45, č. 1 (2014), s. 39-45 ISSN 0167-8655 R&D Projects: GA ČR(CZ) GA14-02652S; GA ČR(CZ) GA14-10911S Keywords: Multivariate statistics * Statistical pattern recognition * Sequential decision making * Product mixtures * EM algorithm * Shannon information Subject RIV: IN - Informatics, Computer Science Impact factor: 1.551, year: 2014 http://library.utia.cas.cz/separaty/2014/RO/grim-0428565.pdf

  7. Content analysis in information flows

    Energy Technology Data Exchange (ETDEWEB)

    Grusho, Alexander A. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation); Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow (Russian Federation); Grusho, Nick A.; Timonina, Elena E. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation)

    2016-06-08

    The paper deals with architecture of content recognition system. To analyze the problem the stochastic model of content recognition in information flows was built. We proved that under certain conditions it is possible to solve correctly a part of the problem with probability 1, viewing a finite section of the information flow. That means that good architecture consists of two steps. The first step determines correctly certain subsets of contents, while the second step may demand much more time for true decision.

  8. The information content of options

    OpenAIRE

    Navon, Yonatan

    2017-01-01

    The objective of this thesis is to examine the information content of stock options in financial markets. A key question in financial economics is how information diffuses across markets and how quickly it is reflected in security prices. This thesis aims at exploring this question by investigating the informational role that options play in financial markets. This is achieved by exploring the joint cross section of option and bond prices, the informational role of options in seasoned equity ...

  9. Content dependent information flow control

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming

    2017-01-01

    Information flow control extends access control by not only regulating who is allowed to access what data but also the subsequent use of the data. Applications within communications systems require such information flow control to be dependent on the actual contents of the data. We develop...

  10. Modelling information flow along the human connectome using maximum flow.

    Science.gov (United States)

    Lyoo, Youngwook; Kim, Jieun E; Yoon, Sujung

    2018-01-01

    The human connectome is a complex network that transmits information between interlinked brain regions. Using graph theory, previously well-known network measures of integration between brain regions have been constructed under the key assumption that information flows strictly along the shortest paths possible between two nodes. However, it is now apparent that information does flow through non-shortest paths in many real-world networks such as cellular networks, social networks, and the internet. In the current hypothesis, we present a novel framework using the maximum flow to quantify information flow along all possible paths within the brain, so as to implement an analogy to network traffic. We hypothesize that the connection strengths of brain networks represent a limit on the amount of information that can flow through the connections per unit of time. This allows us to compute the maximum amount of information flow between two brain regions along all possible paths. Using this novel framework of maximum flow, previous network topological measures are expanded to account for information flow through non-shortest paths. The most important advantage of the current approach using maximum flow is that it can integrate the weighted connectivity data in a way that better reflects the real information flow of the brain network. The current framework and its concept of maximum flow provide insight into how network structure shapes information flow, in contrast to graph theory, and suggest future applications such as investigating structural and functional connectomes at a neuronal level. Copyright © 2017 Elsevier Ltd. All rights reserved.
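
    A minimal sketch of the core computation (treating connection strengths as capacities and computing maximum flow between two regions) is shown below; the toy graph and its weights are invented for illustration and are not the connectome data used in the study.

```python
import networkx as nx

# Toy weighted "connectome": connection strengths act as capacities that limit
# how much information can flow through each connection per unit time.
edges = [("A", "B", 3.0), ("A", "C", 1.0), ("B", "C", 1.5), ("B", "D", 2.0), ("C", "D", 2.5)]

G = nx.DiGraph()
for u, v, w in edges:
    G.add_edge(u, v, capacity=w)
    G.add_edge(v, u, capacity=w)   # undirected connection -> capacity in both directions

# Maximum information flow between regions A and D over all possible paths,
# not only the shortest one.
flow_value, flow_dict = nx.maximum_flow(G, "A", "D", capacity="capacity")
print("maximum flow A -> D:", flow_value)
```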

  11. Information content of poisson images

    International Nuclear Information System (INIS)

    Cederlund, J.

    1979-04-01

    One major problem when producing images with the aid of Poisson distributed quanta is how best to compromise between spatial and contrast resolution. Increasing the number of image elements improves spatial resolution, but at the cost of fewer quanta per image element, which reduces contrast resolution. Information theory arguments are used to analyse this problem. It is argued that information capacity is a useful concept to describe an important property of the imaging device, but that in order to compute the information content of an image produced by this device some statistical properties (such as the a priori probability of the densities) of the object to be depicted must be taken into account. If these statistical properties are not known one cannot make a correct choice between spatial and contrast resolution. (author)

  12. Optimal item discrimination and maximum information for logistic IRT models

    NARCIS (Netherlands)

    Veerkamp, W.J.J.; Veerkamp, Wim J.J.; Berger, Martijn P.F.; Berger, Martijn

    1999-01-01

    Items with the highest discrimination parameter values in a logistic item response theory model do not necessarily give maximum information. This paper derives discrimination parameter values, as functions of the guessing parameter and distances between person parameters and item difficulty, that

  13. Optimum detection for extracting maximum information from symmetric qubit sets

    International Nuclear Information System (INIS)

    Mizuno, Jun; Fujiwara, Mikio; Sasaki, Masahide; Akiba, Makoto; Kawanishi, Tetsuya; Barnett, Stephen M.

    2002-01-01

    We demonstrate a class of optimum detection strategies for extracting the maximum information from sets of equiprobable real symmetric qubit states of a single photon. These optimum strategies have been predicted by Sasaki et al. [Phys. Rev. A 59, 3325 (1999)]. The peculiar aspect is that detections with at least three outputs suffice for optimum extraction of information regardless of the number of signal elements. The cases of ternary (or trine), quinary, and septenary polarization signals are studied where a standard von Neumann detection (a projection onto a binary orthogonal basis) fails to access the maximum information. Our experiments demonstrate that it is possible with present technologies to attain about 96% of the theoretical limit.

  14. Maximum mass-particle velocities in Kantor's information mechanics

    International Nuclear Information System (INIS)

    Sverdlik, D.I.

    1989-01-01

    Kantor's information mechanics links phenomena previously regarded as not treatable by a single theory. It is used here to calculate the maximum velocities υ_m of single particles. For the electron, υ_m/c ∼ 1 - 1.253814 × 10^-77. The maximum υ_m corresponds to υ_m/c ∼ 1 - 1.097864 × 10^-122 for a single mass particle with a rest mass of 3.078496 × 10^-5 g. This is the fastest that matter can move. Either information mechanics or classical mechanics can be used to show that υ_m is less for heavier particles. That υ_m is less for lighter particles can be deduced from an information mechanics argument alone

  15. The maximum entropy production and maximum Shannon information entropy in enzyme kinetics

    Science.gov (United States)

    Dobovišek, Andrej; Markovič, Rene; Brumen, Milan; Fajmut, Aleš

    2018-04-01

    We demonstrate that the maximum entropy production principle (MEPP) serves as a physical selection principle for the description of the most probable non-equilibrium steady states in simple enzymatic reactions. A theoretical approach is developed, which enables maximization of the density of entropy production with respect to the enzyme rate constants for the enzyme reaction in a steady state. Mass and Gibbs free energy conservations are considered as optimization constraints. The optimal enzyme rate constants computed in this way for a steady state also yield the most uniform probability distribution of the enzyme states. This accounts for the maximal Shannon information entropy. By means of the stability analysis it is also demonstrated that maximal density of entropy production in that enzyme reaction requires a flexible enzyme structure, which enables rapid transitions between different enzyme states. These results are supported by an example, in which density of entropy production and Shannon information entropy are numerically maximized for the enzyme Glucose Isomerase.

  16. Water Quality Assessment and Total Maximum Daily Loads Information (ATTAINS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality Assessment TMDL Tracking And Implementation System (ATTAINS) stores and tracks state water quality assessment decisions, Total Maximum Daily Loads...

  17. INFORMATION SYSTEMS AUDIT CURRICULA CONTENT MATCHING

    Directory of Open Access Journals (Sweden)

    Vasile-Daniel CARDOȘ

    2014-11-01

    Full Text Available Financial and internal auditors must cope with the challenge of performing their mission in a technology-enhanced environment. In this article we match the information technology description found in the International Federation of Accountants (IFAC) and the Institute of Internal Auditors (IIA) curricula against the Model Curriculum issued by the Information Systems Audit and Control Association (ISACA). By reviewing these three curricula, we matched the content in the ISACA Model Curriculum with the IFAC International Education Practice Statement 2 and the IIA's Global Model Internal Audit Curriculum. In the IFAC and IIA curricula there are 16 content elements, out of 19 possible, whose descriptions match the ISACA Model Curriculum's content. We noticed that a candidate who graduates from an IFAC- or IIA-compliant program acquires IS auditing competences similar to the specific content of the ISACA Model Curriculum but less than the requirements for a professional information systems auditor.

  18. Cancer Patients' Informational Needs: Qualitative Content Analysis.

    Science.gov (United States)

    Heidari, Haydeh; Mardani-Hamooleh, Marjan

    2016-12-01

    Understanding the informational needs of cancer patients is a requirement to plan any educative care program for them. The aim of this study was to identify Iranian cancer patients' perceptions of informational needs. The study took a qualitative approach. Semi-structured interviews were held with 25 cancer patients in two teaching hospitals in Iran. Transcripts of the interviews underwent conventional content analysis, and categories were extracted. The results came under two main categories: disease-related informational needs and information needs related to daily life. Disease-related informational needs had two subcategories: obtaining information about the nature of disease and obtaining information about disease prognosis. Information needs related to daily life also had two subcategories: obtaining information about healthy lifestyle and obtaining information about regular activities of daily life. The findings provide deep understanding of cancer patients' informational needs in Iran.

  19. Network Inference and Maximum Entropy Estimation on Information Diagrams

    Czech Academy of Sciences Publication Activity Database

    Martin, E.A.; Hlinka, Jaroslav; Meinke, A.; Děchtěrenko, Filip; Tintěra, J.; Oliver, I.; Davidsen, J.

    2017-01-01

    Roč. 7, č. 1 (2017), č. článku 7062. ISSN 2045-2322 R&D Projects: GA ČR GA13-23940S; GA MZd(CZ) NV15-29835A Grant - others:GA MŠk(CZ) LO1611 Institutional support: RVO:67985807 Keywords: complex networks * mutual information * entropy maximization * fMRI Subject RIV: BD - Theory of Information OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 4.259, year: 2016

  20. Network Inference and Maximum Entropy Estimation on Information Diagrams

    Czech Academy of Sciences Publication Activity Database

    Martin, E.A.; Hlinka, J.; Meinke, A.; Děchtěrenko, Filip; Tintěra, J.; Oliver, I.; Davidsen, J.

    2017-01-01

    Roč. 7, č. 1 (2017), s. 1-15, č. článku 7062. ISSN 2045-2322 R&D Projects: GA ČR GA13-23940S Institutional support: RVO:68081740 Keywords : complex networks * mutual information * entropy maximization * fMRI Subject RIV: AN - Psychology OBOR OECD: Cognitive sciences Impact factor: 4.259, year: 2016

  1. INFORMATION SYSTEMS AUDIT CURRICULA CONTENT MATCHING

    OpenAIRE

    Vasile-Daniel CARDOȘ; Ildikó Réka CARDOȘ

    2014-01-01

    Financial and internal auditors must cope with the challenge of performing their mission in technology enhanced environment. In this article we match the information technology description found in the International Federation of Accountants (IFAC) and the Institute of Internal Auditors (IIA) curricula against the Model Curriculum issued by the Information Systems Audit and Control Association (ISACA). By reviewing these three curricula, we matched the content in the ISACA Model Curriculum wi...

  2. An assessment of information communication technology content ...

    African Journals Online (AJOL)

    An assessment of information communication technology content, context and ... a-vis the upscaling of ICT in health care facilities in Nairobi and Machakos counties. ... high in all the facilities compared to levels of services operations computerised and ...

  3. On the information content of discrete phylogenetic characters.

    Science.gov (United States)

    Bordewich, Magnus; Deutschmann, Ina Maria; Fischer, Mareike; Kasbohm, Elisa; Semple, Charles; Steel, Mike

    2017-12-16

    Phylogenetic inference aims to reconstruct the evolutionary relationships of different species based on genetic (or other) data. Discrete characters are a particular type of data, which contain information on how the species should be grouped together. However, it has long been known that some characters contain more information than others. For instance, a character that assigns the same state to each species groups all of them together and so provides no insight into the relationships of the species considered. At the other extreme, a character that assigns a different state to each species also conveys no phylogenetic signal. In this manuscript, we study a natural combinatorial measure of the information content of an individual character and analyse properties of characters that provide the maximum phylogenetic information, particularly, the number of states such a character uses and how the different states have to be distributed among the species or taxa of the phylogenetic tree.

  4. INFORMATION CONTENT (MUATAN INFORMASI) OF DIVIDEND POLICY

    Directory of Open Access Journals (Sweden)

    Endang Raino Wirjono

    2016-11-01

    Full Text Available This article aims to describe the information content of dividend policy, especially for foretelling earnings growth. It has been observed that a dividend increase generally leads to an increase in the price of a stock, while a dividend cut generally leads to a stock price decline. However, many market observers point to the very high fraction of earnings retained (or low dividend payout ratio) as a sign that future earnings growth will be well above historical norms. In the real world, many complications exist that could confound the expected inverse relationship between current payouts and future earnings growth. Keywords: dividend policy, earnings growth, payout ratio

  5. Information content of household-stratified epidemics

    Directory of Open Access Journals (Sweden)

    T.M. Kinyanjui

    2016-09-01

    Full Text Available Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs.
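
    A minimal sketch of the entropy criterion (not the household transmission model or the Bayesian machinery of the paper): two hypothetical designs are compared by the plug-in Shannon entropy of posterior samples for a single parameter; the stand-in posteriors, bin edges and sample sizes are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def entropy_bits(samples, bin_edges):
    """Plug-in Shannon entropy (bits) of a 1-D posterior sample via a fixed-bin histogram."""
    counts, _ = np.histogram(samples, bins=bin_edges)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Stand-in posteriors for a transmission parameter under two study designs:
# a tighter posterior means the design carried more information about the parameter.
posterior_design_A = rng.normal(0.5, 0.05, 10_000)   # e.g. intensive follow-up
posterior_design_B = rng.normal(0.5, 0.15, 10_000)   # e.g. sparse follow-up

bin_edges = np.linspace(0.0, 1.0, 51)                # common bins for a fair comparison
hA = entropy_bits(posterior_design_A, bin_edges)
hB = entropy_bits(posterior_design_B, bin_edges)
print(f"posterior entropy: design A = {hA:.2f} bits, design B = {hB:.2f} bits")
print("more informative design (lower posterior entropy):", "A" if hA < hB else "B")
```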

  6. Information content of household-stratified epidemics.

    Science.gov (United States)

    Kinyanjui, T M; Pellis, L; House, T

    2016-09-01

    Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Content Sharing Based on Personal Information in Virtually Secured Space

    Science.gov (United States)

    Sohn, Hosik; Ro, Yong Man; Plataniotis, Kostantinos N.

    User generated contents (UGC) are shared in an open space like social media where users can upload and consume contents freely. Since the access of contents is not restricted, the contents could be delivered to unwanted users or sometimes misused. In this paper, we propose a method for sharing UGCs securely based on the personal information of users. With the proposed method, a virtual secure space is created for contents delivery. The virtual secure space allows the UGC creator to deliver contents to users who have similar personal information and they can consume the contents without any leakage of personal information. In order to verify the usefulness of the proposed method, an experiment was performed in which the content was encrypted with the personal information of its creator, and users with similar personal information decrypted and consumed the contents. The results showed that UGCs were securely shared among users who have similar personal information.

  8. The information content of cosmic microwave background anisotropies

    Science.gov (United States)

    Scott, Douglas; Contreras, Dagoberto; Narimani, Ali; Ma, Yin-Zhe

    2016-06-01

    The cosmic microwave background (CMB) contains perturbations that are close to Gaussian and isotropic. This means that its information content, in the sense of the ability to constrain cosmological models, is closely related to the number of modes probed in CMB power spectra. Rather than making forecasts for specific experimental setups, here we take a more pedagogical approach and ask how much information we can extract from the CMB if we are only limited by sample variance. We show that, compared with temperature measurements, the addition of E-mode polarization doubles the number of modes available out to a fixed maximum multipole, provided that all of the TT, TE, and EE power spectra are measured. However, the situation in terms of constraints on particular parameters is more complicated, as we explain and illustrate graphically. We also discuss the enhancements in information that can come from adding B-mode polarization and gravitational lensing. We show how well one could ever determine the basic cosmological parameters from CMB data compared with what has been achieved with Planck, which has already probed a substantial fraction of the TT information. Lastly, we look at constraints on neutrino mass as a specific example of how lensing information improves future prospects beyond the current 6-parameter model.
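
    The mode-counting argument in the abstract is easy to reproduce: a sample-variance-limited experiment measures (2ℓ+1) independent a_ℓm coefficients per multipole, and adding E-mode polarization (with TT, TE and EE all measured) doubles that count out to a fixed maximum multipole. The sketch below simply counts modes for an arbitrary ℓ_max; the value 2000 is an illustrative choice, not taken from the paper.

```python
# Number of sample-variance-limited CMB modes up to a maximum multipole l_max.
l_max = 2000                                               # illustrative choice
modes_T = sum(2 * ell + 1 for ell in range(2, l_max + 1))  # temperature a_lm only
modes_TE = 2 * modes_T                                     # adding E-mode polarization doubles it

print("temperature-only modes:", modes_T)
print("temperature + E-mode modes:", modes_TE)
```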

  9. The information content of cosmic microwave background anisotropies

    International Nuclear Information System (INIS)

    Scott, Douglas; Contreras, Dagoberto; Narimani, Ali; Ma, Yin-Zhe

    2016-01-01

    The cosmic microwave background (CMB) contains perturbations that are close to Gaussian and isotropic. This means that its information content, in the sense of the ability to constrain cosmological models, is closely related to the number of modes probed in CMB power spectra. Rather than making forecasts for specific experimental setups, here we take a more pedagogical approach and ask how much information we can extract from the CMB if we are only limited by sample variance. We show that, compared with temperature measurements, the addition of E-mode polarization doubles the number of modes available out to a fixed maximum multipole, provided that all of the TT, TE, and EE power spectra are measured. However, the situation in terms of constraints on particular parameters is more complicated, as we explain and illustrate graphically. We also discuss the enhancements in information that can come from adding B-mode polarization and gravitational lensing. We show how well one could ever determine the basic cosmological parameters from CMB data compared with what has been achieved with Planck, which has already probed a substantial fraction of the TT information. Lastly, we look at constraints on neutrino mass as a specific example of how lensing information improves future prospects beyond the current 6-parameter model.

  10. Constructing valid density matrices on an NMR quantum information processor via maximum likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Harpreet; Arvind; Dorai, Kavita, E-mail: kavita@iisermohali.ac.in

    2016-09-07

    Estimation of quantum states is an important step in any quantum information processing experiment. A naive reconstruction of the density matrix from experimental measurements can often give density matrices which are not positive, and hence not physically acceptable. How do we ensure that at all stages of reconstruction, we keep the density matrix positive? Recently a method has been suggested based on maximum likelihood estimation, wherein the density matrix is guaranteed to be positive definite. We experimentally implement this protocol on an NMR quantum information processor. We discuss several examples and compare with the standard method of state estimation. - Highlights: • State estimation using maximum likelihood method was performed on an NMR quantum information processor. • Physically valid density matrices were obtained every time in contrast to standard quantum state tomography. • Density matrices of several different entangled and separable states were reconstructed for two and three qubits.
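
    A minimal single-qubit sketch of the positivity-preserving idea (not the authors' NMR pipeline or their multi-qubit states): the density matrix is parameterized as T†T/tr(T†T), which is positive by construction, and the parameters are fitted to simulated Pauli expectation values by minimizing a Gaussian-noise negative log-likelihood; the example state and noise level are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Pauli operators and a "true" single-qubit state used to simulate noisy measurements.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [X, Y, Z]

rho_true = 0.5 * (I2 + 0.6 * X + 0.3 * Z)                 # a valid mixed state
rng = np.random.default_rng(2)
data = [np.real(np.trace(rho_true @ P)) + rng.normal(0, 0.05) for P in paulis]

def rho_from_params(t):
    """Map 4 real parameters to a positive, unit-trace density matrix via T†T / tr(T†T)."""
    T = np.array([[t[0], 0], [t[2] + 1j * t[3], t[1]]], dtype=complex)
    R = T.conj().T @ T
    return R / np.real(np.trace(R))

def neg_log_likelihood(t):
    # Up to constants, a Gaussian-noise likelihood: squared residuals of expectations.
    rho = rho_from_params(t)
    return sum((np.real(np.trace(rho @ P)) - m) ** 2 for P, m in zip(paulis, data))

res = minimize(neg_log_likelihood, x0=[1.0, 1.0, 0.0, 0.0], method="Nelder-Mead")
rho_mle = rho_from_params(res.x)
print("reconstructed state:\n", np.round(rho_mle, 3))
print("eigenvalues (non-negative by construction):", np.round(np.linalg.eigvalsh(rho_mle), 4))
```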

  11. The Content-Provider Paradox: Universities in the Information Ecosystem.

    Science.gov (United States)

    Vaidhyanathan, Siva

    2002-01-01

    Asserts that universities' rush to abandon their role as "national parks" in the information ecosystem in favor of becoming profitable "content providers" has led to a paradox: to generate new knowledge, researchers and teachers need broad content freedom, but the role of content provider requires highly restrictive policies to…

  12. Information Content of Mutual Fund Portfolio Disclosure

    NARCIS (Netherlands)

    Y. Wang (Yu)

    2011-01-01

    textabstractAcademic financial economists have been keenly interested in the value of active portfolio management since the seminal paper of Jensen (1968). This book examines the information advantages that active mutual fund managers attain in financial markets through an analysis of disclosed fund

  13. Measuring the Information Content of Stock Trades.

    OpenAIRE

    Hasbrouck, Joel

    1991-01-01

    This paper suggests that the interactions of security trades and quote revisions be modeled as a vector autoregressive system. Within this framework, a trade's information effect may be meaningfully measured as the ultimate price impact of the trade innovation. Estimates for a sample of NYSE issues suggest a trade's full price impact arrives only with a protracted lag; the impact is a positive and concave function of the trade size; large trades cause the spread to widen; trades occurring in ...

  14. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    Science.gov (United States)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = −∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.

  15. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    Directory of Open Access Journals (Sweden)

    Rodrigo Cofré

    2018-01-01

    Full Text Available The spiking activity of neuronal networks follows laws that are not time-reversal symmetric; the notion of pre-synaptic and post-synaptic neurons, stimulus correlations and noise correlations have a clear time order. Therefore, a biologically realistic statistical model for the spiking activity should be able to capture some degree of time irreversibility. We use the thermodynamic formalism to build a framework in the context of maximum entropy models to quantify the degree of time irreversibility, providing an explicit formula for the information entropy production of the inferred maximum entropy Markov chain. We provide examples to illustrate our results and discuss the importance of time irreversibility for modeling the spike train statistics.
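
    For a concrete feel for the quantity, the sketch below evaluates the standard entropy production rate of a stationary Markov chain, Σ_ij π_i P_ij log[(π_i P_ij)/(π_j P_ji)], which vanishes exactly for time-reversible chains; the 3-state transition matrix is invented for illustration rather than inferred from spike trains as in the paper.

```python
import numpy as np

# Illustrative 3-state transition matrix (rows sum to 1). In the paper this role is
# played by the maximum entropy Markov chain inferred from spike-train statistics.
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.6, 0.3],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P associated with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()

# Entropy production rate: sum_{i,j} pi_i P_ij log[(pi_i P_ij) / (pi_j P_ji)].
# It is zero if and only if detailed balance holds (a time-reversible chain).
ep = 0.0
for i in range(3):
    for j in range(3):
        if P[i, j] > 0 and P[j, i] > 0:
            ep += pi[i] * P[i, j] * np.log((pi[i] * P[i, j]) / (pi[j] * P[j, i]))

print("stationary distribution:", np.round(pi, 3))
print("entropy production rate:", round(ep, 4), "nats per step")
```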

  16. Maximum mutual information vector quantization of log-likelihood ratios for memory efficient HARQ implementations

    DEFF Research Database (Denmark)

    Danieli, Matteo; Forchhammer, Søren; Andersen, Jakob Dahl

    2010-01-01

    Modern mobile telecommunication systems, such as 3GPP LTE, make use of Hybrid Automatic Repeat reQuest (HARQ) for efficient and reliable communication between base stations and mobile terminals. To this purpose, marginal posterior probabilities of the received bits are stored in the form of log-likelihood ratios (LLRs) ... analysis leads to using maximum mutual information (MMI) as the optimality criterion and in turn Kullback-Leibler (KL) divergence as the distortion measure. Simulations based on an LTE-like system have shown that VQ can be implemented in a computationally simple way at low rates of 2-3 bits per LLR value...
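
    A toy version of the idea (BPSK over AWGN, a 2-bit symmetric scalar quantizer whose threshold is chosen to maximize the empirical mutual information between the transmitted bit and the quantizer output) is sketched below; it is not the vector quantizer design of the paper, and the modulation, noise level and threshold grid are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma = 200_000, 1.0

# BPSK over AWGN: bit b in {0,1} -> symbol s = 1 - 2b; LLR = 2*y / sigma^2.
b = rng.integers(0, 2, n)
y = (1 - 2 * b) + rng.normal(0.0, sigma, n)
llr = 2 * y / sigma**2

def mutual_info_bits(bits, q):
    """Empirical mutual information I(B;Q) in bits from joint counts."""
    joint = np.zeros((2, q.max() + 1))
    np.add.at(joint, (bits, q), 1.0)
    joint /= joint.sum()
    pb = joint.sum(axis=1, keepdims=True)
    pq = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return np.sum(joint[mask] * np.log2(joint[mask] / (pb @ pq)[mask]))

# 2-bit symmetric quantizer with thresholds [-t, 0, +t]; choose t maximizing I(B;Q).
best_t, best_mi = None, -1.0
for t in np.linspace(0.5, 6.0, 56):
    q = np.digitize(llr, [-t, 0.0, t])
    mi = mutual_info_bits(b, q)
    if mi > best_mi:
        best_t, best_mi = t, mi

print(f"MMI threshold t = {best_t:.2f}, preserved I(B;Q) = {best_mi:.3f} bits per LLR")
```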

  17. Extraction of Information of Audio-Visual Contents

    Directory of Open Access Journals (Sweden)

    Carlos Aguilar

    2011-10-01

    Full Text Available In this article we show how it is possible to use Channel Theory (Barwise and Seligman, 1997) for modeling the process of information extraction realized by audiences of audio-visual contents. To do this, we rely on the concepts proposed by Channel Theory and, especially, its treatment of representational systems. We then show how the information that an agent is capable of extracting from the content depends on the number of channels he is able to establish between the content and the set of classifications he is able to discriminate. The agent can undertake the extraction of information through these channels from the totality of content; however, we discuss the advantages of extracting from its constituents in order to obtain a greater number of informational items that represent it. After showing how the extraction process is carried out for each channel, we propose a method of representation of all the informative values an agent can obtain from a content using a matrix constituted by the channels the agent is able to establish on the content (source classifications), and the ones he can understand as individual (destination classifications). We finally show how this representation allows us to reflect the evolution of the informative items through the evolution of audio-visual content.

  18. 78 FR 23918 - Request for Information Regarding Third Party Testing for Lead Content, Phthalate Content, and...

    Science.gov (United States)

    2013-04-23

    ... CONSUMER PRODUCT SAFETY COMMISSION [Docket No. CPSC 2011-0081] Request for Information Regarding Third Party Testing for Lead Content, Phthalate Content, and the Solubility of the Eight Elements Listed in ASTM F963-11 Correction In notice document 2013-8858 appearing on pages 22518-22520 in the issue...

  19. The dependence of human reliability upon task information content

    International Nuclear Information System (INIS)

    Hermanson, E.M.; Golay, M.W.

    1994-09-01

    The role of human error in safety mishaps is an important factor in system design. As systems become increasingly complex the capacity of the human to deal with the added complexity is diminished. It is therefore crucial to understand the relationship between system complexity and human reliability so that systems may be built in such a way as to minimize human error. One way of understanding this relationship is to quantify system complexity and then measure the human reaction in response to situations of varying complexity. The quantification of system complexity may be performed by determining the information content present in the tasks that the human must execute. The purpose of this work is therefore to build and perform a consistent experiment which will determine the extent to which human reliability depends upon task information content. Two main conclusions may be drawn from this work. The first is that human reliability depends upon task information content. Specifically, as the information content contained in a task increases, the capacity of a human to deal successfully with the task decreases monotonically. Here the definition of total success is the ability to complete the task at hand fully and correctly. Furthermore, there exists a value of information content below which a human can deal with the task successfully, but above which the success of an individual decreases monotonically with increasing information. These ideas should be generalizable to any model where system complexity can be clearly and consistently defined

  20. Information content versus word length in random typing

    International Nuclear Information System (INIS)

    Ferrer-i-Cancho, Ramon; Moscoso del Prado Martín, Fermín

    2011-01-01

    Recently, it has been claimed that a linear relationship between a measure of information content and word length is expected from word length optimization, and it has been shown that this linearity is supported by a strong correlation between information content and word length in many languages (Piantadosi et al 2011 Proc. Nat. Acad. Sci. 108 3825). Here, we study in detail some connections between this measure and standard information theory. The relationship between the measure and word length is studied for the popular random typing process where a text is constructed by pressing keys at random from a keyboard containing letters and a space behaving as a word delimiter. Although this random process does not optimize word lengths according to information content, it exhibits a linear relationship between information content and word length. The exact slope and intercept are presented for three major variants of the random typing process. A strong correlation between information content and word length can simply arise from the units making a word (e.g., letters) and not necessarily from the interplay between a word and its context as proposed by Piantadosi and co-workers. In itself, the linear relation does not entail the results of any optimization process. (letter)
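
    The random typing process is easy to simulate, and doing so illustrates the claim: even without any optimization, word length and a simple information content measure (here −log2 of a word type's empirical probability, a crude stand-in for the contextual measure discussed above) are strongly correlated. The alphabet size, text length and measure are choices made for this sketch, not taken from the letter.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(4)

# Random typing: press one of 26 letters or the space bar uniformly at random;
# spaces act as word delimiters.
keys = list("abcdefghijklmnopqrstuvwxyz ")
text = "".join(rng.choice(keys, size=2_000_000))
words = [w for w in text.split(" ") if w]

counts = Counter(words)
total = sum(counts.values())

# Information content of a word type: -log2 of its empirical probability.
lengths, info = [], []
for word, count in counts.items():
    lengths.append(len(word))
    info.append(-np.log2(count / total))

r = np.corrcoef(lengths, info)[0, 1]
print(f"word types: {len(counts)}, correlation(length, information content) = {r:.3f}")
```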

  1. Mainstreaming African Local Content in the Information Society: The ...

    African Journals Online (AJOL)

    The global transition to the information and knowledge society requires that every country contribute its local content to the burgeoning global information infrastructure. African states and the African continent as a whole have much to offer in form of indigenous knowledge and scholarly research. However, it has been very ...

  2. The Spatial Information Content of the Honey Bee Waggle Dance

    Directory of Open Access Journals (Sweden)

    Roger Schürch

    2015-03-01

    Full Text Available In 1954, Haldane and Spurway published a paper in which they discussed the information content of the honey bee waggle dance with regard to the ideas of Norbert Wiener, who had recently developed a formal theory of information. We return to this concept by reanalyzing the information content in both vector components (direction, distance) of the waggle dance using recent empirical data from a study that investigated the accuracy of the dance. Our results show that the direction component conveys 2.9 bits and the distance component 4.5 bits of information, which agrees to some extent with Haldane and Spurway's estimates that were based on data gathered by von Frisch. Of course, these are small amounts of information compared to what can be conveyed, given enough time, by human language, or compared to what is routinely transferred via the internet. Nevertheless, small amounts of information can be very valuable if it is the right information. The receivers of this information, the nestmate bees, know how to react adaptively so that the value of the information is not negated by its low information content.
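
    One standard way to arrive at figures of this kind is to take the entropy of a uniform prior over the possible range of a dance component and subtract the entropy of the (roughly Gaussian) error with which recruits decode it. The sketch below applies that recipe with made-up accuracy values; the range and σ numbers are illustrative assumptions, not the measurements behind the 2.9 and 4.5 bit estimates.

```python
import numpy as np

def info_bits(prior_range, sigma):
    """Entropy of a uniform prior over `prior_range` minus the entropy of a Gaussian
    error with standard deviation `sigma` (same units), i.e. bits gained by a recruit."""
    h_prior = np.log2(prior_range)
    h_error = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
    return h_prior - h_error

# Illustrative accuracy values only (NOT the values measured in the study):
print("direction:", round(info_bits(360.0, 15.0), 2), "bits")     # degrees around the circle
print("distance :", round(info_bits(6000.0, 100.0), 2), "bits")   # metres, assumed foraging range
```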

  3. Principle of maximum Fisher information from Hardy's axioms applied to statistical systems.

    Science.gov (United States)

    Frieden, B Roy; Gatenby, Robert A

    2013-10-01

    Consider a finite-sized, multidimensional system in parameter state a. The system is either at statistical equilibrium or general nonequilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N = max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N = max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I = I_max. This is important because many physical laws have been derived, assuming as a working hypothesis that I = I_max. These derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations include the De Broglie wave hypothesis, quantum wave equations, Maxwell's equations, new laws of biology (e.g., of Coulomb force-directed cell development and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I = I_max itself derives from suitably extended Hardy axioms thereby eliminates its need to be assumed in these derivations. Thus, uses of I = I_max and EPI express physics at its most fundamental level, its axiomatic basis in math.
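
    As a small numerical illustration of the quantity being extremized (not of the Hardy-axiom derivation itself), the sketch below evaluates the Fisher information of a translation family, I = ∫ p′(x)²/p(x) dx, for a Gaussian density, where the analytic answer is 1/σ²; the grid and σ are arbitrary choices.

```python
import numpy as np

# Fisher information of a translation family p(x - a): I = integral of p'(x)^2 / p(x) dx.
# For a Gaussian with standard deviation sigma the analytic value is 1 / sigma^2.
sigma = 2.0
x = np.linspace(-20.0, 20.0, 200_001)
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

dp = np.gradient(p, x)                  # numerical derivative p'(x)
integrand = dp**2 / p
I = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x))   # trapezoidal rule

print(f"numerical I = {I:.6f}, analytic 1/sigma^2 = {1 / sigma**2:.6f}")
```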

  4. The energy content of restaurant foods without stated calorie information.

    Science.gov (United States)

    Urban, Lorien E; Lichtenstein, Alice H; Gary, Christine E; Fierstein, Jamie L; Equi, Ashley; Kussmaul, Carolyn; Dallal, Gerard E; Roberts, Susan B

    2013-07-22

    National recommendations for the prevention and treatment of obesity emphasize reducing energy intake through self-monitoring food consumption. However, little information is available on the energy content of foods offered by nonchain restaurants, which account for approximately 50% of restaurant locations in the United States. The objective was to measure the energy content of foods from independent and small-chain restaurants that do not provide stated information on energy content. We used bomb calorimetry to determine the dietary energy content of the 42 most frequently purchased meals from the 9 most common restaurant categories. Independent and small-chain restaurants were randomly selected, and 157 individual meals were analyzed. The setting was the area within 15 miles of downtown Boston; participants were a random sample of independent and small-chain restaurants; the main outcome measure was dietary energy. All meal categories provided excessive dietary energy. The mean energy content of individual meals was 1327 (95% CI, 1248-1406) kcal, equivalent to 66% of typical daily energy requirements. We found a significant effect of food category on meal energy (P ≤ .05), and 7.6% of meals provided more than 100% of typical daily energy requirements. Within-meal variability was large (average SD, 271 kcal), and we found no significant effect of restaurant establishment or size. In addition, meal energy content averaged 49% greater than that of popular meals from the largest national chain restaurants (P ...). Chain restaurants have been criticized for offering meals with excess dietary energy. This study finds that independent and small-chain restaurants, which provide no nutrition information, also provide excessive dietary energy in amounts apparently greater than popular meals from chain restaurants or information in national food databases. A national requirement for accurate calorie labeling in all restaurants may discourage menus offering unhealthy portions and would allow consumers to make informed choices about ordering meals that promote weight

  5. Information Content of Aerosol Retrievals in the Sunglint Region

    Science.gov (United States)

    Ottaviani, M.; Knobelspiesse, K.; Cairns, B.; Mishchenko, M.

    2013-01-01

    We exploit quantitative metrics to investigate the information content in retrievals of atmospheric aerosol parameters (with a focus on single-scattering albedo), contained in multi-angle and multi-spectral measurements with sufficient dynamical range in the sunglint region. The simulations are performed for two classes of maritime aerosols with optical and microphysical properties compiled from measurements of the Aerosol Robotic Network. The information content is assessed using the inverse formalism and is compared to that deriving from observations not affected by sunglint. We find that there indeed is additional information in measurements containing sunglint, not just for single-scattering albedo, but also for aerosol optical thickness and the complex refractive index of the fine aerosol size mode, although the amount of additional information varies with aerosol type.
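
    One common way to put a number on "information content" in this kind of retrieval problem is the Shannon information content of a linearized Bayesian retrieval, H = ½ log2 det(S_a Ŝ⁻¹), with prior covariance S_a, measurement-noise covariance S_e, Jacobian K and posterior covariance Ŝ = (Kᵀ S_e⁻¹ K + S_a⁻¹)⁻¹. The sketch below uses random, made-up Jacobians (extra rows standing in for the additional sunglint geometries) rather than the study's forward model, so only the mechanics, not the numbers, are meaningful.

```python
import numpy as np

def information_content_bits(K, S_a, S_e):
    """Shannon information content H = 0.5 * log2 det(S_a @ inv(S_hat)) of a
    linearized retrieval y = K x + noise, with prior S_a and noise covariance S_e."""
    S_hat = np.linalg.inv(K.T @ np.linalg.inv(S_e) @ K + np.linalg.inv(S_a))
    _, logdet = np.linalg.slogdet(S_a @ np.linalg.inv(S_hat))
    return 0.5 * logdet / np.log(2.0)

rng = np.random.default_rng(5)

# Illustrative 3-parameter state (say optical thickness, single-scattering albedo,
# fine-mode refractive index) and made-up Jacobians; numbers are not from the study.
K_no_glint = rng.normal(0.0, 1.0, (6, 3))                        # 6 ordinary viewing geometries
K_glint = np.vstack([K_no_glint, rng.normal(0.0, 1.0, (3, 3))])  # 3 extra sunglint geometries
S_a = np.diag([0.5, 0.1, 0.05]) ** 2                             # prior variances

print("H without glint:", round(information_content_bits(K_no_glint, S_a, 0.01 * np.eye(6)), 2), "bits")
print("H with glint   :", round(information_content_bits(K_glint, S_a, 0.01 * np.eye(9)), 2), "bits")
```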

  6. Bayesian inference with information content model check for Langevin equations

    DEFF Research Database (Denmark)

    Krog, Jens F. C.; Lomholt, Michael Andersen

    2017-01-01

    The Bayesian data analysis framework has been proven to be a systematic and effective method of parameter inference and model selection for stochastic processes. In this work we introduce an information content model check which may serve as a goodness-of-fit, like the chi-square procedure...

  7. Comfort and Content: Considerations for Informal Science Professional Development

    Science.gov (United States)

    Holliday, Gary M.; Lederman, Norman G.; Lederman, Judith S.

    2014-01-01

    This study looked at a life science course that was offered at and taught by education staff of a large informal science institution (ISI) located in the Midwest. The curriculum, materials, and agendas for the course were developed by education staff and complemented a permanent life science exhibition. The researcher developed a content test…

  8. Information content in reflected global navigation satellite system signals

    DEFF Research Database (Denmark)

    Høeg, Per; Carlstrom, Anders

    2011-01-01

    The direct signals from satellites in global navigation satellite systems (GNSS), such as GPS, GLONASS and GALILEO, constitute the primary source for positioning, navigation and timing from space. But the reflected GNSS signals also contain important information on the signal travel...

  9. Content-Based Information Retrieval from Forensic Databases

    NARCIS (Netherlands)

    Geradts, Z.J.M.H.

    2002-01-01

    In forensic science, the number of image databases is growing rapidly. For this reason, it is necessary to have a proper procedure for searching in these images databases based on content. The use of image databases results in more solved crimes; furthermore, statistical information can be obtained

  10. 37 CFR 1.98 - Content of information disclosure statement.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Content of information disclosure statement. 1.98 Section 1.98 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing...

  11. The changing information environment for nanotechnology: online audiences and content

    International Nuclear Information System (INIS)

    Anderson, Ashley A.; Brossard, Dominique; Scheufele, Dietram A.

    2010-01-01

    The shift toward online communication in all realms, from print newspapers to broadcast television, has implications for how the general public consumes information about nanotechnology. The goal of this study is threefold: to investigate who is using online sources for information and news about science and nanotechnology, to examine what the general public is searching for online with regards to nanotechnology, and to analyze what they find in online content of nanotechnology. Using survey data, we find those who report the Internet as their primary source of science and technology news are diverse in age, more knowledgeable about science and nanotechnology, highly educated, male, and more diverse racially than users of other media. In a comparison of demographic data on actual visits by online users to general news and science Web sites, science sites attracted more male, non-white users from the Western region of the United States than news sites did. News sites, on the other hand, attracted those with a slightly higher level of education. Our analysis of published estimates of keyword searches on nanotechnology reveals that people are turning to the Internet with keyword searches related to the future, health, and applications of nanotechnology. A content analysis of online content reveals health content dominates overall. Comparisons of content across different types of sites (blogs, government, and general sites) are conducted.

  12. The changing information environment for nanotechnology: online audiences and content

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Ashley A., E-mail: aaanderson3@wisc.edu; Brossard, Dominique; Scheufele, Dietram A. [University of Wisconsin-Madison, Department of Life Sciences Communication (United States)

    2010-05-15

    The shift toward online communication in all realms, from print newspapers to broadcast television, has implications for how the general public consumes information about nanotechnology. The goal of this study is threefold: to investigate who is using online sources for information and news about science and nanotechnology, to examine what the general public is searching for online with regards to nanotechnology, and to analyze what they find in online content of nanotechnology. Using survey data, we find those who report the Internet as their primary source of science and technology news are diverse in age, more knowledgeable about science and nanotechnology, highly educated, male, and more diverse racially than users of other media. In a comparison of demographic data on actual visits by online users to general news and science Web sites, science sites attracted more male, non-white users from the Western region of the United States than news sites did. News sites, on the other hand, attracted those with a slightly higher level of education. Our analysis of published estimates of keyword searches on nanotechnology reveals that people are turning to the Internet with keyword searches related to the future, health, and applications of nanotechnology. A content analysis of online content reveals health content dominates overall. Comparisons of content across different types of sites (blogs, government, and general sites) are conducted.

  13. Content and accuracy of vaccine information on pediatrician blogs.

    Science.gov (United States)

    Bryan, Mersine A; Gunningham, Hailey; Moreno, Megan A

    2018-01-29

    Parents often use social media such as blogs to inform decisions about vaccinations; however, little is known about pediatrician blogs addressing vaccines. The objective of this study was to assess content, citations, audience engagement and accuracy of vaccine information on pediatrician blogs. We conducted a content analysis of vaccine information on pediatrician blogs. A national sample of pediatrician blogs was identified using a search rubric of terms applied to multiple search engines. Inclusion criteria were: (1) the writer identified as a pediatrician, (2) US based, and (3) ≥1 post since 1/1/2014. We identified 84 blogs; 56 fit inclusion criteria. Data were collected on all posts mentioning vaccines from 1/1/14 to 2/28/15. We identified the major topic for each post, examined citations to determine sources of information and counted the number of comments per post to evaluate audience engagement. We assessed accuracy of vaccine information using evaluation criteria adapted from information for parents on the CDC website. We identified 324 unique blog posts containing information about vaccines on 31 pediatrician blogs. The most common major topic was vaccine-specific posts (36%); Influenza and MMR were the most prevalent. Other common topics included: activism against anti-vaccine information (21%), vaccine exemptions (10%), autism (8%), and vaccine safety (6%). Activism against anti-vaccine information was the topic with the most reader engagement. The most common sources cited were governmental organizations such as the CDC and WHO (34%), and medical journals (31%). All blogs except 2 included information that was consistent with CDC information. Pediatrician bloggers frequently address vaccinations; most provide accurate information. Pediatrician blogs may be a new source to provide vaccine education to parents via social media. Copyright © 2017. Published by Elsevier Ltd.

  14. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    Science.gov (United States)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.

  15. Breastfeeding information in pharmacology textbooks: a content analysis.

    Science.gov (United States)

    Amir, Lisa H; Raval, Manjri; Hussainy, Safeera Y

    2013-07-01

    Women often need to take medicines while breastfeeding, and pharmacists need to provide accurate information in order to avoid undue caution about the compatibility of medicines and breastfeeding. The objective of this study was to review the information provided about breastfeeding in commonly used pharmacology textbooks. We asked 15 Australian universities teaching pharmacy courses to provide a list of recommended pharmacology textbooks in 2011. Ten universities responded, generating a list of 11 textbooks that we analysed for content relating to breastfeeding. Pharmacology textbooks outline the mechanisms of action of medicines and their use; however, only a small emphasis is placed on the safety/compatibility of medicines for women during breastfeeding. Current pharmacology textbooks recommended by Australian universities have significant gaps in their coverage of medicine use in breastfeeding. Authors of textbooks should address this gap, so that academic staff can recommend texts with the best lactation content.

  16. Virtual assistant: Enhancing content acquisition by eliciting information from humans

    OpenAIRE

    Ozeki, Motoyuki; Maeda, Shunichi; Obata, Kanako; Nakamura, Yuichi

    2009-01-01

    In this paper, we propose the "Virtual Assistant," a novel framework for supporting knowledge capture in videos. The Virtual Assistant is an artificial agent that simulates a human assistant of the kind shown in TV programs and prompts users to provide feedback by asking questions. This framework ensures that sufficient information is provided in the captured content while users interact in a natural and enjoyable way with the agent. We developed a prototype agent based on a chatbot-like approach and ...

  17. Separation of Stochastic and Deterministic Information from Seismological Time Series with Nonlinear Dynamics and Maximum Entropy Methods

    International Nuclear Information System (INIS)

    Gutierrez, Rafael M.; Useche, Gina M.; Buitrago, Elias

    2007-01-01

    We present a procedure developed to detect the stochastic and deterministic information contained in empirical time series, useful for characterizing and modelling different aspects of the complex phenomena represented by such data. This procedure is applied to a seismological time series to obtain new information for studying and understanding geological phenomena. We use concepts and methods from nonlinear dynamics and maximum entropy. The method allows an optimal analysis of the available information.

  18. A content relevance model for social media health information.

    Science.gov (United States)

    Prybutok, Gayle Linda; Koh, Chang; Prybutok, Victor R

    2014-04-01

    Consumer health informatics includes the development and implementation of Internet-based systems to deliver health risk management information and health intervention applications to the public. The application of consumer health informatics to educational and interventional efforts such as smoking reduction and cessation has garnered attention from both consumers and health researchers in recent years. Scientists believe that smoking avoidance or cessation before the age of 30 years can prevent more than 90% of smoking-related cancers and that individuals who stop smoking fare as well in preventing cancer as those who never start. The goal of this study was to determine factors that were most highly correlated with content relevance for health information provided on the Internet for a study group of 18- to 30-year-old college students. Data analysis showed that the opportunity for convenient entertainment, social interaction, health information-seeking behavior, time spent surfing on the Internet, the importance of available activities on the Internet (particularly e-mail), and perceived site relevance for Internet-based sources of health information were significantly correlated with content relevance for 18- to 30-year-old college students, an educated subset of this population segment.

  19. Bayesian Maximum Entropy prediction of soil categories using a traditional soil map as soft information.

    NARCIS (Netherlands)

    Brus, D.J.; Bogaert, P.; Heuvelink, G.B.M.

    2008-01-01

    Bayesian Maximum Entropy was used to estimate the probabilities of occurrence of soil categories in the Netherlands, and to simulate realizations from the associated multi-point pdf. Besides the hard observations (H) of the categories at 8369 locations, the soil map of the Netherlands 1:50 000 was

  20. Information management for high content live cell imaging

    Directory of Open Access Journals (Sweden)

    White Michael RH

    2009-07-01

    Full Text Available Abstract Background: High content live cell imaging experiments are able to track the cellular localisation of labelled proteins in multiple live cells over a time course. Experiments using high content live cell imaging will generate multiple large datasets that are often stored in an ad-hoc manner. This hinders identification of previously gathered data that may be relevant to current analyses. Whilst solutions exist for managing image data, they are primarily concerned with storage and retrieval of the images themselves and not the data derived from the images. There is therefore a requirement for an information management solution that facilitates the indexing of experimental metadata and results of high content live cell imaging experiments. Results: We have designed and implemented a data model and information management solution for the data gathered through high content live cell imaging experiments. Many of the experiments to be stored measure the translocation of fluorescently labelled proteins from cytoplasm to nucleus in individual cells. The functionality of this database has been enhanced by the addition of an algorithm that automatically annotates results of these experiments with the timings of translocations and periods of any oscillatory translocations as they are uploaded to the repository. Testing has shown the algorithm to perform well with a variety of previously unseen data. Conclusion: Our repository is a fully functional example of how high throughput imaging data may be effectively indexed and managed to address the requirements of end users. By implementing the automated analysis of experimental results, we have provided a clear impetus for individuals to ensure that their data forms part of that which is stored in the repository. Although focused on imaging, the solution provided is sufficiently generic to be applied to other functional proteomics and genomics experiments. The software is available from: http://code.google.com/p/livecellim/

  1. CONTENT OF FINANCIAL STATEMENTS AND THEIR INFORMATIVE VALENCES FOR STAKEHOLDERS

    Directory of Open Access Journals (Sweden)

    MIRON Vasile Cristian Ioachim

    2015-06-01

    Full Text Available The qualitative characteristics of accounting information are of major importance in underpinning stakeholders' decisions and satisfying their interests. The financial statements, by their nature, provide synthetic information which shows the financial position and its modifications, the economic performance of the entity, the management of resources and other aspects that lead to rational decisions. Stakeholders' interests are complex and sometimes divergent, which is why the content of the financial statements must be adapted in order to meet these interests. The present research analyzes how the information presented in the financial statements responds to the needs of the stakeholders. The analysis showed that there are some significant aspects for which the informational power of the financial statements is reduced. Also, using econometric processing, we have constructed a function that characterizes the correlation between the financial profitability of the entities operating in the energy sector and the profitability obtained in the stock exchange market. The conclusions of the research allowed us to propose some measures for improving the information in the financial statements, in order to create an adequate informational basis for the decisions of all categories of stakeholders.

  2. Modeling Information Content Via Dirichlet-Multinomial Regression Analysis.

    Science.gov (United States)

    Ferrari, Alberto

    2017-01-01

    Shannon entropy is being increasingly used in biomedical research as an index of complexity and information content in sequences of symbols, e.g. languages, amino acid sequences, DNA methylation patterns and animal vocalizations. Yet, the distributional properties of information entropy as a random variable have seldom been the object of study, leading researchers mainly to use linear models or simulation-based analytical approaches to assess differences in information content when entropy is measured repeatedly in different experimental conditions. Here a method to perform inference on entropy in such conditions is proposed. Building on results from studies in the field of Bayesian entropy estimation, a symmetric Dirichlet-multinomial regression model, able to deal efficiently with the issue of mean entropy estimation, is formulated. Through a simulation study the model is shown to outperform linear modeling in a vast range of scenarios and to have promising statistical properties. As a practical example, the method is applied to a data set coming from a real experiment on animal communication.
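    As a minimal, hypothetical illustration of the quantities this record builds on (not the authors' regression model), the sketch below computes the plug-in Shannon entropy of a symbol sequence and a simple Dirichlet-smoothed variant, showing how a symmetric Dirichlet prior regularises small-sample counts. The alphabet, prior strength and example sequence are all made up for illustration.

```python
import numpy as np
from collections import Counter

def plugin_entropy(symbols):
    """Maximum-likelihood (plug-in) Shannon entropy in bits: H = -sum_k p_k log2 p_k."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def dirichlet_smoothed_entropy(symbols, alphabet_size, alpha=1.0):
    """Entropy of the posterior-mean distribution under a symmetric Dirichlet(alpha) prior.

    A deliberately simple stand-in for fuller Bayesian entropy estimators; it only
    shows how Dirichlet smoothing pulls small-sample frequencies toward uniformity.
    """
    counts = Counter(symbols)
    n = len(symbols)
    p = np.array([(counts.get(k, 0) + alpha) / (n + alpha * alphabet_size)
                  for k in range(alphabet_size)])
    return float(-(p * np.log2(p)).sum())

# Hypothetical 4-letter alphabet (symbols coded 0..3), short and strongly skewed sequence.
rng = np.random.default_rng(0)
seq = rng.choice(4, size=50, p=[0.7, 0.1, 0.1, 0.1])
print(plugin_entropy(seq), dirichlet_smoothed_entropy(seq, alphabet_size=4))
```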

  3. INFORMATION CONTENT OF EXOPLANETARY TRANSIT SPECTRA: AN INITIAL LOOK

    International Nuclear Information System (INIS)

    Line, Michael R.; Zhang Xi; Yung, Yuk L.; Vasisht, Gautam; Natraj, Vijay; Chen Pin

    2012-01-01

    It has been shown that spectroscopy of transiting extrasolar planets can potentially provide a wealth of information about their atmospheres. Herein, we set up the inverse problem in spectroscopic retrieval. We use nonlinear optimal estimation to retrieve the atmospheric state (pioneered for Earth sounding by Rodgers). The formulation quantifies the degrees of freedom and information content of the spectrum with respect to geophysical parameters; herein, we focus specifically on temperature and composition. First, we apply the technique to synthetic near-infrared spectra and explore the influence of spectral signal-to-noise ratio and resolution (the two important parameters when designing a future instrument) on the information content of the data. As expected, we find that the number of retrievable parameters increases with increasing signal-to-noise ratio and resolution, although the gains quickly level off for large values. Second, we apply the methods to the previously studied dayside near-infrared emission spectrum of HD 189733b and compare the results of our retrieval with those obtained by others.
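    The degrees of freedom and information content mentioned above have standard expressions in Rodgers-style optimal estimation. The sketch below evaluates those textbook formulas for a linearised forward model with Jacobian K, measurement-noise covariance S_e and prior covariance S_a; the toy inputs are hypothetical and this is not the authors' retrieval code.

```python
import numpy as np

def retrieval_diagnostics(K, S_e, S_a):
    """Degrees of freedom and Shannon information content of a linear(ised) retrieval.

    Averaging kernel   A  = (K^T S_e^-1 K + S_a^-1)^-1 K^T S_e^-1 K
    Degrees of freedom d_s = trace(A)
    Information content  H = -0.5 * ln det(I - A)   (converted to bits below)
    """
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    KtSeK = K.T @ Se_inv @ K
    A = np.linalg.solve(KtSeK + Sa_inv, KtSeK)          # averaging kernel
    dofs = float(np.trace(A))
    _, logdet = np.linalg.slogdet(np.eye(A.shape[0]) - A)
    H_bits = -0.5 * logdet / np.log(2.0)
    return dofs, H_bits

# Hypothetical toy problem: 10 spectral channels, 3 retrieved parameters.
rng = np.random.default_rng(1)
K = rng.normal(size=(10, 3))
dofs, H_bits = retrieval_diagnostics(K, S_e=0.25 * np.eye(10), S_a=np.eye(3))
print(dofs, H_bits)   # d_s <= 3; both grow with signal-to-noise ratio and resolution
```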

  4. Optimization of information properties of NAA with respect to information content and profitability of results

    International Nuclear Information System (INIS)

    Obrusnik, I.; Eckschlager, K.

    1986-01-01

    Information properties of analytical results, together with other important parameters, especially economic ones, can be used for the optimization of analytical procedures. Therefore, we have proposed a computational technique for the optimization of multielement neutron activation analysis (NAA) based on the information content and profitability. The optimization starts with the prediction of the γ-ray spectra to be expected during analysis under given experimental conditions (sample size, irradiation, decay and counting times, etc.) and with the calculation of detection and determination limits. In the next step, the information contents for the determination of particular elements and for the simultaneous determination of element groups are computed. The information content depends on, or is closely connected with, such properties of the method as selectivity, sensitivity, precision and accuracy and, as in other cases of trace analysis, also with the detection limit. Then, the information profitability (IP), taking into account the information content and relevance (appreciation of specific information according to its contribution to the solution of a given problem) together with economic aspects, can be calculated. This function can be used for the optimization of a particular NAA procedure, for the mutual comparison of different variants of NAA and also for comparison with other analytical methods. The use of information profitability for the optimization of NAA is shown with a practical example: the INAA analysis of urban particulate matter SRM 1648 produced by NBS (USA). (author)

  5. Information Theoretical Analysis of Identification based on Active Content Fingerprinting

    OpenAIRE

    Farhadzadeh, Farzad; Willems, Frans M. J.; Voloshinovskiy, Sviatoslav

    2014-01-01

    Content fingerprinting and digital watermarking are techniques that are used for content protection and distribution monitoring. Over the past few years, both techniques have been well studied and their shortcomings understood. Recently, a new content fingerprinting scheme called active content fingerprinting was introduced to overcome these shortcomings. Active content fingerprinting aims to modify content to extract more robust fingerprints than conventional content fingerprinting....

  6. Information Content Moderates Positivity and Negativity Biases in Memory

    Science.gov (United States)

    Hess, Thomas M.; Popham, Lauren E.; Dennis, Paul A.; Emery, Lisa

    2014-01-01

    Two experiments examined the impact of encoding conditions and information content in memory for positive, neutral, and negative pictures. We examined the hypotheses that the positivity effect in memory (i.e., a bias in favor of positive or against negative information in later life) would be reduced when (a) pictures were viewed under structured as opposed to unstructured conditions, and (b) pictures contained social as opposed to nonsocial content. Both experiments found that the positivity effect observed with nonsocial stimuli was absent with social stimuli. In addition, little evidence was obtained that encoding conditions affected the strength of the positivity effect. We argue that some types of social stimuli may engage different types of processing than nonsocial stimuli, perhaps encouraging self-referential processing that engages attention and supports memory. This processing may then conflict with the goal-driven, top-down processing that is hypothesized to drive the positivity effect. Thus, our results identify further boundary conditions associated with the positivity effect in memory, arguing that stimulus factors as well as situational goals may affect its occurrence. Further research is needed to determine whether this effect extends to all social stimuli or only to specific subsets. PMID:23421322

  7. Information content moderates positivity and negativity biases in memory.

    Science.gov (United States)

    Hess, Thomas M; Popham, Lauren E; Dennis, Paul A; Emery, Lisa

    2013-09-01

    Two experiments examined the impact of encoding conditions and information content in memory for positive, neutral, and negative pictures. We examined the hypotheses that the positivity effect in memory (i.e., a bias in favor of positive or against negative information in later life) would be reduced when (a) pictures were viewed under structured as opposed to unstructured conditions, and (b) pictures contained social as opposed to nonsocial content. Both experiments found that the positivity effect observed with nonsocial stimuli was absent with social stimuli. In addition, little evidence was obtained that encoding conditions affected the strength of the positivity effect. We argue that some types of social stimuli may engage different types of processing than nonsocial stimuli, perhaps encouraging self-referential processing that engages attention and supports memory. This processing may then conflict with the goal-driven, top-down processing that is hypothesized to drive the positivity effect. Thus, our results identify further boundary conditions associated with the positivity effect in memory, arguing that stimulus factors as well as situational goals may affect its occurrence. Further research is needed to determine whether this effect extends to all social stimuli or only to specific subsets.

  8. The nature and psychological content of information psychological impact

    Directory of Open Access Journals (Sweden)

    Evgeny G. Baranov

    2017-03-01

    Full Text Available The paper presents the results of a theoretical analysis of the category of "information-psychological impact". The study aims to determine the role and place of impacts of this kind in the upbringing process, and in education in general. The paper presents a comparative analysis of existing scientific approaches to understanding the nature and psychological content of the concepts of "information" and "psychological impact". Based on this analysis, the conclusion is made that psychological impact is the influence of surrounding elements of the physical and social environment on people, which changes the course of their mental processes, mental state, psychological structure of consciousness and behaviour. In addition, a purposeful psychological impact, carried out either by an individual or a collective entity, can be direct or indirect (e.g. information-psychological). Based on the performed analysis, the conclusion is made that, depending on their purpose and nature of influence, information-psychological impacts can be manipulative (subject-object) or developmental (subject-subject). Manipulative impact creates temporary, unstable mental forms, while developmental impact creates stable personality forms. Both kinds of information-psychological influence can be observed in the educational process. The teacher selects the type of influence based on his/her own pedagogical qualifications and teaching objectives: to develop the personality of the student or to form behavioural stereotypes.

  9. Is online information on ecstasy tablet content safe?

    Science.gov (United States)

    Vrolijk, Ruben Q; Brunt, Tibor M; Vreeker, Annabel; Niesink, Raymond J M

    2017-01-01

    In recent years, the prevalence of ecstasy use has increased in most European countries. Users can acquire information on ecstasy tablet composition through the internet. This study compares online information from two websites, Pillreports and Partyflock, to the validated Dutch Drugs Information and Monitoring System (DIMS) database, and aims to measure its accuracy and potential danger or value. The drug-related information posted on Pillreports.net and Partyflock.nl between 1 January 2014 and 31 December 2015 was investigated for accuracy and several information characteristics such as picture inclusion and dose range inclusion. In total, 471 informatory statements on ecstasy tablet content were analysed relative to the Dutch ecstasy market. Informatory statements on the content of specific ecstasy tablets were scored as 'too high' or 'too low' if their concentrations deviated > 10 mg from the entries in the DIMS database within a 12-week time-frame, and scored as 'dangerous' if their concentration was > 40 mg too low. Unreported substances were scored as 'dangerous' if listed as an illegal or dangerous substance in the DIMS database and if present in relevant quantities. Also scored were the report characteristics 'picture inclusion', 'spread inclusion' and 'website source', which were tested for their association with report safety/danger. On average, reports on ecstasy tablets from Pillreports and Partyflock show concentrations which are 10.6 mg too high [95% confidence interval (CI) = 6.7-14.4]. Qualitatively, 39.7% of the reports scored as 'too high' (95% CI = 35.2-44.4), 17.6% scored as 'too low' (95% CI = 14.0-21.2) and 15.5% had 'unreported substances' (95% CI = 12.3-18.9), resulting overall in 15.3% of the reports being scored as 'dangerous' (95% CI = 11.9-18.5). The report characteristic 'spread inclusion' associated inversely with report danger [Exp(b) = 0.511, 95% CI = 0.307-0.850, P = 0.01]. Information from the popular

  10. Information content in B→VV decays and the angular moments method

    International Nuclear Information System (INIS)

    Dighe, A.; Sen, S.

    1998-10-01

    The time-dependent angular distributions of decays of neutral B mesons into two vector mesons contain information about the lifetimes, mass differences, strong and weak phases, form factors, and CP violating quantities. A statistical analysis of the information content is performed by giving the "information" a quantitative meaning. It is shown that for some parameters of interest, the information content in time and angular measurements combined may be orders of magnitude more than the information from time measurements alone and hence the angular measurements are highly recommended. The method of angular moments is compared with the (maximum) likelihood method to find that it works almost as well in the region of interest for the one-angle distribution. For the complete three-angle distribution, an estimate of possible statistical errors expected on the observables of interest is obtained. It indicates that the three-angle distribution, unraveled by the method of angular moments, would be able to nail down many quantities of interest and will help in pointing unambiguously to new physics. (author)

  11. The Acoustic Structure and Information Content of Female Koala Vocal Signals

    Science.gov (United States)

    Charlton, Benjamin D.

    2015-01-01

    Determining the information content of animal vocalisations can give valuable insights into the potential functions of vocal signals. The source-filter theory of vocal production allows researchers to examine the information content of mammal vocalisations by linking variation in acoustic features with variation in relevant physical characteristics of the caller. Here I used a source-filter theory approach to classify female koala vocalisations into different call-types, and determine which acoustic features have the potential to convey important information about the caller to other conspecifics. A two-step cluster analysis classified female calls into bellows, snarls and tonal rejection calls. Additional results revealed that female koala vocalisations differed in their potential to provide information about a given caller’s phenotype that may be of importance to receivers. Female snarls did not contain reliable acoustic cues to the caller’s identity and age. In contrast, female bellows and tonal rejection calls were individually distinctive, and the tonal rejection calls of older female koalas had consistently lower mean, minimum and maximum fundamental frequency. In addition, female bellows were significantly shorter in duration and had higher fundamental frequency, formant frequencies, and formant frequency spacing than male bellows. These results indicate that female koala vocalisations have the potential to signal the caller’s identity, age and sex. I go on to discuss the anatomical basis for these findings, and consider the possible functional relevance of signalling this type of information in the koala’s natural habitat. PMID:26465340

  12. Assessment of the information content of patterns: an algorithm

    Science.gov (United States)

    Daemi, M. Farhang; Beurle, R. L.

    1991-12-01

    A preliminary investigation confirmed the possibility of assessing the translational and rotational information content of simple artificial images. The calculation is tedious, and for more realistic patterns it is essential to implement the method on a computer. This paper describes an algorithm developed for this purpose which confirms the results of the preliminary investigation. Use of the algorithm facilitates much more comprehensive analysis of the combined effect of continuous rotation and fine translation, and paves the way for analysis of more realistic patterns. Owing to the volume of calculation involved in these algorithms, extensive computing facilities were necessary. The major part of the work was carried out using an ICL 3900 series mainframe computer as well as other powerful workstations such as a RISC architecture MIPS machine.

  13. Extracting maximum petrophysical and geological information from a limited reservoir database

    Energy Technology Data Exchange (ETDEWEB)

    Ali, M.; Chawathe, A.; Ouenes, A. [New Mexico Institute of Mining and Technology, Socorro, NM (United States)] [and others]

    1997-08-01

    The characterization of old fields lacking sufficient core and log data is a challenging task. This paper describes a methodology that uses new and conventional tools to build a reliable reservoir model for the Sulimar Queen field. At the fine scale, permeability measured on a fine grid with a minipermeameter was used in conjunction with the petrographic data collected on multiple thin sections. The use of regression analysis and a newly developed fuzzy logic algorithm led to the identification of key petrographic elements which control permeability. At the log scale, old gamma ray logs were first rescaled/calibrated throughout the entire field for consistency and reliability using only four modern logs. Using data from one cored well and the rescaled gamma ray logs, correlations between core porosity, permeability, total water content and gamma ray were developed to complete the small scale characterization. At the reservoir scale, outcrop data and the rescaled gamma logs were used to define the reservoir structure over an area of ten square miles where only 36 wells were available. Given the structure, the rescaled gamma ray logs were used to build the reservoir volume by identifying the flow units and their continuity. Finally, history-matching results constrained to the primary production were used to estimate the dynamic reservoir properties, such as relative permeabilities, to complete the characterization. The obtained reservoir model was tested by forecasting the waterflood performance, which was in good agreement with the actual performance.

  14. Getting maximum information from incomplete data on B → charmonium-KS decays

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1989-01-01

    Tests of CP violation using B decays into CP eigenstates can be improved by using events normally rejected because of incomplete information. A search for lepton asymmetry in the decays Υ(4S) → B + B̄ → (K_S + J/ψ) + (lepton± + X) can be improved by including other (c c̄)K_S events where the (c c̄) pair is not bound in a J/ψ but in some other state like ψ' or η_c, and where the lepton asymmetry is predicted to be the same as for (K_S + J/ψ); other (c c̄)K_S events which are not fully reconstructed; and (c c̄)K_L events where the K_L is not detected and which are predicted to have the opposite lepton asymmetry from the corresponding K_S events. The information from these additional events can give improved statistics if suitable cuts can be found to improve the signal/noise ratio. The opposite asymmetry predicted for K_L events can be used to test for spurious lepton asymmetries due to systematic errors. 3 refs

  15. Design Issues and Information Contents of the Provincial Government Websites of Indonesia: A Content Analysis on Visual Messages

    Directory of Open Access Journals (Sweden)

    Achmad Syarief

    2009-07-01

    Full Text Available A website does not merely act as an object for displaying information; it also represents a contextual medium of communication through visuals and content. The interplay of website design elements builds up meanings that affect users beyond what previous communication practices have uncovered. Previous research acknowledges that visuals and content have significant effects in attracting users' attention and trust. Thus, the ability of a website to provide credible information to target users through visuals and content plays a great role in its success. However, although a considerable number of studies on website design have been performed, research aimed at understanding the characteristics of sites' visual appearance and information content for the purpose of promoting local investment in Indonesia has been very limited. This paper addresses the visual design issues and information content of eighteen provincial government websites of Indonesia. Through content analysis, the paper comparatively examines the visual appearance, information content, and functions of each website, in order to determine the visual characteristics and content that suit the purpose of promoting local potential. The paper focuses on the commonalities, discrepancies, and patterns of content, and provides suggestions to improve the design of the provincial government websites of Indonesia.

  16. A maximum information utilization approach in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Papp, T.; Maxwell, J.A.; Papp, A.T.

    2009-01-01

    X-ray fluorescence data bases have significant contradictions and inconsistencies. We have identified that the main source of the contradictions, after the human factors, is rooted in the signal processing approaches. We have developed signal processors that overcome many of the problems by maximizing the information available to the analyst. These non-paralyzable, fully digital signal processors have yielded improved resolution, line shape, tailing and pile-up recognition. The signal processors account for and register all events, sorting them into two spectra: one spectrum for the desirable or accepted events, and one spectrum for the rejected events. The information contained in the rejected spectrum is essential for maintaining control over the measurement and for making a proper accounting and allocation of the events. This has established the basis for the application of the fundamental parameter method. A fundamental parameter program was also developed. The primary X-ray line shape (Lorentzian) is convoluted with a system line shape (Gaussian) and corrected for sample material absorption, X-ray absorbers and detector efficiency. The peaks can also have lower- and upper-energy-side tailing, including long-range functions based on the physical interactions. The program also handles peak and continuum pile-up and can deal with layered samples of up to five layers. The application of a fundamental parameter method demands proper equipment characterization. We have therefore also developed an inverse fundamental parameter method software package for equipment characterization. The program calculates the excitation function at the sample position and the detector efficiency, supplying an internally consistent system.

  17. Content Integration: Creating a Scalable Common Platform for Information Resources

    Science.gov (United States)

    Berenstein, Max; Katz, Demian

    2012-01-01

    Academic, government, and corporate librarians organize and leverage internal resources and content through institutional repositories and library catalogs. Getting more value and usage from the content they license is a key goal. However, the ever-growing amount of content and shifting user demands for new materials or features has made the…

  18. An assessment of the quality and content of information on diverticulitis on the internet.

    Science.gov (United States)

    Connelly, Tara M; Khan, Mohammad Shoaib; Victory, Liana; Mehmood, Abeera; Cooke, Fiachra

    2018-05-21

    Although commonly the first port of call for medical information, the internet provides unregulated information of variable quality. We aimed to evaluate commonly accessed web-based patient information on diverticulitis using validated and novel scoring systems. The top internet search engines (Google/Bing/Yahoo) were queried using the keyword 'diverticulitis'. The first 20 websites from each were graded using the DISCERN and Journal of the American Medical Association (JAMA) benchmark criteria. A novel diverticulitis-specific score was devised and applied. Thirty-six unique websites were identified. The mean total DISCERN score for all websites was 39.92 ± 12.44 (range = 18-62). No website achieved the maximum DISCERN score of 75. The mean JAMA and diverticulitis scores were 2.5 ± 1.08 (maximum possible score = 4) and 11.08 ± 4.17 (19 points possible) respectively. Fourteen (35.9%) and 20 (51.2%) did not provide the date of last update and authorship respectively. Thirty-three (84.6%) mentioned surgery as a treatment option; however, the majority (69.7%) did not describe the surgery or the possibility of a stoma. All except two described disease symptoms. Only ten (25.64%) provided information on when to seek further medical advice or help. Web-based information on diverticulitis is of variable content and quality. The majority of top websites describe disease symptoms and aetiology; however, information to prompt seeking medical attention if required, descriptions of surgical procedures and the possibility of stoma creation are poorly described in the majority of websites. These findings should be highlighted to patients utilising the internet to obtain information on diverticulitis. Copyright © 2018 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  19. Contents operation center for 'mopera' information service; Mopera joho service muke contents un'ei center

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    'Mopera' is a mobile information service in which NTT Mobile Communications Network, Inc. offers information in various fields, such as business and hobbies, to users of the company's portable telephones or PHS handsets. For this information service, Toshiba Corp. started a contents operation center that handles everything from the preparation of contents to the management of the servers. Toshiba has worked to expand the contents since the service began in the fall of 1998 and currently operates more than ten kinds of contents, such as news, weather forecasts, and stock information, in addition to the mobile 'Ekimae-Tanken Club' (adventure club in front of a station). Toshiba has also built the system for stable operation, with features such as duplexed servers and 24-hour automatic surveillance, so as to continue providing highly reliable services. (translated by NEDO)

  20. Information Content in Radio Waves: Student Investigations in Radio Science

    Science.gov (United States)

    Jacobs, K.; Scaduto, T.

    2013-12-01

    We describe an inquiry-based instructional unit on information content in radio waves, created in the summer of 2013 as part of a MIT Haystack Observatory (Westford, MA) NSF Research Experiences for Teachers (RET) program. This topic is current and highly relevant, addressing science and technical aspects from radio astronomy, geodesy, and atmospheric research areas as well as Next Generation Science Standards (NGSS). Projects and activities range from simple classroom demonstrations and group investigations, to long term research projects incorporating data acquisition from both student-built instrumentation as well as online databases. Each of the core lessons is applied to one of the primary research centers at Haystack through an inquiry project that builds on previously developed units through the MIT Haystack RET program. In radio astronomy, students investigate the application of a simple and inexpensive software defined radio chip (RTL-SDR) for use in systems implementing a small and very small radio telescope (SRT and VSRT). Both of these systems allow students to explore fundamental principles of radio waves and interferometry as applied to radio astronomy. In ionospheric research, students track solar storms from the initial coronal mass ejection (using Solar Dynamics Observatory images) to the resulting variability in total electron density concentrations using data from the community standard Madrigal distributed database system maintained by MIT Haystack. Finally, students get to explore very long-baseline interferometry as it is used in geodetic studies by measuring crustal plate displacements over time. Alignment to NextGen standards is provided for each lesson and activity with emphasis on HS-PS4 'Waves and Their Applications in Technologies for Information Transfer'.

  1. Content of Bachelors' in Tourism Informative Training in Ukrainian and Polish Experience: Comparative Study

    Science.gov (United States)

    Zubekhina, Tetiana

    2015-01-01

    This article provides a comparative analysis of the content of Bachelors' in Tourism informative training in Ukrainian and Polish experience. The content of Bachelors' in Tourism informative training in Ukraine and Poland has been analyzed. The content of subjects, namely, "Information Technologies in Tourism" and "The Foundations…

  2. Entropy-based implied volatility and its information content

    NARCIS (Netherlands)

    X. Xiao (Xiao); C. Zhou (Chen)

    2016-01-01

    This paper investigates the maximum entropy approach to estimating implied volatility. The entropy approach also allows one to measure option-implied skewness and kurtosis nonparametrically, and to construct confidence intervals. Simulations show that the entropy approach outperforms
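    As a rough, hypothetical sketch of the generic machinery behind such an approach (not the paper's estimator, which works from option prices), the code below fits a maximum entropy density on a grid subject to a few moment constraints by minimising the convex dual problem; the grid and the target moments are made up for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def maxent_density(x, constraints, targets):
    """Maximum-entropy density on a grid x with moment constraints E[g_k(X)] = targets[k].

    The solution has the exponential form p(x) ~ exp(sum_k lambda_k g_k(x)); the
    multipliers are obtained by minimising the convex dual  log Z(lambda) - lambda.targets.
    """
    G = np.column_stack([g(x) for g in constraints])   # (n_grid, n_constraints)
    dx = x[1] - x[0]
    m = np.asarray(targets, dtype=float)

    def dual_and_grad(lam):
        s = G @ lam
        lse = logsumexp(s)
        w = np.exp(s - lse)                            # discrete probabilities on the grid
        return lse + np.log(dx) - lam @ m, G.T @ w - m

    res = minimize(dual_and_grad, np.zeros(G.shape[1]), jac=True, method="L-BFGS-B")
    s = G @ res.x
    return np.exp(s - logsumexp(s)) / dx               # density values on the grid

# Hypothetical example: match the first three raw moments of a bounded log-return grid.
x = np.linspace(-1.0, 1.0, 2001)
p = maxent_density(x, [lambda t: t, lambda t: t**2, lambda t: t**3],
                   targets=[0.0, 0.04, -0.002])        # made-up "risk-neutral" moments
dx = x[1] - x[0]
print((p * dx).sum(), (x * p * dx).sum(), (x**2 * p * dx).sum())
```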

  3. Method for Measuring the Information Content of Terrain from Digital Elevation Models

    Directory of Open Access Journals (Sweden)

    Lujin Hu

    2015-10-01

    Full Text Available As digital terrain models are indispensable for visualizing and modeling geographic processes, terrain information content is useful for terrain generalization and representation. For terrain generalization, if the terrain information is considered, the generalized terrain may be of higher fidelity. In other words, the richer the terrain information at the terrain surface, the smaller the degree of terrain simplification. Terrain information content is also important for evaluating the quality of rendered terrain, e.g., the rendered web terrain tile service in Google Maps (Google Inc., Mountain View, CA, USA). However, a unified definition and measures for terrain information content have not been established. Therefore, in this paper, a definition and measures for terrain information content from a Digital Elevation Model (DEM, i.e., a digital model or 3D representation of a terrain's surface data) are proposed, based on the theory of map information content, remote sensing image information content and other geospatial information content. Information entropy was taken as the measuring method for the terrain information content. Two experiments were carried out to verify the measurement methods of the terrain information content. One is the analysis of terrain information content in different geomorphic types, and the results showed that the more complex the geomorphic type, the richer the terrain information content. The other is the analysis of terrain information content at different resolutions, and the results showed that the finer the resolution, the richer the terrain information. Both experiments verified the reliability of the measurements of the terrain information content proposed in this paper.
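    As a minimal illustration of applying information entropy to elevation data (not the paper's specific measures), the sketch below computes the Shannon entropy of a DEM's elevation histogram with a fixed vertical bin size and compares a gentle synthetic surface with a rugged one; the surfaces and parameters are hypothetical.

```python
import numpy as np

def elevation_entropy(dem, bin_size=1.0):
    """Shannon entropy (bits) of a DEM's elevation histogram, using a fixed vertical bin size."""
    z = dem[np.isfinite(dem)]
    edges = np.arange(z.min(), z.max() + bin_size, bin_size)
    counts, _ = np.histogram(z, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Hypothetical synthetic surfaces: a gentle plain versus rugged terrain (elevations in metres).
rng = np.random.default_rng(2)
yy, xx = np.mgrid[0:512, 0:512]
plain  = 2.0 * np.sin(xx / 200.0) + 0.2 * rng.normal(size=xx.shape)
rugged = 80.0 * np.sin(xx / 40.0) * np.cos(yy / 30.0) + 5.0 * rng.normal(size=xx.shape)

print(elevation_entropy(plain), elevation_entropy(rugged))
# The rugged surface spreads its elevations over many more 1 m bins, so its histogram
# entropy is higher: the "more complex landform, richer information content" pattern.
```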

  4. Head and neck cancer information on the internet: type, accuracy and content.

    LENUS (Irish Health Repository)

    Ni Riordain, Richeal

    2009-08-01

    This study aimed to determine the type, accuracy and content of information available on the internet regarding head and neck cancer. The search engine Google was used to generate a list of the top 100 websites about head and neck cancer. The websites were evaluated using the DISCERN instrument and the JAMA benchmarks, and whether the site displayed the Health on the Net seal was also recorded. The search yielded 1,650,000 sites on the Google website. Of the top 100 sites, only 33 were suitable for analysis, after excluding duplicate links, non-functioning links and irrelevant websites. 45% achieved all four JAMA benchmarks and 18% achieved only 1 benchmark. No website received the maximum overall score, and four websites received the lowest overall score on the DISCERN instrument. The question with the poorest response score was 'Does it describe how the treatment choices affect overall quality of life?' 39% of the websites displayed the Health on the Net (HON) seal. A wide variety of types of information is available on the internet regarding head and neck cancer, with variable accuracy levels based on both the Journal of the American Medical Association (JAMA) benchmarks and DISCERN. The onus lies with the practitioner to guide the patient regarding the scientific reliability of information and to direct the patient in filtering the information sourced. The inclusion of quality-of-life-related information is currently lacking and should be addressed to ensure that patients gain a more comprehensive understanding of treatment options.

  5. Addictions Content Published in Counseling Journals: A 10-Year Content Analysis to Inform Research and Practice

    Science.gov (United States)

    Wahesh, Edward; Likis-Werle, S. Elizabeth; Moro, Regina R.

    2017-01-01

    This content analysis includes 210 articles that focused on addictions topics published between January 2005 and December 2014 in the journals of the National Board for Certified Counselors (NBCC), Chi Sigma Iota (CSI), the American Counseling Association (ACA), and ACA member divisions. Results include the types of addictions content and…

  6. Cartography and Geographic Information Science in Current Contents

    Directory of Open Access Journals (Sweden)

    Nedjeljko Frančula

    2009-12-01

    Full Text Available The Cartography and Geographic Information Science (CaGIS) journal was published as The American Cartographer from 1974 to 1989, after that as Cartography and Geographic Information Systems, and has since been published under its current name. It is published by the Cartography and Geographic Information Society, a member of the American Congress on Surveying and Mapping.

  7. Axiomatic Evaluation Method and Content Structure for Information Appliances

    Science.gov (United States)

    Guo, Yinni

    2010-01-01

    Extensive studies have been conducted to determine how best to present information in order to enhance usability, but not what information is needed to be presented for effective decision making. Hence, this dissertation addresses the factor structure of the nature of information needed for presentation and proposes a more effective method than…

  8. On the low SNR capacity of maximum ratio combining over rician fading channels with full channel state information

    KAUST Repository

    Benkhelifa, Fatma

    2013-04-01

    In this letter, we study the ergodic capacity of a maximum ratio combining (MRC) Rician fading channel with full channel state information (CSI) at the transmitter and at the receiver. We focus on the low Signal-to-Noise Ratio (SNR) regime and we show that the capacity scales as (L Ω / (K + L)) SNR log(1/SNR), where Ω is the expected channel gain per branch, K is the Rician fading factor, and L is the number of diversity branches. We show that one-bit CSI feedback at the transmitter is enough to achieve this capacity using an on-off power control scheme. Our framework can be seen as a generalization of recently established results regarding the fading-channels capacity characterization in the low-SNR regime. © 2012 IEEE.

  9. On the low SNR capacity of maximum ratio combining over rician fading channels with full channel state information

    KAUST Repository

    Benkhelifa, Fatma; Rezki, Zouheir; Alouini, Mohamed-Slim

    2013-01-01

    In this letter, we study the ergodic capacity of a maximum ratio combining (MRC) Rician fading channel with full channel state information (CSI) at the transmitter and at the receiver. We focus on the low Signal-to-Noise Ratio (SNR) regime and we show that the capacity scales as (L Ω / (K + L)) SNR log(1/SNR), where Ω is the expected channel gain per branch, K is the Rician fading factor, and L is the number of diversity branches. We show that one-bit CSI feedback at the transmitter is enough to achieve this capacity using an on-off power control scheme. Our framework can be seen as a generalization of recently established results regarding the fading-channels capacity characterization in the low-SNR regime. © 2012 IEEE.

  10. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    Science.gov (United States)

    Almog, Assaf; Garlaschelli, Diego

    2014-09-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information.
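    A rough, hypothetical sketch of the kind of binary analysis described above (not the authors' ensemble construction): take the sign of simulated returns, fit the simplest maximum-entropy model that matches each stock's mean sign (independent biased coins), and compare the observed frequency of strongly coherent days with the frequency that the independent model predicts. The factor model, thresholds and sample sizes are made up for illustration.

```python
import numpy as np

# Hypothetical returns: N stocks, T days, with a common "market" factor plus idiosyncratic noise.
rng = np.random.default_rng(3)
N, T = 50, 2000
market = rng.normal(size=T)
returns = 0.7 * market + rng.normal(size=(N, T))      # shape (N, T)

# Binary projection: s_it = +1 if stock i goes up on day t, else -1.
S = np.where(returns >= 0, 1, -1)

# Simplest maximum-entropy model matching each stock's mean sign: independent biased coins
# with P(s_i = +1) = p_i.  (Deliberately minimal; richer ensembles also match correlations.)
p_up = (S == 1).mean(axis=1)

def coherent_fraction(up_counts, n, thresh=0.8):
    """Fraction of days on which at least `thresh` of the stocks move in the same direction."""
    return float(np.mean((up_counts >= thresh * n) | (up_counts <= (1 - thresh) * n)))

observed = coherent_fraction((S == 1).sum(axis=0), N)

# Monte Carlo under the independent maximum-entropy model.
sims = rng.random((T, N)) < p_up                      # each column i simulated with probability p_i
predicted = coherent_fraction(sims.sum(axis=1), N)

print(observed, predicted)
# The observed coherence far exceeds the independent-model prediction: the binary
# signature alone already reveals the collective "market mode" behaviour.
```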

  11. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    International Nuclear Information System (INIS)

    Almog, Assaf; Garlaschelli, Diego

    2014-01-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information. (paper)

  12. Information-theoretical analysis of private content identification

    NARCIS (Netherlands)

    Voloshynovskiy, S.; Koval, O.; Beekhof, F.; Farhadzadeh, F.; Holotyak, T.

    2010-01-01

    In recent years, content identification based on digital fingerprinting attracts a lot of attention in different emerging applications. At the same time, the theoretical analysis of digital fingerprinting systems for finite length case remains an open issue. Additionally, privacy leaks caused by

  13. Analyses of stomach contents provide information on prey of ...

    African Journals Online (AJOL)

    spamer

    In this paper, information is presented on the cephalopods eaten by four species of shark.

  14. Linked Data Methodologies for Managing Information about Television Content

    Directory of Open Access Journals (Sweden)

    José Luis Redondo-García

    2012-09-01

    Full Text Available OntoTV is a television information management system designed to improve the quality and quantity of the information available on current television platforms. In order to achieve this objective, OntoTV (1) collects the information offered by the broadcasters, (2) integrates it into an ontology-based data structure, (3) extracts extra data from alternative television sources, and (4) makes it possible for the user to perform queries over the stored information. This document shows how Linked Data methodologies have been applied in the OntoTV system, and the improvements in the data consumption and publication processes that have been obtained as a result. On the one hand, the possibility of accessing information available in the Web of Data has made it possible to offer more complete descriptions of the programs, as well as more detailed guides than those obtained using classic collection methods. On the other hand, as the information on television programs and channels is published according to the Linked Data philosophy, it becomes available not only to OntoTV clients, but also to other agents able to access Linked Data resources, who could offer the viewer fresher and more innovative features.

  15. Airborne Hyperspectral Evaluation of Maximum Gross Photosynthesis, Gravimetric Water Content, and CO2 Uptake Efficiency of the Mer Bleue Ombrotrophic Peatland

    Directory of Open Access Journals (Sweden)

    J. Pablo Arroyo-Mora

    2018-04-01

    Full Text Available Peatlands cover a large area in Canada and globally (12% and 3% of the landmass, respectively). These ecosystems play an important role in climate regulation through the sequestration of carbon dioxide from, and the release of methane to, the atmosphere. Monitoring approaches, required to understand the response of peatlands to climate change at large spatial scales, are challenged by their unique vegetation characteristics, intrinsic hydrological complexity, and rapid changes over short periods of time (e.g., seasonality). In this study, we demonstrate the use of multitemporal, high spatial resolution (1 m2) hyperspectral airborne imagery (Compact Airborne Spectrographic Imager (CASI) and Shortwave Airborne Spectrographic Imager (SASI) sensors) for assessing maximum instantaneous gross photosynthesis (PGmax) in hummocks, and gravimetric water content (GWC) and carbon uptake efficiency in hollows, at the Mer Bleue ombrotrophic bog. We applied empirical models (i.e., in situ data and spectral indices) and we derived spatial and temporal trends for the aforementioned variables. Our findings revealed the distribution of hummocks (51.2%), hollows (12.7%), and tree cover (33.6%), which is the first high spatial resolution map of this nature at Mer Bleue. For hummocks, we found growing season PGmax values between 8 μmol m−2 s−1 and 12 μmol m−2 s−1 were predominant (86.3% of the total area). For hollows, our results revealed, for the first time, the spatial heterogeneity and seasonal trends for gravimetric water content and carbon uptake efficiency for the whole bog.

  16. Information content of transient synchrotron radiation in tokamak plasmas

    International Nuclear Information System (INIS)

    Fisch, N.J.; Kritz, A.H.

    1989-04-01

    A brief, deliberate perturbation of hot tokamak electrons produces a transient synchrotron radiation signal in frequency-time space, with impressive informative potential on plasma parameters; for example, the dc toroidal electric field, not available by other means, may be measurable. Very fast algorithms have been developed, making tractable a statistical analysis that compares essentially all parameter sets that might possibly explain the transient signal. By simulating data numerically, we can estimate the informative worth of data prior to obtaining it. 20 refs., 2 figs.

  17. Information content when mutual funds deviate from benchmarks

    NARCIS (Netherlands)

    H. Jiang (Hao); M.J.C.M. Verbeek (Marno); Y. Wang (Yu)

    2014-01-01

    The consensus wisdom of active mutual fund managers, as reflected in their average over- and underweighting decisions, contains valuable information about future stock returns. Analyzing a comprehensive sample of active U.S. equity funds from 1984 to 2008, we find that

  18. 10 CFR 52.47 - Contents of applications; technical information.

    Science.gov (United States)

    2010-01-01

    ... information sufficiently detailed to permit the preparation of acceptance and inspection requirements by the... combustible gas control as required by 10 CFR 50.44; (13) The list of electric equipment important to safety... accidents, e.g., challenges to containment integrity caused by core-concrete interaction, steam explosion...

  19. Content

    DEFF Research Database (Denmark)

    Keiding, Tina Bering

    Aim, content and methods are fundamental categories of both theoretical and practical general didactics. A quick glance in recent pedagogical literature on higher education, however, reveals a strong preoccupation with methods, i.e. how teaching should be organized socially (Biggs & Tang, 2007 ... secondary levels. In subject matter didactics, the question of content is more developed, but it is still mostly confined to teaching on lower levels. As for higher education didactics, discussions on selection of content are almost non-existent on the programmatic level. Nevertheless, teachers are forced ... curriculum, in higher education, and to generate analytical categories and criteria for selection of content, which can be used for systematic didactical reflection. The larger project also concerns reflection on and clarification of the concept of content, including the relation between content at the level ...

  20. Informal Content and Student Note-Taking in Advanced Mathematics Classes

    Science.gov (United States)

    Fukawa-Connelly, Timothy; Weber, Keith; Mejía-Ramos, Juan Pablo

    2017-01-01

    This study investigates 3 hypotheses about proof-based mathematics instruction: (a) that lectures include informal content (ways of thinking and reasoning about advanced mathematics that are not captured by formal symbolic statements), (b) that informal content is usually presented orally but not written on the board, and (c) that students do not…

  1. Content Is King: Databases Preserve the Collective Information of Science.

    Science.gov (United States)

    Yates, John R

    2018-04-01

    Databases store sequence information experimentally gathered to create resources that further science. In the last 20 years databases have become critical components of fields like proteomics where they provide the basis for large-scale and high-throughput proteomic informatics. Amos Bairoch, winner of the Association of Biomolecular Resource Facilities Frederick Sanger Award, has created some of the important databases proteomic research depends upon for accurate interpretation of data.

  2. Distributed Repositories for Educational Content - Part 1: Information Management for Educational Content

    Directory of Open Access Journals (Sweden)

    Bernd J. Krämer

    2011-07-01

    Full Text Available As education providers increasingly integrate digital learning media into their education processes, the need for the systematic management of learning materials and learning arrangements becomes clearer. Digital repositories, often called Learning Object Repositories (LORs), promise to provide an answer to this challenge. This article is composed of two parts. In this part, we derive technological and pedagogical requirements for LORs from a concretization of information quality criteria for e-learning technology. We review the evolution of learning object repositories and discuss their core features in the context of pedagogical requirements, information quality demands, and e-learning technology standards. We conclude with an outlook in Part 2, which presents concrete technical solutions, in particular networked repository architectures.

  3. Information matching the content of visual working memory is prioritized for conscious access.

    Science.gov (United States)

    Gayet, Surya; Paffen, Chris L E; Van der Stigchel, Stefan

    2013-12-01

    Visual working memory (VWM) is used to retain relevant information for imminent goal-directed behavior. In the experiments reported here, we found that VWM helps to prioritize relevant information that is not yet available for conscious experience. In five experiments, we demonstrated that information matching VWM content reaches visual awareness faster than does information not matching VWM content. Our findings suggest a functional link between VWM and visual awareness: The content of VWM is recruited to funnel down the vast amount of sensory input to that which is relevant for subsequent behavior and therefore requires conscious access.

  4. Explaining the Variation in Adoption Rates of the Information Content of Environmental Disclosure

    DEFF Research Database (Denmark)

    Fallan, Even

    2015-01-01

    of content, and whether innovation adoption theory might represent important factors of this decision-making process. Design/methodology/approach: - Actual adoption rates of 13 information content categories are computed using content analysis of annual reports for 62 listed companies. Each content category......Purpose: - Corporate management decides what types of environmental information content to disclose/adopt. It is explored whether internal context - decision-makers’ perception of characteristics of the information content - might predict the variation in adoption rates of different types...... is seen as an innovation the company decides to adopt or not. Interviews with management in several companies illustrate the decision process of disclosure, and help predict adoption rates. Predicted and actual adoption rates are compared. Findings: - Adoption rates vary considerably among the 13 types...

  5. Information content in reflected signals during GPS Radio Occultation observations

    Science.gov (United States)

    Aparicio, Josep M.; Cardellach, Estel; Rodríguez, Hilda

    2018-04-01

    The possibility of extracting useful information about the state of the lower troposphere from the surface reflections that are often detected during GPS radio occultations (GPSRO) is explored. The clarity of the reflection is quantified, and can be related to properties of the surface and the low troposphere. The reflected signal is often clear enough to show good phase coherence, and can be tracked and processed as an extension of direct non-reflected GPSRO atmospheric profiles. A profile of bending angle vs. impact parameter can be obtained for these reflected signals, characterized by impact parameters that are below the apparent horizon, and that is a continuation at low altitude of the standard non-reflected bending angle profile. If there were no reflection, these would correspond to tangent altitudes below the local surface, and in particular below the local mean sea level. A forward operator is presented, for the evaluation of the bending angle of reflected GPSRO signals, given atmospheric properties as described by a numerical weather prediction system. The operator is an extension, at lower impact parameters, of standard bending angle operators, and reproduces both the direct and reflected sections of the measured profile. It can be applied to the assimilation of the reflected section of the profile as supplementary data to the direct section. Although the principle is also applicable over land, this paper is focused on ocean cases, where the topographic height of the reflecting surface, the sea level, is better known a priori.
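
    For context, the forward operator mentioned above builds on the standard bending-angle integral for direct GPSRO signals; a hedged sketch of that relation is given below, with the reflected-signal case in the paper handled by extending it to impact parameters below the apparent horizon. Here a is the impact parameter, n the refractive index and x = n(r) r the refractive radius; the notation is assumed, not taken from the paper.

```latex
% Standard GPSRO bending-angle (Abel-type) forward relation for the direct signal
\alpha(a) \;=\; -\,2a \int_{a}^{\infty}
\frac{\mathrm{d}\ln n/\mathrm{d}x}{\sqrt{x^{2}-a^{2}}}\;\mathrm{d}x,
\qquad x = n(r)\, r .
```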

  6. Designing experiments for maximum information from cyclic oxidation tests and their statistical analysis using half Normal plots

    International Nuclear Information System (INIS)

    Coleman, S.Y.; Nicholls, J.R.

    2006-01-01

    Cyclic oxidation testing at elevated temperatures requires careful experimental design and the adoption of standard procedures to ensure reliable data, which is a major aim of the 'COTEST' research programme. Further, because such tests are time consuming and costly in terms of the human effort needed to take measurements over a large number of cycles, it is important to gain maximum information from a minimum number of tests (trials). This search for standardisation of cyclic oxidation conditions led to a series of tests to determine the relative effects of cyclic parameters on the oxidation process. Following a review of the available literature, databases and the experience of partners in the COTEST project, the most influential parameters - upper dwell temperature (oxidation temperature) and time (hot time), lower dwell time (cold time) and environment - were investigated in the partners' laboratories. It was decided to test upper dwell temperature at three levels, at and equidistant from a reference temperature; upper dwell time at a reference, a higher and a lower time; lower dwell time at a reference and a higher time; and wet and dry environments. Thus an experiment consisting of nine trials was designed according to statistical criteria, and the results were analysed statistically to test the main linear and quadratic effects of upper dwell temperature and hot time and the main effects of lower dwell time (cold time) and environment. The nine trials are a quarter fraction of the 36 possible combinations of parameter levels that could have been studied. The results have been analysed by half-normal plots, as there are only 2 degrees of freedom for the experimental error variance, which is rather low for a standard analysis of variance. Half-normal plots give a visual indication of which factors are statistically significant. In this experiment each trial has 3 replications, and the data are analysed in terms of mean mass change, oxidation kinetics
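
    As a rough illustration of the half-normal plot used in this analysis, the sketch below plots sorted absolute effect estimates against half-normal quantiles; points lying well above the near-origin trend are flagged as statistically significant. The effect values and factor names are invented, not the COTEST results.

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

# Hypothetical effect estimates (contrasts) from a small designed experiment;
# in the COTEST analysis these would be estimated from the nine-trial design.
effects = {"T_upper": 4.2, "T_upper^2": 0.3, "t_hot": 2.9,
           "t_hot^2": 0.2, "t_cold": 0.4, "environment": 1.1}

abs_effects = np.sort(np.abs(np.array(list(effects.values()))))
m = abs_effects.size
probs = (np.arange(1, m + 1) - 0.5) / m        # plotting positions
quantiles = stats.halfnorm.ppf(probs)          # quantiles of |Z|, Z ~ N(0, 1)

plt.plot(quantiles, abs_effects, "o")
plt.xlabel("half-normal quantile")
plt.ylabel("|effect estimate|")
plt.title("Points far above the near-origin trend are judged significant")
plt.show()
```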

  7. PROCESSING THE INFORMATION CONTENT ON THE BASIS OF FUZZY NEURAL MODEL OF DECISION MAKING

    Directory of Open Access Journals (Sweden)

    Nina V. Komleva

    2013-01-01

    Full Text Available The article addresses the mathematical modeling of decision making in information content processing based on the TSK fuzzy neural network. An integral rating assessment of the content, which is needed to decide on its further use, is made dependent on varying characteristics. A mechanism for building an individual trajectory and forming individual competences is provided to support intelligent content search.
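
    For readers unfamiliar with the TSK model, the sketch below shows a minimal first-order Takagi-Sugeno-Kang inference step: Gaussian memberships give each rule a firing strength, and the output is the firing-strength-weighted average of linear consequents. The two-input setup and all rule parameters are illustrative assumptions, not the network described in the article.

```python
import numpy as np

# Minimal first-order TSK fuzzy inference with Gaussian membership functions.
# Inputs could stand for content characteristics; parameters are made up.

def gaussian_mf(x, centres, widths):
    return np.exp(-0.5 * ((x - centres) / widths) ** 2)

rules = [
    # (membership centres, membership widths, consequent coefficients [b0, b1, b2])
    (np.array([0.2, 0.3]), np.array([0.2, 0.2]), np.array([0.1, 0.5, 0.4])),
    (np.array([0.8, 0.7]), np.array([0.3, 0.3]), np.array([0.6, 0.2, 0.9])),
]

def tsk_output(x):
    firing, outputs = [], []
    for centres, widths, coeffs in rules:
        w = np.prod(gaussian_mf(x, centres, widths))   # rule firing strength
        y = coeffs[0] + coeffs[1:] @ x                 # linear consequent
        firing.append(w)
        outputs.append(y)
    firing = np.array(firing)
    return float(np.dot(firing, outputs) / firing.sum())  # weighted average

print(tsk_output(np.array([0.6, 0.5])))   # integral rating of a content item
```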

  8. The Readability of Information Literacy Content on Academic Library Web Sites

    Science.gov (United States)

    Lim, Adriene

    2010-01-01

    This article reports on a study addressing the readability of content on academic libraries' Web sites, specifically content intended to improve users' information literacy skills. Results call for recognition of readability as an evaluative component of text in order to better meet the needs of diverse user populations. (Contains 8 tables.)

  9. Readability and Content Assessment of Informed Consent Forms for Medical Procedures in Croatia.

    Science.gov (United States)

    Vučemilo, Luka; Borovečki, Ana

    2015-01-01

    A high-quality informed consent form is essential for adequate information transfer between physicians and patients. The current status of medical procedure consent forms in clinical practice in Croatia, specifically in terms of readability and content, is unknown. The aim of this study was to assess the readability and the content of informed consent forms for diagnostic and therapeutic procedures used with patients in Croatia. 52 informed consent forms from six Croatian hospitals at the secondary and tertiary health-care level were tested for reading difficulty using the Simple Measure of Gobbledygook (SMOG) formula adjusted for the Croatian language and for qualitative analysis of the content. The average SMOG grade of the analyzed informed consent forms was 13.25 (SD 1.59, range 10-19). Content analysis revealed that the informed consent forms included a description of risks in 96% of cases, benefits in 81%, a description of procedures in 78%, alternatives in 52%, risks and benefits of alternatives in 17%, and risks and benefits of not receiving treatment or undergoing procedures in 13%. The readability of the evaluated informed consent forms is not appropriate for the general population in Croatia. In a high proportion of cases, the forms failed to include descriptions of alternatives, risks and benefits of alternatives, and risks and benefits of not receiving treatment or undergoing procedures. Data obtained from this research could help in the development and improvement of informed consent forms in Croatia, especially now that Croatian hospitals are undergoing the process of accreditation.
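
    For reference, a hedged sketch of the original English SMOG grade formula is given below; the study used a variant adjusted for Croatian, whose constants differ, so this only shows the shape of the calculation.

```python
import math

# Original English SMOG grade formula (McLaughlin): estimated reading grade
# level from the count of words with three or more syllables.
def smog_grade(polysyllable_count, sentence_count):
    return 1.0430 * math.sqrt(polysyllable_count * (30.0 / sentence_count)) + 3.1291

# e.g. a consent form with 42 polysyllabic words across 25 sentences
print(round(smog_grade(42, 25), 1))
```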

  10. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    Science.gov (United States)

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. 75 FR 26283 - Notice of Proposed Information Collection: IMLS Digital Collections and Content: Opening History...

    Science.gov (United States)

    2010-05-11

    ... provided in the desired format, reporting burden (time and financial resources) is minimized, collection... information to assess the usefulness to reference-service providers in museums and libraries of the IMLS Digital Collections and Content project's Opening History resource. A copy of the proposed information...

  12. Educational Information Quantization for Improving Content Quality in Learning Management Systems

    Science.gov (United States)

    Rybanov, Alexander Aleksandrovich

    2014-01-01

    The article offers the educational information quantization method for improving content quality in Learning Management Systems. The paper considers questions concerning analysis of quality of quantized presentation of educational information, based on quantitative text parameters: average frequencies of parts of speech, used in the text; formal…

  13. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    Title 10 - Energy; Nuclear Regulatory Commission (Continued); Licenses. § 52.157 Contents of applications; technical information in final safety analysis report. The application must contain a final safety analysis...

  14. 10 CFR 52.79 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    Title 10 - Energy; Nuclear Regulatory Commission (Continued); Licenses. § 52.79 Contents of applications; technical information in final safety analysis report. (a) The application must contain a final safety...

  15. LANDSAT-4 MSS and Thematic Mapper data quality and information content analysis

    Science.gov (United States)

    Anuta, P.; Bartolucci, L.; Dean, E.; Lozano, F.; Malaret, E.; Mcgillem, C. D.; Valdes, J.; Valenzuela, C.

    1984-01-01

    LANDSAT-4 thematic mapper (TM) and multispectral scanner (MSS) data were analyzed to obtain information on data quality and information content. Geometric evaluations were performed to test band-to-band registration accuracy. Thematic mapper overall system resolution was evaluated using scene objects which demonstrated sharp high contrast edge responses. Radiometric evaluation included detector relative calibration, effects of resampling, and coherent noise effects. Information content evaluation was carried out using clustering, principal components, transformed divergence separability measure, and supervised classifiers on test data. A detailed spectral class analysis (multispectral classification) was carried out to compare the information content of the MSS and TM for a large number of scene classes. A temperature-mapping experiment was carried out for a cooling pond to test the quality of thermal-band calibration. Overall TM data quality is very good. The MSS data are noisier than previous LANDSAT results.
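
    One of the separability measures named above, transformed divergence, can be sketched as follows; the class statistics are invented two-band examples, and values near 2000 indicate well-separated spectral classes.

```python
import numpy as np

# Transformed divergence (TD) between two spectral classes described by
# mean vectors and covariance matrices, as used in class-separability studies.

def transformed_divergence(m1, c1, m2, c2):
    c1i, c2i = np.linalg.inv(c1), np.linalg.inv(c2)
    dm = (m1 - m2).reshape(-1, 1)
    d = 0.5 * np.trace((c1 - c2) @ (c2i - c1i)) \
        + 0.5 * np.trace((c1i + c2i) @ (dm @ dm.T))
    return 2000.0 * (1.0 - np.exp(-d / 8.0))   # saturates at 2000 (fully separable)

# invented two-band sample statistics for two scene classes
m_water, c_water = np.array([20.0, 15.0]), np.array([[4.0, 1.0], [1.0, 3.0]])
m_crop,  c_crop  = np.array([60.0, 80.0]), np.array([[9.0, 2.0], [2.0, 8.0]])
print(round(transformed_divergence(m_water, c_water, m_crop, c_crop), 1))
```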

  16. Informal reasoning regarding socioscientific issues: The influence of morality and content knowledge

    Science.gov (United States)

    Sadler, Troy Dow

    This study focused on informal reasoning regarding socioscientific issues. It explored how morality and content knowledge influenced the negotiation and resolution of contentious and complex scenarios based on genetic engineering. Two hundred and sixty-nine undergraduate students completed a quantitative test of genetics concepts. A sub-set of the students (n = 30) who completed this instrument and represented divergent levels of content knowledge participated in two individual interviews, during which they discussed their ideas, reactions, and solutions to three gene therapy scenarios and three cloning scenarios. A mixed-methods approach was used to examine patterns of informal reasoning and the influence of morality, the effects of content knowledge on the use of informal reasoning patterns, and the effects of content knowledge on the quality of informal reasoning. Students demonstrated evidence of rationalistic, emotive, and intuitive forms of informal reasoning. Rationalistic informal reasoning described reason-based considerations; emotive informal reasoning described care-based considerations; and intuitive reasoning described considerations based on immediate reactions to the context of a scenario. Participants frequently relied on combinations of these reasoning patterns as they worked to resolve individual socioscientific scenarios. Most of the participants appreciated at least some of the moral implications of their decisions, and these considerations were typically interwoven within an overall pattern of informal reasoning. Although differences in content knowledge were not found to be related to modes of informal reasoning (rationalistic, emotive, and intuitive), data did indicate that differences in content knowledge were related to variations in informal reasoning quality. Participants with more advanced understandings of genetics demonstrated fewer instances of reasoning flaws, as defined by a priori criteria (intra-scenario coherence, inter

  17. Assessment of Quality and Content of Online Information About Hip Arthroscopy.

    Science.gov (United States)

    Ellsworth, Bridget; Patel, Hiren; Kamath, Atul F

    2016-10-01

    The purpose of this study was to assess the quality of information available to patients on the Internet when using popular search engines to search the term "hip arthroscopy." We analyzed the quality and content of information about hip arthroscopy (HA) on the first 50 websites returned by the search engines Google and Bing for the search term "hip arthroscopy." The sites were categorized by type, and quality and content were measured using the DISCERN score, along with an HA-specific content score. The HA-specific content score was used to assess each website for the presence or absence of 19 topics about HA determined to be important for a patient seeking information about the procedure. The Health on the Net Code (HONcode) status of each website was also noted. The mean DISCERN score for all websites analyzed was 39.5, considered "poor," while only 44.6% of sites were considered "fair" or "good." Governmental and nonprofit organization (NPO) websites had the highest average DISCERN score. The mean HA-specific content score was 8.6 (range, 2 to 16). The commercial website category had the highest average HA-specific content score, followed by the governmental and NPO category. Sites that bore the HONcode certification obtained significantly higher DISCERN scores than those without the certification (P = .0032) but did not obtain significantly higher HA-specific content scores. "Hip arthroscopy" is a fairly general term, and there is significant variability in the quality of HA information available online. The HONcode is useful to identify quality patient information websites; however, it is not commonly used in HA-specific websites and does not encompass all quality websites about HA. This study increases awareness of the quality of information on HA available online. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  18. The Information Content of Financial and Economic Variables: Empirical Tests of Information Variables in Japan

    OpenAIRE

    Kengo Kato

    1991-01-01

    The main topic of this paper is "information variables" (or "indicators") of monetary policy, which serve as criteria for setting the direction of monetary policy. After briefly surveying the notion of information variables and the candidates proposed for them, mainly in studies from the United States, empirical tests using Japanese data are conducted. Some information variables appear to be useful, but the results are mixed overall.

  19. System renewal of objective contents on the basis of new information technologies in continuing education in the field of information security

    Directory of Open Access Journals (Sweden)

    Михаил Иванович Бочаров

    2010-06-01

    Full Text Available The article considers the optimization of educational content and the processes of its formation and renewal in dynamically developing subject areas. As an example, the training process in information security is investigated within a system of continuing education, following the proposed models of the knowledge life cycle.

  20. Does stock price synchronicity effect information content of reported earnings? Evidence from the MENA region

    Directory of Open Access Journals (Sweden)

    Omar Farooq

    2016-08-01

    Full Text Available This paper documents the effect of stock price synchronicity on the value relevance of reported earnings in the MENA region during the period between 2009 and 2013. Our results show that the information content of reported earnings increases as stock price synchronicity increases. We document a higher impact of earnings on returns for firms with higher stock price synchronicity. We argue that firms with high synchronicity have a better information environment and, as a result, disclose information of higher quality. We also show that information conveyed through stock price synchronicity is more important than information conveyed through traditional governance mechanisms.

  1. Understanding Information Sharing Among Scientists Through a Professional Online Community: Analyses on Interaction Patterns and Contents

    Directory of Open Access Journals (Sweden)

    Shin, Eun-Ja

    2017-12-01

    Full Text Available Even though many professional organizations increasingly use Q&A sites in their online communities for information sharing, there are few studies that examine what actually goes on in the Q&A activities of professional online communities (POC). This study aims to examine the interaction patterns and contents posted on the Q&A site of a POC, KOSEN, a science and technology online community in South Korea, focusing on how actively scientific information and knowledge are shared. The interaction patterns among the participants were identified through social network analysis (SNA) and the contents of the Q&As were examined by content analysis. The results show that the overall network indicated a moderate level of participation and connection, and answerers in particular tended to be active. Interaction patterns also differ depending on academic field. Relatively few participants were posting leaders who seemed to steer the overall interactions. Furthermore, content related to experimental manipulation and explanation, for which there is urgent need, tends to be posted more frequently and in greater volume. Combining SNA and content analysis, this study demonstrates how actively information and knowledge are shared and what types of content are exchanged. The findings have practical implications for POC managers and practitioners.
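
    A minimal sketch of the SNA step is given below: a directed answerer-to-asker network is built from Q&A posts, and simple density and degree-centrality measures are computed with networkx. The edge list is invented; the KOSEN data are not reproduced here.

```python
import networkx as nx

# Build a directed "answerer -> asker" network from Q&A interactions
# and compute basic participation measures.
qa_edges = [("userA", "userB"), ("userA", "userC"),
            ("userD", "userB"), ("userC", "userA")]

g = nx.DiGraph()
g.add_edges_from(qa_edges)

print("density:", nx.density(g))
print("in-degree centrality (who gets answered):", nx.in_degree_centrality(g))
print("out-degree centrality (active answerers):", nx.out_degree_centrality(g))
```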

  2. Security of Heterogeneous Content in Cloud Based Library Information Systems Using an Ontology Based Approach

    Directory of Open Access Journals (Sweden)

    Mihai DOINEA

    2014-01-01

    Full Text Available As in any domain that involves the use of software, library information systems take advantage of cloud computing. The paper highlights the main aspects of cloud based systems, describing some public solutions provided by the most important players on the market. Topics related to content security in cloud based services are tackled in order to emphasize the requirements that must be met by these types of systems. A cloud based implementation of a Library Information System is presented, and some adjacent tools that are used together with it to provide digital content and metadata links are described. In a cloud based Library Information System, security is approached by means of ontologies. Aspects such as content security in terms of digital rights are presented and a methodology for security optimization is proposed.

  3. Readability and Content Assessment of Informed Consent Forms for Medical Procedures in Croatia

    Science.gov (United States)

    Vučemilo, Luka; Borovečki, Ana

    2015-01-01

    Background A high-quality informed consent form is essential for adequate information transfer between physicians and patients. The current status of medical procedure consent forms in clinical practice in Croatia, specifically in terms of readability and content, is unknown. The aim of this study was to assess the readability and the content of informed consent forms for diagnostic and therapeutic procedures used with patients in Croatia. Methods 52 informed consent forms from six Croatian hospitals at the secondary and tertiary health-care level were tested for reading difficulty using the Simple Measure of Gobbledygook (SMOG) formula adjusted for the Croatian language and subjected to qualitative analysis of their content. Results The average SMOG grade of the analyzed informed consent forms was 13.25 (SD 1.59, range 10–19). Content analysis revealed that the informed consent forms included a description of risks in 96% of cases, benefits in 81%, a description of procedures in 78%, alternatives in 52%, risks and benefits of alternatives in 17%, and risks and benefits of not receiving treatment or undergoing procedures in 13%. Conclusions The readability of the evaluated informed consent forms is not appropriate for the general population in Croatia. In a high proportion of cases, the forms failed to include descriptions of alternatives, risks and benefits of alternatives, and risks and benefits of not receiving treatment or undergoing procedures. Data obtained from this research could help in the development and improvement of informed consent forms in Croatia, especially now that Croatian hospitals are undergoing the process of accreditation. PMID:26376183

  4. Contents

    Directory of Open Access Journals (Sweden)

    Editor IJRED

    2012-11-01

    Full Text Available International Journal of Renewable Energy Development (www.ijred.com), Volume 1, Number 3, October 2012, ISSN 2252-4940. Contents of articles (with page ranges):
    - Design and Economic Analysis of a Photovoltaic System: A Case Study (65-73), C.O.C. Oko, E.O. Diemuodeke, N.F. Omunakwe, and E. Nnamdi
    - Development of Formaldehyde Adsorption using Modified Activated Carbon – A Review (75-80), W.D.P. Rengga, M. Sudibandriyo and M. Nasikin
    - Process Optimization for Ethyl Ester Production in Fixed Bed Reactor Using Calcium Oxide Impregnated Palm Shell Activated Carbon (CaO/PSAC) (81-86), A. Buasri, B. Ksapabutr, M. Panapoy and N. Chaiyut
    - Wind Resource Assessment in Abadan Airport in Iran (87-97), Mojtaba Nedaei
    - The Energy Processing by Power Electronics and its Impact on Power Quality (99-105), J. E. Rocha and B. W. D. C. Sanchez
    - First Aspect of Conventional Power System Assessment for High Wind Power Plants Penetration (107-113), A. Merzic, M. Music, and M. Rascic
    - Experimental Study on the Production of Karanja Oil Methyl Ester and Its Effect on Diesel Engine (115-122), N. Shrivastava, S.N. Varma and M. Pandey

  5. Evaluation of Web-Based Consumer Medication Information: Content and Usability of 4 Australian Websites.

    Science.gov (United States)

    Raban, Magdalena Z; Tariq, Amina; Richardson, Lauren; Byrne, Mary; Robinson, Maureen; Li, Ling; Westbrook, Johanna I; Baysari, Melissa T

    2016-07-21

    Medication is the most common intervention in health care, and written medication information can affect consumers' medication-related behavior. Research has shown that a large proportion of Australians search for medication information on the Internet. To evaluate the medication information content, based on consumer medication information needs, and usability of 4 Australian health websites: Better Health Channel, myDr, healthdirect, and NPS MedicineWise . To assess website content, the most common consumer medication information needs were identified using (1) medication queries to the healthdirect helpline (a telephone helpline available across most of Australia) and (2) the most frequently used medications in Australia. The most frequently used medications were extracted from Australian government statistics on use of subsidized medicines in the community and the National Census of Medicines Use. Each website was assessed to determine whether it covered or partially covered information and advice about these medications. To assess website usability, 16 consumers participated in user testing wherein they were required to locate 2 pieces of medication information on each website. Brief semistructured interviews were also conducted with participants to gauge their opinions of the websites. Information on prescription medication was more comprehensively covered on all websites (3 of 4 websites covered 100% of information) than nonprescription medication (websites covered 0%-67% of information). Most websites relied on consumer medicines information leaflets to convey prescription medication information to consumers. Information about prescription medication classes was less comprehensive, with no website providing all information examined about antibiotics and antidepressants. Participants (n=16) were able to locate medication information on websites in most cases (accuracy ranged from 84% to 91%). However, a number of usability issues relating to website

  6. Assessing advertising content in a hospital advertising campaign: An application of Puto and Wells (1984) measure of informational and transformational advertising content.

    Science.gov (United States)

    Menon, Mohan K; Goodnight, Janelle M; Wayne, Robin J

    2006-01-01

    The following is a report of a study designed to measure advertising content based on the cognitive and affective elements of informational (i.e., information processing) and transformational (i.e., experiential) content using the measure of advertising informational and transformational content developed by Puto and Wells (1984). A university hospital advertising campaign designed to be high in transformational content did not appear to affect perceived quality of local university hospitals relative to private hospitals or increase the likelihood of choosing a university hospital in the future. Further, experiences with university hospitals that seemed to be in direct contrast to the content of the advertisements based on subject perceptions affected how university hospital advertisements were perceived in terms of content. Conclusions and implications for hospital advertising campaigns are discussed.

  7. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  8. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments.

    Science.gov (United States)

    Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein

    2013-03-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.
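
    A hedged sketch of one common way to operationalise judgment achievement (lens-model style) is shown below: the correlation between a judge's probability estimates and the true criterion values. The study's exact metric may differ, and the data are invented.

```python
import numpy as np
from scipy.stats import pearsonr

# Judgment achievement as the correlation between human probability judgments
# and the true criterion values (here, invented conflict probabilities).
true_conflict_prob = np.array([0.9, 0.1, 0.7, 0.3, 0.5, 0.8, 0.2, 0.6])
human_judgments    = np.array([0.8, 0.2, 0.6, 0.4, 0.5, 0.9, 0.3, 0.5])

achievement, _ = pearsonr(true_conflict_prob, human_judgments)
print("judgment achievement r_a =", round(achievement, 3))
```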

  9. Using Information and Communication Technology (ICT) to the Maximum: Learning and Teaching Biology with Limited Digital Technologies

    Science.gov (United States)

    Van Rooy, Wilhelmina S.

    2012-01-01

    Background: The ubiquity, availability and exponential growth of digital information and communication technology (ICT) creates unique opportunities for learning and teaching in the senior secondary school biology curriculum. Digital technologies make it possible for emerging disciplinary knowledge and understanding of biological processes…

  10. Informational and symbolic content of over-the-counter drug advertising on television.

    Science.gov (United States)

    Tsao, J C

    1997-01-01

    The informational and symbolic content of 150 over-the-counter drug commercials on television are empirically analyzed in this study. Results on the informational content suggest that over-the-counter drug ads tend to focus on the concern of what the drug will do for the consumer, rather than on the reasons why the drug should be ingested. Accordingly, advertising strategy is centered on consumer awareness of the product as the primary goal. Educational commitment, however, did not seem to be blended into the promotional efforts for over-the-counter drugs. Findings on the symbolic content of over-the-counter drug ads reveal that drug images have been distorted. Performance of most drugs has been portrayed to be simple resolutions to relieve the symptom. Moreover, a casual attitude toward drug usage is encouraged in the commercials, while time lapse of drug effects is overlooked.

  11. INTEGRATION OF SPATIAL INFORMATION WITH COLOR FOR CONTENT RETRIEVAL OF REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    Bikesh Kumar Singh

    2010-08-01

    Full Text Available There has been a rapid increase in image databases of remote sensing images in the last few years, due to high-resolution imaging satellites, commercial applications of remote sensing, and high available bandwidth. The problem of content-based image retrieval (CBIR) of remotely sensed images presents a major challenge, not only because of the rapidly increasing volume of images acquired from a wide range of sensors but also because of the complexity of the images themselves. In this paper, a software system for content-based retrieval of remote sensing images using RGB and HSV color spaces is presented. Further, we also compare our results with spatiogram-based content retrieval, which integrates spatial information along with the color histogram. Experimental results show that the integration of spatial information with color improves the image analysis of remote sensing data. In general, retrievals in the HSV color space showed better performance than in the RGB color space.
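
    A minimal sketch of histogram-based colour retrieval in the spirit of the system above: each image is summarised by a quantised joint RGB histogram, and database images are ranked by histogram intersection with the query; a spatiogram would additionally attach spatial means and covariances to each bin (not shown). The image sizes, bin counts and random data are assumptions for illustration.

```python
import numpy as np

def colour_histogram(image, bins=8):
    """image: HxWx3 uint8 array; returns a normalised joint RGB histogram."""
    h, _ = np.histogramdd(image.reshape(-1, 3),
                          bins=(bins, bins, bins),
                          range=((0, 256), (0, 256), (0, 256)))
    return h.ravel() / h.sum()

def histogram_intersection(h1, h2):
    return float(np.minimum(h1, h2).sum())   # 1.0 = identical distributions

rng = np.random.default_rng(0)
query = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
database = [rng.integers(0, 256, (64, 64, 3), dtype=np.uint8) for _ in range(5)]

hq = colour_histogram(query)
scores = [histogram_intersection(hq, colour_histogram(img)) for img in database]
print("ranked database indices (best match first):", np.argsort(scores)[::-1])
```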

  12. Sorting through search results: a content analysis of HPV vaccine information online.

    Science.gov (United States)

    Madden, Kelly; Nan, Xiaoli; Briones, Rowena; Waks, Leah

    2012-05-28

    Surveys have shown that many people now turn to the Internet for health information when making health-related decisions. This study systematically analyzed the HPV vaccine information returned by online search engines. HPV is the most common sexually transmitted disease and is the leading cause of cervical cancers. We conducted a content analysis of 89 top search results from Google, Yahoo, Bing, and Ask.com. The websites were analyzed with respect to source, tone, information related to specific content analyzed through the lens of the Health Belief Model, and in terms of two content themes (i.e., conspiracy theories and civil liberties). The relations among these aspects of the websites were also explored. Most websites were published by nonprofit or academic sources (34.8%) and governmental agencies (27.4%) and were neutral in tone (57.3%), neither promoting nor opposing the HPV vaccine. Overall, the websites presented suboptimal or inaccurate information related to the five behavioral predictors stipulated in the Health Belief Model. Questions related to civil liberties were present on some websites. Health professionals designing online communication with the intent of increasing HPV vaccine uptake should take care to include information about the risks of HPV, including susceptibility and severity. Additionally, websites should include information about the benefits of the vaccine (i.e., effective against HPV), low side effects as a barrier that can be overcome, and ways in which to receive the vaccine to raise individual self-efficacy. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Interdisciplinary Content, Contestations of Knowledge and Informational Transparency in Engineering Curriculum

    Science.gov (United States)

    Barnard, Sarah; Hassan, Tarek; Dainty, Andrew; Bagilhole, Barbara

    2013-01-01

    With the introduction of key information sets (KIS) for all university programmes in the UK from 2012, the character, content and delivery of university degrees may be increasingly used by potential students to differentiate between degree programmes. Therefore, developments in curricula and the relationship to the profession are of growing…

  14. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
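
    As a compact sketch of the Jaynes argument invoked above (notation assumed): maximizing the Shannon entropy of a density on the positive half-line subject only to normalization and a known mean yields the exponential form reported for particle velocities and travel times.

```latex
% Maximize the Shannon entropy subject to normalization and a fixed mean \bar{u}:
\max_{f}\; -\!\int_{0}^{\infty} f(u)\,\ln f(u)\,\mathrm{d}u
\quad\text{subject to}\quad
\int_{0}^{\infty} f(u)\,\mathrm{d}u = 1,
\qquad
\int_{0}^{\infty} u\,f(u)\,\mathrm{d}u = \bar{u};
% stationarity of the Lagrangian  -f\ln f - \lambda_{0} f - \lambda_{1} u f  gives
f(u) \;=\; e^{-1-\lambda_{0}-\lambda_{1}u} \;=\; \frac{1}{\bar{u}}\,e^{-u/\bar{u}} .
```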

  15. A Two-Stage Information-Theoretic Approach to Modeling Landscape-Level Attributes and Maximum Recruitment of Chinook Salmon in the Columbia River Basin.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, William L.; Lee, Danny C.

    2000-11-01

    Many anadromous salmonid stocks in the Pacific Northwest are at their lowest recorded levels, which has raised questions regarding their long-term persistence under current conditions. There are a number of factors, such as freshwater spawning and rearing habitat, that could potentially influence their numbers. Therefore, we used the latest advances in information-theoretic methods in a two-stage modeling process to investigate relationships between landscape-level habitat attributes and maximum recruitment of 25 index stocks of chinook salmon (Oncorhynchus tshawytscha) in the Columbia River basin. Our first-stage model selection results indicated that the Ricker-type, stock recruitment model with a constant Ricker a (i.e., recruits-per-spawner at low numbers of fish) across stocks was the only plausible one given these data, which contrasted with previous unpublished findings. Our second-stage results revealed that maximum recruitment of chinook salmon had a strongly negative relationship with percentage of surrounding subwatersheds categorized as predominantly containing U.S. Forest Service and private moderate-high impact managed forest. That is, our model predicted that average maximum recruitment of chinook salmon would decrease by at least 247 fish for every increase of 33% in surrounding subwatersheds categorized as predominantly containing U.S. Forest Service and privately managed forest. Conversely, mean annual air temperature had a positive relationship with salmon maximum recruitment, with an average increase of at least 179 fish for every increase in 2 C mean annual air temperature.
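
    To make the first-stage step concrete, the sketch below fits a Ricker stock-recruitment curve R = a S exp(-b S) to spawner/recruit pairs and scores it with a least-squares AIC, the kind of information-theoretic criterion the two-stage approach relies on. The data and starting values are invented, and the paper's actual candidate set and likelihoods are richer than this.

```python
import numpy as np
from scipy.optimize import curve_fit

def ricker(spawners, a, b):
    # Ricker stock-recruitment model: R = a * S * exp(-b * S)
    return a * spawners * np.exp(-b * spawners)

# invented spawner/recruit pairs for one stock
spawners = np.array([100., 250., 400., 600., 900., 1200.])
recruits = np.array([240., 520., 700., 820., 830., 700.])

params, _ = curve_fit(ricker, spawners, recruits, p0=(2.0, 0.001))
residuals = recruits - ricker(spawners, *params)
n, k = recruits.size, len(params)
aic = n * np.log(np.sum(residuals ** 2) / n) + 2 * k   # least-squares AIC

print("a, b:", params, " AIC:", round(aic, 2))
```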

  16. Information content of long-range NMR data for the characterization of conformational heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Andrałojć, Witold [University of Florence, Center for Magnetic Resonance (CERM) (Italy); Berlin, Konstantin; Fushman, David, E-mail: fushman@umd.edu [University of Maryland, Department of Chemistry and Biochemistry, Center for Biomolecular Structure and Organization (United States); Luchinat, Claudio, E-mail: luchinat@cerm.unifi.it; Parigi, Giacomo; Ravera, Enrico [University of Florence, Center for Magnetic Resonance (CERM) (Italy); Sgheri, Luca [CNR, Istituto per le Applicazioni del Calcolo, Sezione di Firenze (Italy)

    2015-07-15

    Long-range NMR data, namely residual dipolar couplings (RDCs) from external alignment and paramagnetic data, are becoming increasingly popular for the characterization of conformational heterogeneity of multidomain biomacromolecules and protein complexes. The question addressed here is how much information is contained in these averaged data. We have analyzed and compared the information content of conformationally averaged RDCs caused by steric alignment and of both RDCs and pseudocontact shifts caused by paramagnetic alignment, and found that, despite the substantial differences, they contain a similar amount of information. Furthermore, using several synthetic tests we find that both sets of data are equally good towards recovering the major state(s) in conformational distributions.

  17. Using information and communication technology (ICT) to the maximum: learning and teaching biology with limited digital technologies

    Science.gov (United States)

    Van Rooy, Wilhelmina S.

    2012-04-01

    Background: The ubiquity, availability and exponential growth of digital information and communication technology (ICT) creates unique opportunities for learning and teaching in the senior secondary school biology curriculum. Digital technologies make it possible for emerging disciplinary knowledge and understanding of biological processes previously too small, large, slow or fast to be taught. Indeed, much of bioscience can now be effectively taught via digital technology, since its representational and symbolic forms are in digital formats. Purpose: This paper is part of a larger Australian study dealing with the technologies and modalities of learning biology in secondary schools. Sample: The classroom practices of three experienced biology teachers, working in a range of NSW secondary schools, are compared and contrasted to illustrate how the challenges of limited technologies are confronted to seamlessly integrate what is available into a number of molecular genetics lessons to enhance student learning. Design and method: The data are qualitative and the analysis is based on video classroom observations and semi-structured teacher interviews. Results: Findings indicate that if professional development opportunities are provided where the pedagogy of learning and teaching of both the relevant biology and its digital representations are available, then teachers see the immediate pedagogic benefit to student learning. In particular, teachers use ICT for challenging genetic concepts despite limited computer hardware and software availability. Conclusion: Experienced teachers incorporate ICT, however limited, in order to improve the quality of student learning.

  18. The nature and contents of needs of the users of financial information

    Directory of Open Access Journals (Sweden)

    O.L. Sherstiuk

    2016-09-01

    Full Text Available The content and form of economic processes are being transformed, as are the roles of the participants in economic relations and their goals and objectives in economic activity. Consequently, the role of the information support of economic processes is changing, reflecting changes in the informational needs of users of information. In addition, the means of receiving, processing and using information, as well as its qualitative and quantitative characteristics, are improving; these improvements are driven both by user interest and by the modification of the tasks that can be performed with the results of its use. In particular, participants in economic relations pay increasing attention to aspects related directly to obtaining economic benefits from the possession of certain resources. The study shows that the basis of the needs of users of financial information is an understanding of how the information is formed, evaluated and used, and of how the results of its use are assessed. The article characterizes the reasons for user interest in understanding these processes, identifies the consumer behavior of users of financial information as a manifestation of their culture of information consumption, and analyzes the content of the random, basic, targeted, evaluative and functional levels of this consumer behavior.

  19. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    Science.gov (United States)

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n

  20. Development and content validation of the information assessment method for patients and consumers.

    Science.gov (United States)

    Pluye, Pierre; Granikov, Vera; Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan Lm; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-02-18

    Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded and all inconsistencies in using a

  1. A neuromathematical model of human information processing and its application to science content acquisition

    Science.gov (United States)

    Anderson, O. Roger

    The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.

  2. The Role of Multimedia Content in Determining the Virality of Social Media Information

    Directory of Open Access Journals (Sweden)

    Paolo Giacomazzi

    2012-07-01

    Full Text Available The paper provides empirical evidence supporting the assumption that content plays a critical role in determining the virality, i.e., the influence, of social media information. The analysis focuses on multimedia content on Twitter and explores the idea that links to multimedia information increase the virality of posts. In particular, we put forward the following three main hypotheses: (1) posts with a link to multimedia content (photo or video) are more retweeted than posts without a link; (2) posts linking a photo are more retweeted than posts linking a video; and (3) posts linking a video raise more sentiment than posts linking a photo. Hypotheses are tested on a sample of roughly two million tweets posted in July 2011 including comments on Berlin, London, Madrid, and Milan relevant from a tourism perspective. Findings support our hypotheses and indicate that multimedia content plays an important role in determining not only the volumes of retweeting, but also the dynamics of the virality of posts measured as speed of retweeting.
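
    As a sketch of how hypothesis (1) above might be checked, the example below compares retweet counts of posts with and without a media link using a one-sided rank test, a reasonable (assumed) choice for heavily skewed counts; the counts themselves are invented.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Invented retweet counts for tweets with and without a multimedia link.
rt_with_media    = np.array([0, 1, 3, 5, 8, 12, 20, 35, 50, 4])
rt_without_media = np.array([0, 0, 1, 1, 2, 3, 5, 7, 0, 2])

# One-sided test: are media-linked tweets retweeted more?
stat, p = mannwhitneyu(rt_with_media, rt_without_media, alternative="greater")
print("U =", stat, " one-sided p =", round(p, 4))
```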

  3. Climate change on Twitter: Content, media ecology and information sharing behaviour.

    Science.gov (United States)

    Veltri, Giuseppe A; Atanasova, Dimitrinka

    2017-08-01

    This article presents a study of the content, use of sources and information sharing about climate change analysing over 60,000 tweets collected using a random week sample. We discuss the potential for studying Twitter as a communicative space that is rich in different types of information and presents both new challenges and opportunities. Our analysis combines automatic thematic analysis, semantic network analysis and text classification according to psychological process categories. We also consider the media ecology of tweets and the external web links that users shared. In terms of content, the network of topics uncovered presents a multidimensional discourse that accounts for complex causal links between climate change and its consequences. The media ecology analysis revealed a narrow set of sources with a major role played by traditional media and that emotionally arousing text was more likely to be shared.

  4. Information content analysis: the potential for methane isotopologue retrieval from GOSAT-2

    Science.gov (United States)

    Malina, Edward; Yoshida, Yukio; Matsunaga, Tsuneo; Muller, Jan-Peter

    2018-02-01

    Atmospheric methane is composed of multiple isotopic molecules, the most abundant being 12CH4 and 13CH4, which make up 98 and 1.1 % of atmospheric methane respectively. It has been shown that it is possible to distinguish between sources of methane (biogenic methane, e.g. marshland, or abiogenic methane, e.g. fracking) via a ratio of these main methane isotopologues, otherwise known as the δ13C value. δ13C values typically range between -10 and -80 ‰, with abiogenic sources closer to zero and biogenic sources showing more negative values. Initially, we suggest that a δ13C difference of 10 ‰ is sufficient to differentiate between methane source types; based on this, we derive that a precision of 0.2 ppbv on 13CH4 retrievals may achieve the target δ13C variance. Using an application of the well-established information content analysis (ICA) technique for assumed clear-sky conditions, this paper shows that using a combination of the shortwave infrared (SWIR) bands on the planned Greenhouse gases Observing SATellite (GOSAT-2) mission, 13CH4 can be measured with sufficient information content to a precision of between 0.7 and 1.2 ppbv from a single sounding (assuming a total column average value of 19.14 ppbv), which can then be reduced to the target precision through spatial and temporal averaging techniques. We therefore suggest that GOSAT-2 can be used to differentiate between methane source types. We find that large unconstrained covariance matrices are required in order to achieve sufficient information content, while the solar zenith angle has limited impact on the information content.
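
    For reference, the δ13C notation used above is the per-mil deviation of the sample 13C/12C ratio from a reference ratio (VPDB); a hedged sketch is below. The reference value and the example column amounts are assumptions for illustration, loosely echoing the abstract's 19.14 ppbv figure.

```python
# delta13C in per mil: deviation of the sample 13C/12C ratio from the VPDB reference.
R_VPDB = 0.0112372          # commonly quoted 13C/12C reference ratio (assumed here)

def delta13C(ch4_13, ch4_12):
    """Column amounts (or mixing ratios) of 13CH4 and 12CH4 -> delta13C in per mil."""
    return ((ch4_13 / ch4_12) / R_VPDB - 1.0) * 1000.0

# e.g. ~19.14 ppbv of 13CH4 against a 12CH4 column of ~1.80 ppmv
print(round(delta13C(19.14e-9, 1.80e-6), 1))
```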

  5. Information content analysis: the potential for methane isotopologue retrieval from GOSAT-2

    Directory of Open Access Journals (Sweden)

    E. Malina

    2018-02-01

    Full Text Available Atmospheric methane is comprised of multiple isotopic molecules, with the most abundant being 12CH4 and 13CH4, making up 98 and 1.1 % of atmospheric methane respectively. It has been shown that it is possible to distinguish between sources of methane (biogenic methane, e.g. marshland, or abiogenic methane, e.g. fracking) via a ratio of these main methane isotopologues, otherwise known as the δ13C value. δ13C values typically range between −10 and −80 ‰, with abiogenic sources closer to zero and biogenic sources showing more negative values. Initially, we suggest that a δ13C difference of 10 ‰ is sufficient to differentiate between methane source types; based on this, we derive that a precision of 0.2 ppbv on 13CH4 retrievals may achieve the target δ13C variance. Using an application of the well-established information content analysis (ICA) technique for assumed clear-sky conditions, this paper shows that using a combination of the shortwave infrared (SWIR) bands on the planned Greenhouse gases Observing SATellite (GOSAT-2) mission, 13CH4 can be measured with sufficient information content to a precision of between 0.7 and 1.2 ppbv from a single sounding (assuming a total column average value of 19.14 ppbv), which can then be reduced to the target precision through spatial and temporal averaging techniques. We therefore suggest that GOSAT-2 can be used to differentiate between methane source types. We find that large unconstrained covariance matrices are required in order to achieve sufficient information content, while the solar zenith angle has limited impact on the information content.

  6. An evaluation of telehealth websites for design, literacy, information and content.

    Science.gov (United States)

    Whitten, Pamela; Holtz, Bree; Cornacchione, Jennifer; Wirth, Christina

    2011-01-01

    We examined 62 telehealth websites using four assessment criteria: design, literacy, information and telehealth content. The websites came from the member list of the American Telemedicine Association and the Office for the Advancement of Telehealth and partner sites, and were included if they were currently active and at least three clicks deep. Approximately 130 variables were examined for each website by two independent researchers. The websites reviewed contained most of the design variables (mean 74%, SD 6), but fewer of those relating to literacy (mean 26%, SD 6), website information (mean 35%, SD 16) and telehealth content (mean 37%, SD 18). Only 29% of websites encouraged users to ask about telehealth, and 19% contained information on overcoming telehealth barriers. Nonetheless, 84% promoted awareness of telehealth. All evaluation assessments were significantly correlated with each other except for literacy and information. The present study identified various matters that should be addressed when developing telehealth websites. Although much of this represents simple common sense in website design, our evaluation demonstrates that there is still much room for improvement.

  7. Developing Accounting Information System Course Content for Iraqi Higher Education Institution: An Instrument Design

    Directory of Open Access Journals (Sweden)

    Naseem Yousif Hanna Lallo

    2013-07-01

    Full Text Available In ensuring that competent graduates are produced by universities, the courses used to embed knowledge and mindsets in students need to be effective. However, the unusual circumstances in Iraq have affected university courses. The revolution in information technology (IT) affects most of our activities, so it is important to consider the impact of IT on accounting careers. Developing accounting information system (AIS) course content can produce accountants who are armed with the necessary knowledge and skills before entering an accounting job. The development process also requires instructors to have characteristics that make the integration of IT knowledge components into AIS course content smoother. Iraq is a country facing many difficulties, and its higher education institutions (HEIs) suffer from an outdated learning environment and technological backwardness. This results in a low level of accounting graduates' knowledge and, in turn, leads Iraqi accountants to be considered incapable of working with international organizations and companies or of conducting their work professionally. The aim of this paper is to explain the role of IT knowledge elements in developing AIS course content in Iraqi HEIs, considering the moderating effect of instructors' characteristics. Furthermore, this paper discusses the development and validation of a quantitative instrument (questionnaire) for IT knowledge elements in Iraqi HEIs. The reliability of the constructs is also discussed.

  8. Understanding the aerosol information content in multi-spectral reflectance measurements using a synergetic retrieval algorithm

    Directory of Open Access Journals (Sweden)

    D. Martynenko

    2010-11-01

    Full Text Available An information content analysis for the multi-wavelength SYNergetic AErosol Retrieval algorithm SYNAER was performed to quantify the number of independent pieces of information that can be retrieved. In particular, the capability of SYNAER to discern various aerosol types is assessed. This information content depends on the aerosol optical depth, the surface albedo spectrum and the observation geometry. The theoretical analysis is performed for a large number of scenarios with various geometries and surface albedo spectra for ocean, soil and vegetation. When the surface albedo spectrum and its accuracy are known under cloud-free conditions, the reflectance measurements used in SYNAER are able to provide 2–4 degrees of freedom that can be attributed to retrieval parameters: aerosol optical depth, aerosol type and surface albedo.

    The focus of this work is placed on an information content analysis with emphasis on the aerosol type classification. This analysis is applied to synthetic reflectance measurements for 40 predefined aerosol mixtures of different basic components, given by sea salt, mineral dust, biomass burning and diesel aerosols, water soluble and water insoluble aerosols. The range of aerosol parameters considered through the 40 mixtures covers the natural variability of tropospheric aerosols. After the information content analysis performed in Holzer-Popp et al. (2008), there was a need to compare the derived degrees of freedom with the retrieved aerosol optical depth for different aerosol types, which is the main focus of this paper.

    A principal component analysis was used to determine the correspondence between the degrees of freedom for signal in the retrieval and the derived aerosol types. The main results of the analysis indicate a correspondence between the major groups of aerosol types (water soluble aerosol, soot, mineral dust and sea salt) and the degrees of freedom in the algorithm, and show the ability of SYNAER to
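
    The "degrees of freedom for signal" quoted in analyses of this kind are commonly obtained, in the optimal-estimation framework, as the trace of the averaging kernel built from the measurement Jacobian and the prior and noise covariances. The sketch below is a generic numpy illustration of that quantity, not the SYNAER code; the Jacobian and covariances are toy values.

      import numpy as np

      def degrees_of_freedom(K, S_a, S_e):
          """Trace of the averaging kernel A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K."""
          S_e_inv = np.linalg.inv(S_e)
          S_a_inv = np.linalg.inv(S_a)
          gain = np.linalg.inv(K.T @ S_e_inv @ K + S_a_inv) @ K.T @ S_e_inv
          return np.trace(gain @ K)

      # Toy example: 20 spectral channels, 3 retrieval parameters (e.g. AOD, type index, albedo).
      rng = np.random.default_rng(0)
      K = rng.normal(size=(20, 3))          # Jacobian d(reflectance)/d(parameter)
      S_a = np.diag([1.0, 0.5, 0.2])        # assumed prior covariance
      S_e = 0.01 * np.eye(20)               # assumed measurement noise covariance
      print(degrees_of_freedom(K, S_a, S_e))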

  9. Communicating Risk Information in Direct-to-Consumer Prescription Drug Television Ads: A Content Analysis.

    Science.gov (United States)

    Sullivan, Helen W; Aikin, Kathryn J; Poehlman, Jon

    2017-11-10

    Direct-to-consumer (DTC) television ads for prescription drugs are required to disclose the product's major risks in the audio or audio and visual parts of the presentation (sometimes referred to as the "major statement"). The objective of this content analysis was to determine how the major statement of risks is presented in DTC television ads, including what risk information is presented, how easy or difficult it is to understand the risk information, and the audio and visual characteristics of the major statement. We identified 68 DTC television ads for branded prescription drugs that included a unique major statement and aired between July 2012 and August 2014. We used subjective and objective measures to code 50 ads randomly selected from the main sample. Major statements often presented numerous risks, usually in order of severity, with no quantitative information about the risks' severity or prevalence. The major statements required a high school reading level, and many included long and complex sentences. The major statements were often accompanied by competing non-risk information in the visual images, presented with moderately fast-paced music, and read at a faster pace than benefit information. Overall, we discovered several ways in which the communication of risk information could be improved.

  10. Advertising, expectations and informed consent: the contents and functions of acupuncture leaflets.

    Science.gov (United States)

    Bishop, Felicity L; Salmon, Cathy

    2013-12-01

    To evaluate the content of patient information leaflets about acupuncture. 401 patient information leaflets were obtained from practising UK acupuncturists and subjected to content and thematic analysis. 59% of included leaflets were from NHS physiotherapists. Almost all the leaflets defined acupuncture and the majority explained how it might work, described the treatment process and placed it in a historical context. Most described possible benefits and risks of acupuncture and discussed contraindications and safety. Just under a third of leaflets (120, 30%) suggested conditions that might be helped by acupuncture, most commonly musculoskeletal pain, arthritis and injuries. By emphasising differences between individuals in acupuncture treatments and responsiveness, the leaflets fostered hope for positive effects without making any guarantees. Information leaflets are broadly consistent with the evidence for acupuncture, but some claims are inconsistent with official advice from advertising regulators. An ethically sound, scientifically grounded and psychologically effective leaflet should accurately convey both benefits and risks of treatment, optimise patients' expectations and allay concerns about needling. This study suggests that acupuncture leaflets might achieve these multiple functions but care should be taken to ensure adequate coverage of risks.

  11. Content, Accessibility, and Dissemination of Disaster Information via Social Media During the 2016 Louisiana Floods.

    Science.gov (United States)

    Scott, Katherine K; Errett, Nicole A

    2017-12-27

    Social media is becoming increasingly integrated into disaster response communication strategies of public health and emergency response agencies. We sought to assess the content, accessibility, and dissemination of social media communications made by government agencies during a disaster response. A cross-sectional analysis of social media posts made by federal, state, and local government, public health and emergency management agencies before, during, and after the 2016 Louisiana floods was conducted to determine their content, accessibility, and dissemination by level of government and time relative to disaster onset. Facebook and/or Twitter posts made by public agencies involved in the response to the 2016 Louisiana Flooding events (FEMA Disaster Declaration [DR-4277]), published between August 4 and September 16, 2016, and publicly available online between February 21 and March 31, 2017, were included in the analysis. Content: The text of each post was assessed to determine whether it contained information on provision of situational awareness; addressing of misconceptions; actionable requests; mental, behavioral, and emotional support; and/or recovery and rebuilding resources. Accessibility: A Flesch-Kincaid grade level of each post was calculated, and information on post language, originality, hyperlinks, visuals, videos, or hashtags was recorded. Dissemination: The average number of reacts/likes, shares/retweets, and comments per post was calculated. Most posts contained information related to situational awareness and recovery resources. There was an increase in messages during the first week of the disaster at all levels. Few posts were made in languages other than English. Compared with state and federal posts, local Facebook posts averaged fewer reacts, comments, and shares throughout the analysis period. Government agencies may maximize the use of social media platforms for disaster communications by establishing their social media network in advance of a

  12. The Incremental Information Content of the Cash Flow Statement: An Australian Empirical Investigation

    OpenAIRE

    Hadri Kusuma

    2014-01-01

    The general objective of the present study is to investigate and assess the incremental information content of cash flow disclosures as required by the AASB 1026 "Statement of Cash Flows". This test addresses the issue of whether a change in cash flow components has the same relationship with security prices as that in earnings. Several previous studies indicate both income and cash flow statements may be mutually exclusive or mutually inclusive statements. The data to test three hypotheses...

  13. Information content of neural networks with self-control and variable activity

    International Nuclear Information System (INIS)

    Bolle, D.; Amari, S.I.; Dominguez Carreta, D.R.C.; Massolo, G.

    2001-01-01

    A self-control mechanism for the dynamics of neural networks with variable activity is discussed using a recursive scheme for the time evolution of the local field. It is based upon the introduction of a self-adapting time-dependent threshold as a function of both the neural and pattern activity in the network. This mechanism leads to an improvement of the information content of the network as well as an increase of the storage capacity and the basins of attraction. Different architectures are considered and the results are compared with numerical simulations

  14. On providing the fault-tolerant operation of information systems based on open content management systems

    Science.gov (United States)

    Kratov, Sergey

    2018-01-01

    Modern information systems designed to service a wide range of users, regardless of their subject area, are increasingly based on Web technologies and are available to users via the Internet. The article discusses the issues of providing the fault-tolerant operation of such information systems, based on free and open source content management systems. The toolkit available to administrators of similar systems is shown; the scenarios for using these tools are described. Options for organizing backups and restoring the operability of systems after failures are suggested. Application of the proposed methods and approaches allows providing continuous monitoring of the state of systems, timely response to the emergence of possible problems and their prompt solution.
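
    As a minimal illustration of the kind of continuous monitoring and backup scheduling discussed above (the endpoint, database name and dump command are placeholders, and a MySQL backend is assumed for the sake of the example):

      import subprocess
      from datetime import datetime

      import requests

      SITE = "https://example.org"                                   # placeholder CMS front page
      DB_DUMP_CMD = ["mysqldump", "--single-transaction", "cms_db"]  # assumed MySQL database

      def site_is_up():
          try:
              return requests.get(SITE, timeout=10).status_code == 200
          except requests.RequestException:
              return False

      def nightly_backup():
          stamp = datetime.now().strftime("%Y%m%d")
          with open(f"cms_db_{stamp}.sql", "wb") as out:
              subprocess.run(DB_DUMP_CMD, stdout=out, check=True)

      if not site_is_up():
          print("CMS unreachable - alert the administrator")   # e.g. hook an e-mail alert here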

  15. Surfing for juvenile idiopathic arthritis: perspectives on quality and content of information on the Internet.

    Science.gov (United States)

    Stinson, Jennifer N; Tucker, Lori; Huber, Adam; Harris, Heather; Lin, Carmen; Cohen, Lindsay; Gill, Navreet; Lukas-Bretzler, Jacqueline; Proulx, Laurie; Prowten, David

    2009-08-01

    To determine the quality and content of English language Internet information about juvenile idiopathic arthritis (JIA) from the perspectives of consumers and healthcare professionals. Key words relevant to JIA were searched across 10 search engines. Quality of information was appraised independently by 2 health professionals, 1 young adult with JIA, and a parent using the DISCERN tool. Concordance of the website content (i.e., accuracy and completeness) with available evidence about the management of JIA was determined. Readability was determined using Flesch-Kincaid grade level and Reading Ease Score. Out of the 3000 Web pages accessed, only 58 unique sites met the inclusion criteria. Of these sites only 16 had DISCERN scores above 50% (indicating fair quality). These sites were then rated by consumers. Most sites targeted parents and none were specifically developed for youth with JIA. The overall quality of website information was fair, with a mean DISCERN quality rating score of 48.92 out of 75 (+/- 6.56, range 34.0-59.5). Overall completeness of sites was 9.07 out of 16 (+/- 2.28, range 5.25-13.25) and accuracy was 3.09 out of 4 (+/- 0.86, range 2-4), indicating a moderate level of accuracy. Average Flesch-Kincaid grade level and Reading Ease Score were 11.48 (+/- 0.74, range 10.1-12.0) and 36.36 (+/- 10.86, range 6.30-48.1), respectively, indicating that the material was difficult to read. Our study highlights the paucity of high quality Internet health information at an appropriate reading level for youth with JIA and their parents.
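
    The Flesch-Kincaid grade level and Flesch Reading Ease scores used in studies like this one are simple functions of average sentence length and syllables per word; the sketch below applies the standard formulas with a deliberately crude syllable counter, for illustration only.

      import re

      def count_syllables(word):
          # Rough heuristic: count groups of consecutive vowels.
          return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

      def readability(text):
          sentences = max(1, len(re.findall(r"[.!?]+", text)))
          words = re.findall(r"[A-Za-z']+", text)
          n_words = max(1, len(words))
          syllables = sum(count_syllables(w) for w in words)
          wps = n_words / sentences       # words per sentence
          spw = syllables / n_words       # syllables per word
          grade = 0.39 * wps + 11.8 * spw - 15.59
          ease = 206.835 - 1.015 * wps - 84.6 * spw
          return grade, ease

      print(readability("Juvenile idiopathic arthritis is a chronic condition. "
                        "Treatment aims to control inflammation and preserve joint function."))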

  16. DIMOND II: Measures for optimising radiological information content and dose in digital imaging

    International Nuclear Information System (INIS)

    Dowling, A.; Malone, J.; Marsh, D.

    2001-01-01

    The European Commission concerted action on 'Digital Imaging: Measures for Optimising Radiological Information Content and Dose', DIMOND II, was conducted by 12 European partners over the period January 1997 to June 1999. The objective of the concerted action was to initiate a project in the area of digital medical imaging where practice was evolving without structured research in radiation protection, optimisation or justification. The main issues addressed were patient and staff dosimetry, image quality, quality criteria and technical issues. The scope included computed radiography (CR), image intensifier radiography and fluoroscopy, cardiology and interventional procedures. The concerted action was based on the consolidation of work conducted in the partners' institutions together with elective new work. Protocols and approaches to dosimetry, radiological information content/image quality measurement and quality criteria were established and presented at an international workshop held in Dublin in June 1999. Details of the work conducted during the DIMOND II concerted action and a summary of the main findings and conclusions are presented in this contribution. (author)

  17. Enriching Traditional Cataloging for Improved Access to Information: Library of Congress Tables of Contents Projects

    Directory of Open Access Journals (Sweden)

    John D. Byrum Jr.

    2006-03-01

    Full Text Available Traditionally, standard catalog records have provided bibliographic data that mostly address the basic features of library resources. At the same time, catalogs have offered access to these records through a limited array of names, titles, series, subject headings, class numbers, and a relatively small number of keywords contained within descriptions. Today’s catalog users expect access to information well beyond what can be offered by traditional approaches to bibliographic description and access. By pursuing a suite of projects, the Library of Congress (LC) has responded to the challenge of enticing patrons to continue to include the online catalog among the tools they use for information retrieval. Drawing extensively on the power of automation, staff of LC’s Bibliographic Enrichment Advisory Team (BEAT) have created and implemented a variety of initiatives to link researchers, catalogs, and Web resources; increase the content of the catalog record; and link the catalog to electronic resources. BEAT’s ongoing work demonstrates how, in the electronic era, it is possible to provide new and improved ways to capitalize on traditional services in the digital age. This paper will illustrate these points by focusing on BEAT’s tables of contents projects to demonstrate how library automation can make significant bibliographic enhancement efforts quick, easy, and affordable to achieve.

  18. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  19. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
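
    As a schematic of the emulator-plus-posterior-sampling workflow described in the two records above (not the authors' actual pipeline), the sketch below fits a Gaussian-process emulator to a handful of pretend model runs and pushes posterior parameter samples through it to obtain a predictive spread for an observable.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(1)

      # Hypothetical training set: model parameters theta -> computed observable y.
      theta_train = rng.uniform(-1, 1, size=(30, 2))
      y_train = np.sin(theta_train[:, 0]) + 0.5 * theta_train[:, 1]   # stand-in for an expensive model run

      emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
      emulator.fit(theta_train, y_train)

      # Propagate posterior parameter samples (here just a toy Gaussian posterior).
      theta_post = rng.normal(loc=0.1, scale=0.2, size=(5000, 2))
      y_pred = emulator.predict(theta_post)
      print(y_pred.mean(), y_pred.std())    # predictive mean and statistical uncertainty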

  20. Information content of OCO-2 oxygen A-band channels for retrieving marine liquid cloud properties

    Science.gov (United States)

    Richardson, Mark; Stephens, Graeme L.

    2018-03-01

    Information content analysis is used to select channels for a marine liquid cloud retrieval using the high-spectral-resolution oxygen A-band instrument on NASA's Orbiting Carbon Observatory-2 (OCO-2). Desired retrieval properties are cloud optical depth, cloud-top pressure and cloud pressure thickness, which is the geometric thickness expressed in hectopascals. Based on information content criteria we select a micro-window of 75 of the 853 functioning OCO-2 channels spanning 763.5-764.6 nm and perform a series of synthetic retrievals with perturbed initial conditions. We estimate posterior errors from the sample standard deviations and obtain ±0.75 in optical depth and ±12.9 hPa in both cloud-top pressure and cloud pressure thickness, although removing the 10 % of samples with the highest χ2 reduces posterior error in cloud-top pressure to ±2.9 hPa and cloud pressure thickness to ±2.5 hPa. The application of this retrieval to real OCO-2 measurements is briefly discussed, along with its limitations; the greatest caution is urged regarding the assumption of a single homogeneous cloud layer, which is often, but not always, a reasonable approximation for marine boundary layer clouds.
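
    Selecting a micro-window such as the 75 channels used here is often done greedily: starting from the prior covariance, repeatedly add the channel whose Jacobian row yields the largest Shannon information gain and update the covariance with a rank-one step. The sketch below shows that generic procedure with made-up Jacobians and noise levels; it is not the authors' implementation.

      import numpy as np

      def select_channels(K, sigma, S_a, n_select):
          """Greedy channel selection by Shannon information gain."""
          S = S_a.copy()
          chosen = []
          for _ in range(n_select):
              best_gain, best_i = -np.inf, None
              for i in range(K.shape[0]):
                  if i in chosen:
                      continue
                  k = K[i]
                  gain = 0.5 * np.log(1.0 + k @ S @ k / sigma[i] ** 2)
                  if gain > best_gain:
                      best_gain, best_i = gain, i
              k = K[best_i]
              Sk = S @ k
              S = S - np.outer(Sk, Sk) / (sigma[best_i] ** 2 + k @ Sk)   # rank-one covariance update
              chosen.append(best_i)
          return chosen

      # Toy example: 200 candidate channels, 3 cloud parameters.
      rng = np.random.default_rng(2)
      K = rng.normal(size=(200, 3))
      sigma = np.full(200, 0.05)
      S_a = np.diag([4.0, 100.0, 100.0])    # assumed prior variances
      print(select_channels(K, sigma, S_a, 5))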

  1. A new Dobson Umkehr ozone profile retrieval method optimising information content and resolution

    Science.gov (United States)

    Stone, K.; Tully, M. B.; Rhodes, S. K.; Schofield, R.

    2015-03-01

    The standard Dobson Umkehr methodology to retrieve coarse-resolution ozone profiles used by the National Oceanic and Atmospheric Administration uses designated solar zenith angles (SZAs). However, some information may be lost if measurements lie outside the designated SZA range (between 60° and 90°), or do not conform to the fitting technique. Also, while Umkehr measurements can be taken using multiple wavelength pairs (A, C and D), past retrieval methods have focused on a single pair (C). Here we present an Umkehr inversion method that uses measurements at all SZAs (termed operational) and all wavelength pairs (although we caution against direct comparison to other algorithms). Information content for a Melbourne, Australia (38° S, 145° E) Umkehr measurement case study from 28 January 1994, with an SZA range similar to that designated in previous algorithms, is shown. When comparing the typical single wavelength pair with designated SZAs to the operational measurements, the total degrees of freedom (independent pieces of information) increase from 3.1 to 3.4, with the majority of the information gain originating from Umkehr layers 2 + 3 and 4 (10-20 km and 25-30 km respectively). In addition to this, using all available wavelength pairs increases the total degrees of freedom to 5.2, with the most significant increases in Umkehr layers 2 + 3 to 7 and 9+ (10-40 and 45-80 km). Investigating a case from 13 April 1970 where the measurements extend beyond the 90° SZA range gives further information gain, with total degrees of freedom extending to 6.5. Similar increases are seen in the information content. Comparing the retrieved Melbourne Umkehr time series with ozonesondes shows excellent agreement in layers 2 + 3 and 4 (10-20 and 25-30 km) for both C and A + C + D-pairs. Retrievals in layers 5 and 6 (25-30 and 30-35 km) consistently show a lower ozone partial column compared to ozonesondes. This is likely due to stray light effects that are not accounted for in the

  2. Simulation study of the aerosol information content in OMI spectral reflectance measurements

    Directory of Open Access Journals (Sweden)

    B. Veihelmann

    2007-06-01

    Full Text Available The Ozone Monitoring Instrument (OMI is an imaging UV-VIS solar backscatter spectrometer and is designed and used primarily to retrieve trace gases like O3 and NO2 from the measured Earth reflectance spectrum in the UV-visible (270–500 nm. However, also aerosols are an important science target of OMI. The multi-wavelength algorithm is used to retrieve aerosol parameters from OMI spectral reflectance measurements in up to 20 wavelength bands. A Principal Component Analysis (PCA is performed to quantify the information content of OMI reflectance measurements on aerosols and to assess the capability of the multi-wavelength algorithm to discern various aerosol types. This analysis is applied to synthetic reflectance measurements for desert dust, biomass burning aerosols, and weakly absorbing anthropogenic aerosol with a variety of aerosol optical thicknesses, aerosol layer altitudes, refractive indices and size distributions. The range of aerosol parameters considered covers the natural variability of tropospheric aerosols. This theoretical analysis is performed for a large number of scenarios with various geometries and surface albedo spectra for ocean, soil and vegetation. When the surface albedo spectrum is accurately known and clouds are absent, OMI reflectance measurements have 2 to 4 degrees of freedom that can be attributed to aerosol parameters. This information content depends on the observation geometry and the surface albedo spectrum. An additional wavelength band is evaluated, that comprises the O2-O2 absorption band at a wavelength of 477 nm. It is found that this wavelength band adds significantly more information than any other individual band.

  3. 10 CFR 2.908 - Contents of notice of intent to introduce restricted data or other national security information.

    Science.gov (United States)

    2010-01-01

    ... or other national security information. 2.908 Section 2.908 Energy NUCLEAR REGULATORY COMMISSION... Applicable to Adjudicatory Proceedings Involving Restricted Data and/or National Security Information § 2.908 Contents of notice of intent to introduce restricted data or other national security information. (a) A...

  4. Is Nutrient Content and Other Label Information for Prescription Prenatal Supplements Different from Nonprescription Products?

    Science.gov (United States)

    Saldanha, Leila G; Dwyer, Johanna T; Andrews, Karen W; Brown, LaVerne L; Costello, Rebecca B; Ershow, Abby G; Gusev, Pavel A; Hardy, Constance J; Pehrsson, Pamela R

    2017-09-01

    Prenatal supplements are often recommended to pregnant women to help meet their nutrient needs. Many products are available, making it difficult to choose a suitable supplement because little is known about their labeling and contents to evaluate their appropriateness. To determine differences between prescription and nonprescription prenatal supplements available in the United States regarding declared nutrient and nonnutrient ingredients and the presence of dosing and safety-related information. Using two publicly available databases with information about prenatal supplement products, information from prescription and nonprescription product labels were extracted and evaluated. For the 82 prescription and 132 nonprescription products, declared label amounts of seven vitamins and minerals, docosahexaenoic acid (DHA), the presence of other nonnutrient components, and the presence of key safety and informational elements as identified in two Department of Health and Human Services Office of Inspector General (OIG)'s 2003 reports were compiled and compared. Compared with nonprescription products, prescription products contained significantly fewer vitamins (9±0.2 vs 11±0.3; P≤0.05) and minerals (4±0.1 vs 8±0.3; P≤0.05). Declared amounts of folic acid were higher in prescription products, whereas vitamin A, vitamin D, iodine, and calcium were higher in the nonprescription products. Amounts of iron, zinc, and DHA were similar. Virtually all products contained levels of one or more nutrients that exceeded the Recommended Dietary Allowances for pregnant and/or lactating women. Product type also influenced ingredients added. Fewer prescription products contained botanical ingredients (6% prescription vs 33% nonprescription) and probiotics (2% prescription vs 8% nonprescription). Only prescription products contained the stool softener docusate sodium. Our analysis of prenatal supplements indicates that prescription and nonprescription supplements differ in terms

  5. Information content and acoustic structure of male African elephant social rumbles.

    Science.gov (United States)

    Stoeger, Angela S; Baotic, Anton

    2016-06-08

    Until recently, the prevailing theory about male African elephants (Loxodonta africana) was that, once adult and sexually mature, males are solitary and targeted only at finding estrous females. While this is true during the state of 'musth' (a condition characterized by aggressive behavior and elevated androgen levels), 'non-musth' males exhibit a social system seemingly based on companionship, dominance and established hierarchies. Research on elephant vocal communication has so far focused on females, and very little is known about the acoustic structure and the information content of male vocalizations. Using the source and filter theory approach, we analyzed social rumbles of 10 male African elephants. Our results reveal that male rumbles encode information about individuality and maturity (age and size), with formant frequencies and absolute fundamental frequency values having the most informative power. This first comprehensive study on male elephant vocalizations gives important indications on their potential functional relevance for male-male and male-female communication. Our results suggest that, similar to the highly social females, future research on male elephant vocal behavior will reveal a complex communication system in which social knowledge, companionship, hierarchy, reproductive competition and the need to communicate over long distances play key roles.

  6. Plain Language to Communicate Physical Activity Information: A Website Content Analysis.

    Science.gov (United States)

    Paige, Samantha R; Black, David R; Mattson, Marifran; Coster, Daniel C; Stellefson, Michael

    2018-04-01

    Plain language techniques are health literacy universal precautions intended to enhance health care system navigation and health outcomes. Physical activity (PA) is a popular topic on the Internet, yet it is unknown if information is communicated in plain language. This study examined how plain language techniques are included in PA websites, and if the use of plain language techniques varies according to search procedures (keyword, search engine) and website host source (government, commercial, educational/organizational). Three keywords ("physical activity," "fitness," and "exercise") were independently entered into three search engines (Google, Bing, and Yahoo) to locate a nonprobability sample of websites (N = 61). Fourteen plain language techniques were coded within each website to examine content formatting, clarity and conciseness, and multimedia use. Approximately half (M = 6.59; SD = 1.68) of the plain language techniques were included in each website. The keyword "physical activity" located websites with fewer clear and concise plain language techniques, whereas another keyword located websites with more clear and concise techniques; the use of plain language techniques did not vary by search engine or website host source. Accessing PA information that is easy to understand and behaviorally oriented may remain a challenge for users. Transdisciplinary collaborations are needed to optimize plain language techniques while communicating online PA information.

  7. The Information Content of Corridor Volatility Measures During Calm and Turmoil Periods

    Directory of Open Access Journals (Sweden)

    Elyas Elyasiani

    2017-12-01

    Full Text Available Measurement of volatility is of paramount importance in finance because of the effects on risk measurement and risk management. Corridor implied volatility measures allow us to disentangle the volatility of positive returns from that of negative returns, providing investors with additional information beyond standard market volatility. The aim of the paper is twofold. First, to propose different types of corridor implied volatility and some combinations of them as risk indicators, in order to provide useful information about investors’ sentiment and future market returns. Second, to investigate their usefulness in prediction of market returns under different market conditions (with a particular focus on the subprime crisis and the European debt crisis. The data set consists of daily index options traded on the Italian market and covers the 2005–2014 period. We find that upside corridor implied volatility measure embeds the highest information content about contemporaneous market returns, claiming the superiority of call options in measuring current sentiment in the market. Moreover, both upside and downside volatilities can be considered as barometers of investors’ fear. The volatility measures proposed have forecasting power on future returns only during high volatility periods and in particular during the European debt crisis. The explanatory power on future market returns improves when two of the proposed volatility measures are combined together in the same model.
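
    For context, a corridor implied variance restricts the model-free variance integral over out-of-the-money option prices to a strike corridor; the sketch below illustrates that construction numerically with a stylised strike grid and price curve (the bounds, rate and prices are invented, and this is not the authors' estimator).

      import numpy as np

      def corridor_implied_variance(strikes, otm_prices, lo, hi, r, T):
          """Annualised corridor variance: (2 e^{rT}/T) * integral of Q(K)/K^2 over [lo, hi]."""
          mask = (strikes >= lo) & (strikes <= hi)
          K, Q = strikes[mask], otm_prices[mask]   # Q: OTM puts below the forward, calls above
          return 2.0 * np.exp(r * T) / T * np.trapz(Q / K ** 2, K)

      # Toy inputs: a coarse strike grid around a forward of 100.
      strikes = np.linspace(60.0, 140.0, 81)
      otm_prices = 6.0 * np.exp(-((strikes - 100.0) / 15.0) ** 2)   # stylised OTM price curve
      down = corridor_implied_variance(strikes, otm_prices, 60.0, 100.0, r=0.01, T=30 / 365)
      up = corridor_implied_variance(strikes, otm_prices, 100.0, 140.0, r=0.01, T=30 / 365)
      print(np.sqrt(down), np.sqrt(up))     # downside vs upside corridor volatility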

  8. Informative content of clinical symptoms of acute appendicitis in different terms of pregnancy

    Directory of Open Access Journals (Sweden)

    Kutovoy A.B.

    2015-09-01

    Full Text Available With the purpose of evaluating the diagnostic efficacy of some clinical symptoms of acute appendicitis, 75 women in different terms of pregnancy were examined. The informative content of the Kocher-Volkovich, Rovsing, Bartomje-Michelson, Sitkovsky, Gabay, Brendo, Michelson and Ivanov symptoms was studied. Pain syndrome was present in all examined women. Pain localization varied and depended on the pregnancy term. During the I trimester of pregnancy, pain was most often manifested in the epigastrium and right lower quadrant, and more rarely in other abdominal regions. In the II trimester, pain occurred in the right lower quadrant in the majority of cases. During the III trimester, pain prevailed in the right upper quadrant of the abdomen. Analyzing the informative content of the studied symptoms, a significant decrease (p<0.05; p<0.01; p<0.001) of their diagnostic value with growth of the pregnancy term was noted. The Kocher-Volkovich and Rovsing symptoms were the most informative in the I trimester of pregnancy. The diagnostic efficacy of the Brendo (67.3%), Michelson (55.7%) and Ivanov (59.6%) symptoms was higher than that of the Kocher-Volkovich (36.5%), Rovsing (28.8%), Sitkovsky (51.9%) and Bartomje-Michelson (55.7%) symptoms, and their value diminished with increasing pregnancy term.

  9. Fatty acid and sodium contents of commercial milk chocolate – analytical aspects and nutritional information

    Directory of Open Access Journals (Sweden)

    Renato Cesar Susin

    2015-06-01

    Full Text Available Summary: Chocolate consumption is usually associated with enjoyment, milk chocolate desserts being a very popular choice. Besides, the literature provides data suggesting health benefits for chocolate products as compared to non-chocolate candies. However, the lipid composition of cocoa and its commercial products has yet to be completely elucidated and understood, although much research has been carried out with this objective. Contributions to this objective frequently face difficulties in the field of Analytical Chemistry due to the complexity of the composition of such a food. On the other hand, the sodium content of foods is currently a major concern. Thus, this work aims to provide information concerning the composition of commercial milk chocolate in terms of its fatty acid profile and sodium content. To achieve this purpose, analytical adjustments and improvements to the methodology were made and described in this paper. Sodium (FAAS) and a total of 50 fatty acids (GC-FID) were determined in eight samples of milk chocolate bars from different manufacturers. The samples were purchased from retailers in Porto Alegre – Brazil. In the determination of the fatty acids, possible losses during methylation deserved special attention and were studied. Nevertheless, large differences were not found in comparison with the nutritional facts declared on the label. However, the results obtained for sodium demonstrated the importance of food inspection, considering the discrepancies found.

  10. Eye movements during the recollection of text information reflect content rather than the text itself

    DEFF Research Database (Denmark)

    Traub, Franziska; Johansson, Roger; Holmqvist, Kenneth

    Several studies have reported that spontaneous eye movements occur when visuospatial information is recalled from memory. Such gazes closely reflect the content and spatial relations from the original scene layout (e.g., Johansson et al., 2012). However, when someone has originally read a scene description, the memory of the physical layout of the text itself might compete with the memory of the spatial arrangement of the described scene. The present study was designed to address this fundamental issue by having participants read scene descriptions that were manipulated to be either congruent... Recollection was performed orally while gazing at a blank screen. Results demonstrate that participants' gaze patterns during recall more closely reflect the spatial layout of the scene than the physical locations of the text. Memory data provide evidence that mental models representing either the situation...

  11. Subfamily logos: visualization of sequence deviations at alignment positions with high information content

    Directory of Open Access Journals (Sweden)

    Beitz Eric

    2006-06-01

    Full Text Available Background: Recognition of relevant sequence deviations can be valuable for elucidating functional differences between protein subfamilies. Interesting residues at highly conserved positions can then be mutated and experimentally analyzed. However, identification of such sites is tedious because automated approaches are scarce. Results: Subfamily logos visualize subfamily-specific sequence deviations. The display is similar to classical sequence logos but extends into the negative range. Positive, upright characters correspond to residues which are characteristic for the subfamily; negative, upside-down characters to residues typical for the remaining sequences. The symbol height is adjusted to the information content of the alignment position. Residues which are conserved throughout do not appear. Conclusion: Subfamily logos provide an intuitive display of relevant sequence deviations. The method has proven to be valid using a set of 135 aligned aquaporin sequences in which established subfamily-specific positions were readily identified by the algorithm.
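
    One plausible reading of the display described above (a sketch, not the published algorithm) is to scale, for each alignment column, the frequency difference between the subfamily and the remaining sequences by the column's information content, giving positive heights for subfamily-characteristic residues and negative heights otherwise.

      import numpy as np

      AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

      def column_heights(sub_col, rest_col):
          """Signed symbol heights for one alignment column of a subfamily logo."""
          def freqs(column):
              counts = np.array([column.count(a) for a in AMINO_ACIDS], dtype=float)
              return counts / max(1.0, counts.sum())

          f_sub, f_rest = freqs(sub_col), freqs(rest_col)
          f_all = freqs(sub_col + rest_col)
          nonzero = f_all[f_all > 0]
          info = np.log2(len(AMINO_ACIDS)) + np.sum(nonzero * np.log2(nonzero))  # column information content
          return {a: (f_sub[i] - f_rest[i]) * info for i, a in enumerate(AMINO_ACIDS)}

      # Column residues of the subfamily vs the remaining sequences (toy data).
      print(column_heights(sub_col="NNNNN", rest_col="NNDDDDDDDD"))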

  12. The Informational Content of Credit Ratings in Brazil: An Event Study

    Directory of Open Access Journals (Sweden)

    Flávia Cruz de Souza Murcia

    2014-03-01

    Full Text Available This study analyzes the effect of credit rating announcements on stock returns in the Brazilian market during 1997-2011. We conducted an event study using a sample of 242 observations of listed companies, 179 from Standard and Poor’s and 63 from Moody’s, to analyze stock market reaction. Abnormal returns have been computed using the Market Model and CAPM for three windows: three days (-1, +1), 11 days (-5, +5) and 21 days (-10, +10). We find statistically significant abnormal returns in days -1 and 0 for all three types of rating announcement tested: initial rating, downgrades and upgrades. For downgrades, consistent with prior studies, our results also showed negative abnormal returns for practically all windows tested. Overall, our findings show that rating announcements do have information content, as they impact stock returns and cause abnormal returns, especially when they bring ‘bad news’ to the market.
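
    The market-model abnormal returns referred to above are typically computed by estimating alpha and beta over a pre-event window and then cumulating the residual returns over the event window; the sketch below is a generic illustration with simulated daily returns, not the study's data or code.

      import numpy as np

      def car_market_model(stock_ret, market_ret, est_slice, event_slice):
          """Cumulative abnormal return around an event using the market model."""
          x, y = market_ret[est_slice], stock_ret[est_slice]
          beta, alpha = np.polyfit(x, y, 1)                  # OLS over the estimation window
          abnormal = stock_ret[event_slice] - (alpha + beta * market_ret[event_slice])
          return abnormal.sum()

      # Toy daily returns: 120 estimation days followed by a (-10, +10) event window.
      rng = np.random.default_rng(3)
      market = rng.normal(0.0, 0.01, 141)
      stock = 0.001 + 1.2 * market + rng.normal(0.0, 0.02, 141)
      stock[130] -= 0.05                                     # stylised reaction to a downgrade
      print(car_market_model(stock, market, slice(0, 120), slice(120, 141)))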

  13. Aerosol Retrievals from Proposed Satellite Bistatic Lidar Observations: Algorithm and Information Content

    Science.gov (United States)

    Alexandrov, M. D.; Mishchenko, M. I.

    2017-12-01

    Accurate aerosol retrievals from space remain quite challenging and typically involve solving a severely ill-posed inverse scattering problem. We suggested addressing this ill-posedness by flying a bistatic lidar system. Such a system would consist of a formation-flying constellation of a primary satellite equipped with a conventional monostatic (backscattering) lidar and an additional platform hosting a receiver of the scattered laser light. If successfully implemented, this concept would combine the measurement capabilities of a passive multi-angle multi-spectral polarimeter with the vertical profiling capability of a lidar. Thus, bistatic lidar observations will be free of the deficiencies affecting both monostatic lidar measurements (caused by their highly limited information content) and passive photopolarimetric measurements (caused by vertical integration and surface reflection). We present a preliminary aerosol retrieval algorithm for a bistatic lidar system consisting of a high spectral resolution lidar (HSRL) and an additional receiver flown in formation with it at a scattering angle of 165 degrees. This algorithm was applied to synthetic data generated using Mie-theory computations. The model/retrieval parameters in our tests were the effective radius and variance of the aerosol size distribution, the complex refractive index of the particles, and their number concentration. Both mono- and bimodal aerosol mixtures were considered. Our algorithm allowed for definitive evaluation of error propagation from measurements to retrievals using a Monte Carlo technique, which involves random distortion of the observations and statistical characterization of the resulting retrieval errors. Our tests demonstrated that supplementing a conventional monostatic HSRL with an additional receiver dramatically increases the information content of the measurements and allows for a sufficiently accurate characterization of tropospheric aerosols.

  14. Information theory explanation of the fluctuation theorem, maximum entropy production and self-organized criticality in non-equilibrium stationary states

    CERN Document Server

    Dewar, R

    2003-01-01

    Jaynes' information theory formalism of statistical mechanics is applied to the stationary states of open, non-equilibrium systems. First, it is shown that the probability distribution p_Γ of the underlying microscopic phase space trajectories Γ over a time interval of length τ satisfies p_Γ ∝ exp(τσ_Γ/2k_B), where σ_Γ is the time-averaged rate of entropy production of Γ. Three consequences of this result are then derived: (1) the fluctuation theorem, which describes the exponentially declining probability of deviations from the second law of thermodynamics as τ → ∞; (2) the selection principle of maximum entropy production for non-equilibrium stationary states, empirical support for which has been found in studies of phenomena as diverse as the Earth's climate and crystal growth morphology; and (3) the emergence of self-organized criticality for flux-driven systems in the slowly-driven limit. The explanation of these results on general inf...

  15. Teor e capacidade máxima de adsorção de arsênio em Latossolos brasileiros Content and maximum capacity of arsenic adsorption in Brazilian Oxisols

    Directory of Open Access Journals (Sweden)

    Mari Lucia Campos

    2007-12-01

    Full Text Available In view of the high toxicity of As for humans and animals and the possibly large number of contaminated areas, it is imperative to know the semitotal As content in soils considered uncontaminated and the As sorption processes in soils of variable charge. The objective of this work was to determine the total content and maximum capacity of As adsorption (CMADS As) in Oxisols. The total content was determined by the USEPA 3051A method. The CMADS As was determined by Langmuir isotherms based on the adsorption values obtained at six solution concentrations (0, 0.09, 0.19, 0.38, 0.76 and 1.15 mmol L-1; final soil:solution ratio 1:100), at pH 5.5 and an ionic strength of 15 mmol L-1. The 17 Oxisols presented an average total As content of 5.92 mg kg-1 and a mean CMADS As of 2013 mg kg-1. Clay content and Fe and Al oxides had a positive influence on CMADS As.
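
    The maximum adsorption capacity reported here is the plateau parameter of the Langmuir isotherm, q = q_max·K_L·C / (1 + K_L·C); a minimal curve-fitting sketch with invented equilibrium data (not the study's measurements) is shown below.

      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(C, q_max, K_L):
          """Langmuir isotherm: adsorbed amount as a function of equilibrium concentration."""
          return q_max * K_L * C / (1.0 + K_L * C)

      # Hypothetical equilibrium concentrations (mmol L-1) and adsorbed As (mg kg-1).
      C_eq = np.array([0.01, 0.05, 0.12, 0.25, 0.50, 0.90])
      q_obs = np.array([300.0, 900.0, 1400.0, 1700.0, 1900.0, 1980.0])

      (q_max, K_L), _ = curve_fit(langmuir, C_eq, q_obs, p0=[2000.0, 10.0])
      print(f"maximum adsorption capacity ~ {q_max:.0f} mg kg-1, K_L ~ {K_L:.1f} L mmol-1")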

  16. On Neyman-Pearson Theory: Information Content of an Experiment and a Fancy Paradox

    Directory of Open Access Journals (Sweden)

    Benito Vittorio Frosini

    2007-10-01

    Full Text Available Two topics, connected with Neyman-Pearson theory of testing hypotheses, are treated in this article. The first topic is related to the information content of an experiment; after a short outline of ordinal comparability of experiments, the two most popular information measures – by Fisher and by Kullback-Leibler – are considered. As far as we require a comparison of two experiments at a time, the superiority of the couple (a, b) of the two error probabilities in the Neyman-Pearson approach is easily established, owing to their clear operational meaning. The second topic deals with the so called Jeffreys – or Lindley – paradox: it can be shown that, if we attach a positive probability to a point null hypothesis, some «paradoxical» posterior probabilities – in a Bayesian approach – result in sharp contrast with the error probabilities in the Neyman-Pearson approach. It is argued that such results are simply the outcomes of absurd assumptions, and it is shown that sensible assumptions about interval – not point – hypotheses can yield posterior probabilities perfectly compatible with the Neyman-Pearson approach (although one must be very careful in making such comparisons, as the two approaches are radically different both in assumptions and in purposes).

  17. YouTube as a source of information on skin bleaching: a content analysis.

    Science.gov (United States)

    Basch, C H; Brown, A A; Fullwood, M D; Clark, A; Fung, I C-H; Yin, J

    2018-06-01

    Skin bleaching is a common, yet potentially harmful body modification practice. To describe the characteristics of the most widely viewed YouTube™ videos related to skin bleaching. The search term 'skin bleaching' was used to identify the 100 most popular English-language YouTube videos relating to the topic. Both descriptive and specific information were noted. Among the 100 manually coded skin-bleaching YouTube videos in English, there were 21 consumer-created videos, 45 internet-based news videos, 30 television news videos and 4 professional videos. Excluding the 4 professional videos, we limited our content categorization and regression analysis to 96 videos. Approximately 93% (89/96) of the most widely viewed videos mentioned changing how you look, and 74% (71/96) focused on bleaching the whole body. Of the 96 videos, 63 (66%) showed or mentioned a transformation. Only about 14% (13/96) mentioned that skin bleaching is unsafe. The likelihood of a video selling a skin-bleaching product was 17 times higher in internet videos compared with consumer videos (OR = 17.00, 95% CI 4.58-63.09). The most widely viewed YouTube videos on skin bleaching were most often uploaded by internet sources. Videos made by television sources mentioned more information about skin bleaching being unsafe, while consumer-generated videos focused more on making skin-bleaching products at home. © 2017 British Association of Dermatologists.

  18. Drug information, misinformation, and disinformation on social media: a content analysis study.

    Science.gov (United States)

    Al Khaja, Khalid A J; AlKhaja, Alwaleed K; Sequeira, Reginald P

    2018-05-24

    Dissemination of misleading drug information through social media can be detrimental to the health of the public. This study, carried out in Bahrain, evaluated the truthfulness of 22 social media claims about drugs (72.7%), dietary supplements (22.7%), and toxic bisphenol-A (4.5%) that circulated on the WhatsApp platform, treated as case studies. We categorized claims as objectively true, false, or potentially misleading. The content analysis revealed that "potentially misleading" claims were the most frequent messages (59.1%); they tend to exaggerate efficacy or safety without sufficient evidence to substantiate the claims. False claims (27.3%) were likely due to unfair competition or deception. Overall, 13.6% of the messages were objectively true claims that could withstand regulatory scrutiny. The majority of the drug-related messages on social media were potentially misleading or false claims that lacked credible evidence to support them. In the public interest, regulatory authorities should monitor such information disseminated via social media platforms.

  19. Content Analysis of Papers Submitted to Communications in Information Literacy, 2007-2013

    Directory of Open Access Journals (Sweden)

    Christopher V. Hollister

    2014-07-01

    Full Text Available The author conducted a content analysis of papers submitted to the journal, Communications in Information Literacy, from the years 2007-2013. The purpose was to investigate and report on the overall quality characteristics of a statistically significant sample of papers submitted to a single-topic, open access, library and information science (LIS journal. Characteristics of manuscript submissions, authorship, reviewer evaluations, and editorial decisions were illuminated to provide context; particular emphasis was given to the analysis of major criticisms found in reviewer evaluations of rejected papers. Overall results were compared to previously published research. The findings suggest a trend in favor of collaborative authorship, and a possible trend toward a more practice-based literature. The findings also suggest a possible deterioration in some of the skills that are required of LIS authors relative to the preparation of scholarly papers. The author discusses potential implications for authors and the disciplinary literature, recommends directions for future research, and where possible, provides recommendations for the benefit of the greater community of LIS scholars.

  20. Embedding QR codes in tumor board presentations, enhancing educational content for oncology information management.

    Science.gov (United States)

    Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark

    2011-01-01

    Quick Response (QR) Codes are standard in supply management and seen with increasing frequency in advertisements. They are now present regularly in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by the Toyota car company, are free of license and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by "smart" phones and tablets using either the iOS or Android platforms, which link the device with the information represented by the QR code (uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS] and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses the benefits, the different QR codes from Web links and how QR codes facilitate the distribution of educational content.
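
    With a general-purpose library such as the qrcode Python package (an assumption about tooling, not necessarily what the authors used), embedding a link in a slide reduces to generating an image and dropping it into the presentation; the URL below is a placeholder.

      import qrcode

      # Encode a link to supplementary educational content for a tumor board slide.
      img = qrcode.make("https://example.org/tumor-board/case-42/guidelines")
      img.save("case42_qr.png")   # insert the saved PNG into the presentation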

  1. Building Information Modelling and Standardised Construction Contracts: a Content Analysis of the GC21 Contract

    Directory of Open Access Journals (Sweden)

    Aaron Manderson

    2015-08-01

    Full Text Available Building Information Modelling (BIM is seen as a panacea to many of the ills confronting the Architectural, Engineering and Construction (AEC sector. In spite of its well documented benefits the widespread integration of BIM into the project lifecycle is yet to occur. One commonly identified barrier to BIM adoption is the perceived legal risks associated with its integration, coupled with the need for implementation in a collaborative environment. Many existing standardised contracts used in the Australian AEC industry were drafted before the emergence of BIM. As BIM continues to become ingrained in the delivery process the shortcomings of these existing contracts have become apparent. This paper reports on a study that reviewed and consolidated the contractual and legal concerns associated with BIM implementation. The findings of the review were used to conduct a qualitative content analysis of the GC21 2nd edition, an Australian standardised construction contract, to identify possible changes to facilitate the implementation of BIM in a collaborative environment. The findings identified a number of changes including the need to adopt a collaborative contract structure with equitable risk and reward mechanisms, recognition of the model as a contract document and the need for standardisation of communication/information exchange.

  2. Phylogenetic Information Content of Copepoda Ribosomal DNA Repeat Units: ITS1 and ITS2 Impact

    Science.gov (United States)

    Zagoskin, Maxim V.; Lazareva, Valentina I.; Grishanin, Andrey K.; Mukha, Dmitry V.

    2014-01-01

    The utility of various regions of the ribosomal repeat unit for phylogenetic analysis was examined in 16 species representing four families, nine genera, and two orders of the subclass Copepoda (Crustacea). Fragments approximately 2000 bp in length containing the ribosomal DNA (rDNA) 18S and 28S gene fragments, the 5.8S gene, and the internal transcribed spacer regions I and II (ITS1 and ITS2) were amplified and analyzed. The DAMBE (Data Analysis in Molecular Biology and Evolution) software was used to analyze the saturation of nucleotide substitutions; this test revealed the suitability of both the 28S gene fragment and the ITS1/ITS2 rDNA regions for the reconstruction of phylogenetic trees. Distance (minimum evolution) and probabilistic (maximum likelihood, Bayesian) analyses of the data revealed that the 28S rDNA and the ITS1 and ITS2 regions are informative markers for inferring phylogenetic relationships among families of copepods and within the Cyclopidae family and associated genera. Split-graph analysis of concatenated ITS1/ITS2 rDNA regions of cyclopoid copepods suggested that the Mesocyclops, Thermocyclops, and Macrocyclops genera share complex evolutionary relationships. This study revealed that the ITS1 and ITS2 regions potentially represent different phylogenetic signals. PMID:25215300
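
    The study itself used DAMBE plus minimum evolution, maximum likelihood, and Bayesian analyses on real ITS/28S data; as a simplified stand-in for the distance-based step only, the sketch below builds a neighbor-joining tree from a toy alignment with Biopython. The sequences and taxon names are placeholders, not the study's data.

    # Simplified distance-based tree sketch (not the authors' pipeline).
    from Bio.Align import MultipleSeqAlignment
    from Bio.Seq import Seq
    from Bio.SeqRecord import SeqRecord
    from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor
    from Bio import Phylo

    # Toy alignment standing in for aligned ITS1/ITS2 fragments.
    aln = MultipleSeqAlignment([
        SeqRecord(Seq("ACGTACGTAC"), id="Mesocyclops"),
        SeqRecord(Seq("ACGTACGTTC"), id="Thermocyclops"),
        SeqRecord(Seq("ACGAACGTAC"), id="Macrocyclops"),
        SeqRecord(Seq("TCGAACGTAA"), id="Outgroup"),
    ])

    dm = DistanceCalculator("identity").get_distance(aln)  # pairwise p-distances
    tree = DistanceTreeConstructor().nj(dm)                # neighbor-joining tree
    Phylo.draw_ascii(tree)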

  3. Information content of slug tests for estimating hydraulic properties in realistic, high-conductivity aquifer scenarios

    Science.gov (United States)

    Cardiff, Michael; Barrash, Warren; Thoma, Michael; Malama, Bwalya

    2011-06-01

    A recently developed unified model for partially-penetrating slug tests in unconfined aquifers (Malama et al., in press) provides a semi-analytical solution for aquifer response at the wellbore in the presence of inertial effects and wellbore skin, and is able to model the full range of responses from overdamped/monotonic to underdamped/oscillatory. While the model provides a unifying framework for realistically analyzing slug tests in aquifers (with the ultimate goal of determining aquifer properties such as hydraulic conductivity K and specific storage Ss), it is currently unclear whether parameters of this model can be well-identified without significant prior information and, thus, what degree of information content can be expected from such slug tests. In this paper, we examine the information content of slug tests in realistic field scenarios with respect to estimating aquifer properties, through analysis of both numerical experiments and field datasets. First, through numerical experiments using Markov Chain Monte Carlo methods for gauging parameter uncertainty and identifiability, we find that: (1) as noted by previous researchers, estimation of aquifer storage parameters using slug test data is highly unreliable and subject to significant uncertainty; (2) joint estimation of aquifer and skin parameters contributes to significant uncertainty in both unless prior knowledge is available; and (3) similarly, without prior information joint estimation of both aquifer radial and vertical conductivity may be unreliable. These results have significant implications for the types of information that must be collected prior to slug test analysis in order to obtain reliable aquifer parameter estimates. For example, plausible estimates of aquifer anisotropy ratios and bounds on wellbore skin K should be obtained, if possible, a priori. Secondly, through analysis of field data - consisting of over 2500 records from partially-penetrating slug tests in a
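
    To illustrate the general idea of gauging parameter uncertainty with Markov Chain Monte Carlo, the sketch below runs a random-walk Metropolis sampler for a single parameter (log10 K) against synthetic observations. The exponential-decay "forward model", error level, and prior bounds are stand-ins for illustration; they are not the semi-analytical slug-test solution or data analyzed in the paper.

    # Minimal random-walk Metropolis sketch for parameter uncertainty.
    import numpy as np

    rng = np.random.default_rng(0)

    def forward(logK, t):
        # Placeholder head-response model: faster recovery for higher conductivity.
        return np.exp(-(10.0 ** logK) * t)

    t = np.linspace(0.0, 60.0, 30)                         # seconds
    obs = forward(-2.0, t) + rng.normal(0, 0.02, t.size)   # synthetic observations
    sigma = 0.02

    def log_post(logK):
        if not -6.0 < logK < 0.0:                          # uniform prior bounds
            return -np.inf
        resid = obs - forward(logK, t)
        return -0.5 * np.sum((resid / sigma) ** 2)         # Gaussian log-likelihood

    chain, current = [], -3.0
    for _ in range(5000):
        proposal = current + rng.normal(0, 0.1)            # random-walk proposal
        if np.log(rng.uniform()) < log_post(proposal) - log_post(current):
            current = proposal
        chain.append(current)

    print("posterior mean log10(K):", np.mean(chain[1000:]))
    print("posterior std  log10(K):", np.std(chain[1000:]))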

  4. Maximum entropy methods

    International Nuclear Information System (INIS)

    Ponman, T.J.

    1984-01-01

    For some years now two different expressions have been in use for maximum entropy image restoration and there has been some controversy over which one is appropriate for a given problem. Here two further entropies are presented and it is argued that there is no single correct algorithm. The properties of the four different methods are compared using simple 1D simulations with a view to showing how they can be used together to gain as much information as possible about the original object. (orig.)

  5. A content analysis of depression-related discourses on Sina Weibo: attribution, efficacy, and information sources.

    Science.gov (United States)

    Pan, Jiabao; Liu, Bingjie; Kreps, Gary L

    2018-06-20

    Depression is a mood disorder that may lead to severe outcomes including mental breakdown, self-injury, and suicide. Potential causes of depression include genetic, sociocultural, and individual-level factors. However, public understandings of depression guided by a complex interplay of media and other societal discourses might not be congruent with the scientific knowledge. Misunderstandings of depression can lead to under-treatment and stigmatization of depression. Against this backdrop, this study aims to achieve a holistic understanding of the patterns and dynamics in discourses about depression from various information sources in China by looking at related posts on social media. A content analysis was conducted with 902 posts about depression randomly selected within a three-year period (2014 to 2016) on the mainstream social media platform in China, Sina Weibo. Posts were analyzed with a focus on attributions of and solutions to depression, attitudes towards depression, and efficacy indicated by the posts across various information sources. Results suggested that depression was most often attributed to individual-level factors. Across all the sources, individual-level attributions were often adopted by state-owned media whereas health and academic experts and organizations most often mentioned biological causes of depression. Citizen journalists and unofficial social groups tended to make societal-level attributions. Overall, traditional media posts suggested the lowest efficacy in coping with depression and the most severe negative outcomes as compared with other sources. The dominance of individual-level attributions and solutions regarding depression on Chinese social media on one hand manifests the public's limited understanding of depression and on the other hand, may further constrain adoption of scientific explanations about depression and exacerbate stigmatization towards depressed individuals. Mass media's posts centered on description of severe

  6. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    Science.gov (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
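
    As a minimal sketch of the TLA idea only (not of the paper's RDA-PCNM comparison), the code below computes mean Bray-Curtis dissimilarity at each time lag from a synthetic community matrix and regresses it on the square root of the lag, a common TLA formulation. The data are placeholders.

    # Minimal time-lag-analysis sketch on a synthetic 20-date x 8-taxon matrix.
    import numpy as np
    from scipy.spatial.distance import braycurtis
    from scipy.stats import linregress

    rng = np.random.default_rng(1)
    community = rng.poisson(5, size=(20, 8)).astype(float)

    lags, dissim = [], []
    n = community.shape[0]
    for lag in range(1, n):
        d = [braycurtis(community[i], community[i + lag]) for i in range(n - lag)]
        lags.append(np.sqrt(lag))        # TLA conventionally uses sqrt(time lag)
        dissim.append(np.mean(d))

    slope, intercept, r, p, se = linregress(lags, dissim)
    print(f"TLA slope={slope:.3f}, r^2={r**2:.3f}, p={p:.3f}")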

  7. Information Content Analysis for Selection of Optimal JWST  Observing Modes for Transiting Exoplanet Atmospheres

    Energy Technology Data Exchange (ETDEWEB)

    Batalha, Natasha E. [Department of Astronomy and Astrophysics, Pennsylvania State University, State College, PA 16802 (United States); Line, M. R., E-mail: neb149@psu.edu [School of Earth and Space Exploration, Arizona State University, Phoenix, AZ 85282 (United States)

    2017-04-01

    The James Webb Space Telescope (JWST) is nearing its launch date of 2018, and is expected to revolutionize our knowledge of exoplanet atmospheres. In order to specifically identify which observing modes will be most useful for characterizing a diverse range of exoplanetary atmospheres, we use an information content (IC) based approach commonly used in the studies of solar system atmospheres. We develop a system based upon these IC methods to trace the instrumental and atmospheric model phase space in order to identify which observing modes are best suited for particular classes of planets, focusing on transmission spectra. Specifically, the atmospheric parameter space we cover is T = 600–1800 K, C/O = 0.55–1, [M/H] = 1–100 × Solar for an R = 1.39 R_J, M = 0.59 M_J planet orbiting a WASP-62-like star. We also explore the influence of a simplified opaque gray cloud on the IC. We find that obtaining broader wavelength coverage over multiple modes is preferred over higher precision in a single mode given the same amount of observing time. Regardless of the planet temperature and composition, the best modes for constraining terminator temperatures, C/O ratios, and metallicity are NIRISS SOSS+NIRSpec G395. If the target’s host star is dim enough such that the NIRSpec prism is applicable, then it can be used instead of NIRISS SOSS+NIRSpec G395. Lastly, observations that use more than two modes should be carefully analyzed because sometimes the addition of a third mode results in no gain of information. In these cases, higher precision in the original two modes is favorable.
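
    Information-content analyses of this kind commonly use a Shannon/Rodgers-style metric that compares prior and posterior parameter covariances; the sketch below shows that generic quantity with placeholder matrices. It is an assumption-labeled illustration of the IC concept, not the specific retrieval setup of this paper.

    # Generic information content, H = 0.5 * log2(|S_prior| / |S_post|), in bits.
    import numpy as np

    def information_content_bits(S_prior, S_post):
        _, logdet_prior = np.linalg.slogdet(S_prior)
        _, logdet_post = np.linalg.slogdet(S_post)
        return 0.5 * (logdet_prior - logdet_post) / np.log(2.0)

    S_prior = np.diag([1.0, 1.0, 1.0])   # broad priors on e.g. T, C/O, [M/H]
    S_post = np.diag([0.05, 0.2, 0.1])   # tighter constraints after an observation
    print(information_content_bits(S_prior, S_post), "bits gained")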

  8. Correlating Information Contents of Gene Ontology Terms to Infer Semantic Similarity of Gene Products

    Directory of Open Access Journals (Sweden)

    Mingxin Gan

    2014-01-01

    Full Text Available Successful applications of the gene ontology to the inference of functional relationships between gene products in recent years have raised the need for computational methods to automatically calculate semantic similarity between gene products based on semantic similarity of gene ontology terms. Nevertheless, existing methods, though having been widely used in a variety of applications, may significantly overestimate semantic similarity between genes that are actually not functionally related, thereby yielding misleading results in applications. To overcome this limitation, we propose to represent a gene product as a vector that is composed of information contents of gene ontology terms annotated for the gene product, and we suggest calculating similarity between two gene products as the relatedness of their corresponding vectors using three measures: Pearson’s correlation coefficient, cosine similarity, and the Jaccard index. We focus on the biological process domain of the gene ontology and annotations of yeast proteins to study the effectiveness of the proposed measures. Results show that semantic similarity scores calculated using the proposed measures are more consistent with known biological knowledge than those derived using a list of existing methods, suggesting the effectiveness of our method in characterizing functional relationships between gene products.
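
    A minimal sketch of the comparison step: each gene product is a vector of information contents over a shared GO-term index, and the two vectors are compared with Pearson's correlation, cosine similarity, and a Jaccard index. The IC values below are placeholders, and the element-wise min/max form of Jaccard is one common generalization for non-negative vectors; the paper's exact variant may differ.

    # Comparing two gene products represented as GO-term information-content vectors.
    import numpy as np
    from scipy.stats import pearsonr

    ic_gene_a = np.array([2.1, 0.0, 3.4, 1.2, 0.0])   # placeholder IC values
    ic_gene_b = np.array([1.8, 0.5, 3.0, 0.0, 0.0])

    pearson = pearsonr(ic_gene_a, ic_gene_b)[0]
    cosine = ic_gene_a @ ic_gene_b / (np.linalg.norm(ic_gene_a) * np.linalg.norm(ic_gene_b))
    # Weighted (min/max) Jaccard for non-negative vectors.
    jaccard = np.minimum(ic_gene_a, ic_gene_b).sum() / np.maximum(ic_gene_a, ic_gene_b).sum()

    print(f"Pearson={pearson:.3f}, cosine={cosine:.3f}, Jaccard={jaccard:.3f}")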

  9. The importance of metadata to assess information content in digital reconstructions of neuronal morphology.

    Science.gov (United States)

    Parekh, Ruchi; Armañanzas, Rubén; Ascoli, Giorgio A

    2015-04-01

    Digital reconstructions of axonal and dendritic arbors provide a powerful representation of neuronal morphology in formats amenable to quantitative analysis, computational modeling, and data mining. Reconstructed files, however, require adequate metadata to identify the appropriate animal species, developmental stage, brain region, and neuron type. Moreover, experimental details about tissue processing, neurite visualization and microscopic imaging are essential to assess the information content of digital morphologies. Typical morphological reconstructions only partially capture the underlying biological reality. Tracings are often limited to certain domains (e.g., dendrites and not axons), may be incomplete due to tissue sectioning, imperfect staining, and limited imaging resolution, or can disregard aspects irrelevant to their specific scientific focus (such as branch thickness or depth). Gauging these factors is critical in subsequent data reuse and comparison. NeuroMorpho.Org is a central repository of reconstructions from many laboratories and experimental conditions. Here, we introduce substantial additions to the existing metadata annotation aimed to describe the completeness of the reconstructed neurons in NeuroMorpho.Org. These expanded metadata form a suitable basis for effective description of neuromorphological data.

  10. The number of cell types, information content, and the evolution of complex multicellularity

    Directory of Open Access Journals (Sweden)

    Karl J. Niklas

    2014-12-01

    Full Text Available The number of different cell types (NCT) characterizing an organism is often used to quantify organismic complexity. This method results in the tautology that more complex organisms have a larger number of different kinds of cells, and that organisms with more different kinds of cells are more complex. This circular reasoning can be avoided (and simultaneously tested) when NCT is plotted against different measures of organismic information content (e.g., genome or proteome size). This approach is illustrated by plotting the NCT of representative diatoms, green and brown algae, land plants, invertebrates, and vertebrates against data for genome size (number of base-pairs), proteome size (number of amino acids), and proteome functional versatility (number of intrinsically disordered protein domains or residues). Statistical analyses of these data indicate that increases in NCT fail to keep pace with increases in genome size, but exceed a one-to-one scaling relationship with increasing proteome size and with increasing numbers of intrinsically disordered protein residues. We interpret these trends to indicate that comparatively small increases in proteome (and not genome) size are associated with disproportionate increases in NCT, and that proteins with intrinsically disordered domains enhance cell type diversity and thus contribute to the evolution of complex multicellularity.
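
    The scaling comparison amounts to fitting log NCT against log proteome size and asking whether the exponent exceeds 1 (one-to-one scaling). A sketch of that regression is shown below on synthetic placeholder values, not the taxa analyzed in the paper.

    # Log-log scaling sketch: regress log(NCT) on log(proteome size).
    import numpy as np
    from scipy.stats import linregress

    rng = np.random.default_rng(2)
    proteome_size = np.logspace(5, 7.5, 25)                          # amino acids
    nct = 0.001 * proteome_size ** 1.3 * rng.lognormal(0, 0.2, 25)   # synthetic, exponent > 1

    fit = linregress(np.log10(proteome_size), np.log10(nct))
    print(f"scaling exponent = {fit.slope:.2f} +/- {fit.stderr:.2f}")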

  11. Students' Attitudes to Information in the Press: Critical Reading of a Newspaper Article With Scientific Content

    Science.gov (United States)

    Oliveras, B.; Márquez, C.; Sanmartí, N.

    2014-08-01

    This research analyses what happens when a critical reading activity based on a press article dealing with an energy-related problem is implemented with two groups of students, aged 13-14 and 16-17 years, in the same school (a total of 117 students). Specifically, the research analyses the students' profiles from the standpoint of their attitudes to the information given in the news story and the use they make of it when writing an argumentative text. It also analyses the difficulties the students have when it comes to applying their knowledge about energy in a real-life context. Lastly, some strategies are suggested for helping students to critically analyse the scientific content of a newspaper article. Three reader profiles were identified (the credulous reader, the ideological reader and the critical reader). No significant differences were found in reading profiles in terms of age or scientific knowledge. The findings show that the activity helped to link science learning in school with facts relating to an actual context, particularly in the case of students with more science knowledge.

  12. Health Information Needs and Health Seeking Behavior During the 2014-2016 Ebola Outbreak: A Twitter Content Analysis.

    Science.gov (United States)

    Odlum, Michelle; Yoon, Sunmoo

    2018-03-23

    For effective public communication during major disease outbreaks like the 2014-2016 Ebola epidemic, health information needs of the population must be adequately assessed. Through content analysis of social media data, like tweets, public health information needs can be effectively assessed and, in turn, appropriate health information can be provided to address such needs. The aim of the current study was to assess health information needs about Ebola, at distinct epidemic time points, through longitudinal tracking. Natural language processing was applied to explore public response to Ebola over time from July 2014 to March 2015. A total of 155,647 tweets (68,736 unique, 86,911 retweets) mentioning Ebola were analyzed and visualized with infographics. Public fear, frustration, and health information seeking regarding Ebola-related global priorities were observed across time. Our longitudinal content analysis revealed that, owing to ongoing health information deficiencies that fueled fear and frustration, social media was at times an impediment rather than a vehicle for supporting health information needs. Content analysis of tweets effectively assessed Ebola information needs. Our study also demonstrates the use of Twitter as a method for capturing real-time data to assess ongoing information needs, fear, and frustration over time.

  13. The Information Content of Note Disclosures and MD&A Information in the Financial Report – A Study of Market Reactions in Denmark

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Sønderby Jeppesen, Carsten; Madsen, Kasper

    2015-01-01

    The preparation of disclosures in the financial report constitutes a significant cost to most companies, but do the disclosures have information content to investors? This paper examines stock market reactions to the release of note disclosures and MD&A (management discussion and analysis......) information. The study is based on data from the Danish capital market in 2006-2009 because here it is largely possible to isolate the release of such information from other information in the financial report. The primary results suggest that for some companies, note disclosures and information in the MD...

  14. A content analysis of visual cancer information: prevalence and use of photographs and illustrations in printed health materials.

    Science.gov (United States)

    King, Andy J

    2015-01-01

    Researchers and practitioners have an increasing interest in visual components of health information and health communication messages. This study contributes to this evolving body of research by providing an account of the visual images and information featured in printed cancer communication materials. Using content analysis, 147 pamphlets and 858 images were examined to determine how frequently images are used in printed materials, what types of images are used, what information is conveyed visually, and whether or not current recommendations for the inclusion of visual content were being followed. Although visual messages were found to be common in printed health materials, existing recommendations about the inclusion of visual content were only partially followed. Results are discussed in terms of how relevant theoretical frameworks in the areas of behavior change and visual persuasion seem to be used in these materials, as well as how more theory-oriented research is necessary in visual messaging efforts.

  15. Assessment of the contents related to screening on Portuguese language websites providing information on breast and prostate cancer

    Directory of Open Access Journals (Sweden)

    Daniel Ferreira

    2013-11-01

    Full Text Available The objective of this study was to assess the quality of the contents related to screening in a sample of websites providing information on breast and prostate cancer in the Portuguese language. The first 200 results of each cancer-specific Google search were considered. The accuracy of the screening contents was defined in accordance with the state of the art, and its readability was assessed. Most websites mentioned mammography as a method for breast cancer screening (80%), although only 28% referred to it as the only recommended method. Almost all websites mentioned PSA evaluation as a possible screening test, but correct information regarding its effectiveness was given in less than 10%. For both breast and prostate cancer screening contents, the potential for overdiagnosis and false positive results was seldom addressed, and the median readability index was approximately 70. There is ample margin for improving the quality of websites providing information on breast and prostate cancer in Portuguese.

  16. Chiropractic wellness on the web: the content and quality of information related to wellness and primary prevention on the Internet

    Directory of Open Access Journals (Sweden)

    Evans Marion

    2011-02-01

    Full Text Available Abstract Background The Internet has become a common source of information for patients wishing to learn about health topics. Previous studies found information related to back pain poor and often contradictory to current guidelines. Wellness has become a common topic in the field of chiropractic, and accrediting agencies have standards on delivery of wellness-based content in college curricula as well as directives for clinical applications. The purpose of this study was to evaluate the quality of the information on the Internet using the terms "chiropractic wellness," or "wellness chiropractic". Methods Five commonly used search engines were selected and the first 10 sites found using the strategy above were evaluated by two raters. Demographic assessments of the sites were made along with whether they were Health on the Net Foundation (HON) certified, contained standard wellness content, mentioned any Healthy People Focus Areas, and other chiropractic topics. Kappa statistics compared inter-rater agreement. Results Potential patients appeared to be the audience 87% of the time and a private doctor of chiropractic appeared to be the typical site owner. The sites usually promoted the provider. No sites displayed the HON certification logo nor did any appear to meet the HON certification criteria. Twenty-six sites (55%) promoted regular physical activity in some manner and 18 (38%) had information on health risks of tobacco. Four (9%) had mental health or stress-reduction content but none had information supportive of vaccination. Some had information contradictory to common public health measures. Conclusions Patients searching the Internet for chiropractic wellness information will often find useless information that will not help them maintain health or become well. Most simply market the chiropractic practice or allow patients to provide personal information in exchange for more 'wellness' information. More research should be done on how
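
    Inter-rater agreement of the kind reported here is typically quantified with Cohen's kappa; a minimal sketch with hypothetical yes/no codes (not the study's ratings) is shown below.

    # Minimal Cohen's kappa sketch for two raters' binary codes (hypothetical data).
    from sklearn.metrics import cohen_kappa_score

    rater_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # e.g. "site promotes physical activity"
    rater_2 = [1, 1, 0, 1, 1, 0, 1, 0, 0, 1]

    print("Cohen's kappa:", cohen_kappa_score(rater_1, rater_2))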

  17. Observability considerations for multi-sensor and product fusion: Bias, information content, and validation (Invited)

    Science.gov (United States)

    Reid, J. S.; Zhang, J.; Hyer, E. J.; Campbell, J. R.; Christopher, S. A.; Ferrare, R. A.; Leptoukh, G. G.; Stackhouse, P. W.

    2009-12-01

    With the successful development of many aerosol products from the NASA A-train as well as new operational geostationary and polar orbiting sensors, the scientific community now has a host of new parameters to use in their analyses. The variety and quality of products have reached a point where the community has moved from basic observation-based science to sophisticated multi-component research that addresses the complex atmospheric environment. In order for these satellite data to contribute to the science, their uncertainty levels must move from semi-quantitative to quantitative. Initial attempts to quantify uncertainties have led to some recent debate in the community as to the efficacy of aerosol products from current and future NASA satellite sensors. In an effort to understand the state of satellite product fidelity, the Naval Research Laboratory and a newly reformed Global Energy and Water Cycle Experiment (GEWEX) aerosol panel have both initiated assessments of the nature of aerosol remote sensing uncertainty and bias. In this talk we go over areas of specific concern based on the authors’ experiences with the data, emphasizing the multi-sensor problem. We first enumerate potential biases, including retrieval, sampling/contextual, and cognitive bias. We show examples of how these biases can subsequently lead to the pitfalls of correlated/compensating errors, tautology, and confounding. The nature of bias is closely related to the information content of the sensor signal and its subsequent application to the derived aerosol quantity of interest (e.g., optical depth, flux, index of refraction, etc.). Consequently, purpose-specific validation methods must be employed, especially when generating multi-sensor products. Indeed, cloud and lower boundary condition biases in particular complicate the more typical methods of regressional bias elimination and histogram matching. We close with a discussion of sequestration of uncertainty in multi-sensor applications of

  18. Effects of input data information content on the uncertainty of simulating water resources

    Science.gov (United States)

    Camargos, Carla; Julich, Stefan; Bach, Martin; Breuer, Lutz

    2017-04-01

    Hydrological models like the Soil and Water Assessment Tool (SWAT) demand a large variety of spatial input data. These are commonly available in different resolutions and result from different preprocessing methodologies. Effort is made to apply data as specific as possible for the study area, which features heterogeneous landscape elements. Most often, modelers prefer to use regional data, especially with fine resolution, which is not always available. Instead, global datasets are considered that are more general. This study investigates how the use of global and regional input datasets may affect the simulation performance and uncertainty of the model. We analyzed eight different setups for the SWAT model, combining two each of Digital Elevation Models (DEMs), soil maps, and land use maps of diverse spatial resolution and information content. The models were calibrated to discharge at two stations across the mesoscale Haute-Sûre catchment, which is partly located in the north of Luxembourg and partly in the southeast of Belgium. The region is a rural area of about 743 km2 and mainly covered by forests, complex agricultural systems, and arable land. As part of the catchment, the Upper-Sûre Lake is an important source of drinking water for the Luxembourgish population, satisfying 30% of the country's demand. The Metropolis Markov Chain Monte Carlo algorithm implemented in the SPOTPY python package was used to infer posterior parameter distributions and assess parameter uncertainty. We optimized the mean of the Nash-Sutcliffe Efficiency (NSE) and the logarithm of NSE. We focused on soil physical, groundwater, main channel, land cover management and basin physical process parameters. Preliminary results indicate that the model has the best performance when using the regional DEM and land use map and the global soil map, indicating that SWAT cannot necessarily make use of additional soil information if it does not substantially affect soil hydrological fluxes
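
    The calibration objective combines the Nash-Sutcliffe Efficiency and its log-transformed variant; a sketch of those two functions on illustrative discharge arrays (not the study's data or its SPOTPY setup) is given below.

    # Nash-Sutcliffe efficiency and log-NSE, averaged as a calibration objective.
    import numpy as np

    def nse(obs, sim):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def log_nse(obs, sim, eps=1e-6):
        return nse(np.log(np.asarray(obs) + eps), np.log(np.asarray(sim) + eps))

    obs = [2.3, 3.1, 5.6, 4.2, 2.9]          # illustrative discharge values
    sim = [2.0, 3.4, 5.1, 4.5, 3.0]
    objective = 0.5 * (nse(obs, sim) + log_nse(obs, sim))   # mean of NSE and logNSE
    print(objective)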

  19. Provision of information to consumers about the calorie content of alcoholic drinks: did the Responsibility Deal pledge by alcohol retailers and producers increase the availability of calorie information?

    Science.gov (United States)

    Petticrew, M; Douglas, N; Knai, C; Maani Hessari, N; Durand, M A; Eastmure, E; Mays, N

    2017-08-01

    Alcohol is a significant source of dietary calories and is a contributor to obesity. Industry pledges to provide calorie information to consumers have been cited as reasons for not introducing mandatory ingredient labelling. As part of the Public Health Responsibility Deal (RD) in England, alcohol retailers and producers committed to providing consumers with information on the calorie content of alcoholic drinks. This study examines what was achieved following this commitment and considers the implications for current industry commitments to provide information on alcohol calories. Analysis of RD pledge delivery plans and progress reports. Assessment of calorie information in supermarkets and in online stores. (i) Analysis of the content of pledge delivery plans and annual progress reports of RD signatories to determine what action they had committed to, and had taken, to provide calorie information. (ii) Analysis of the availability of calorie information on product labels; in UK supermarkets; and on online shopping sites and websites. No information was provided in any of 55 stores chosen to represent all the main UK supermarkets. Calorie information was not routinely provided on supermarkets' websites, or on product labels. One of the stated purposes of the RD was to provide consumers with the information to make informed health-related choices, including providing information on the calorie content of alcoholic drinks. This study indicates that this did not take place to any significant extent. The voluntary implementation of alcohol calorie labelling by industry needs to continue to be carefully monitored to determine whether and how it is done. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  20. Here's an idea: ask the users! Young people's views on navigation, design and content of a health information website.

    Science.gov (United States)

    Franck, Linda S; Noble, Genevieve

    2007-12-01

    Use of the internet to provide health information to young people is a relatively recent development. Few studies have explored young people's views on how they use internet health websites. This study investigated the navigation, design and content preferences of young people using the Children First for Health (CFfH) website. Young people from five secondary schools completed an internet site navigation exercise, website evaluation questionnaire and participated in informal discussions. Of the participants, 45 percent visited the website section aimed at older adolescents within their first two clicks, regardless of their age. There were conflicting preferences for design and strong preference for gender-specific information on topics such as appearance, relationships, fitness and sexual health. The findings indicate the importance of gaining young people's views to ensure that health information websites meet the needs of their intended audience. Cooperation from schools can facilitate the process of gaining young people's views on internet website navigation, design and content.

  1. Sex in advertising research: a review of content, effects, and functions of sexual information in consumer advertising.

    Science.gov (United States)

    Reichert, Tom

    2002-01-01

    This article is a review of academic research on the content and effects of sexual information in advertising (i.e., sex in advertising). In addition to covering common types of sexual content analyzed in research, inquiry on processing and emotional response effects is reviewed. Several areas for continued research are identified, especially with regard to advertisers' use of sexual outcomes as reasons for using brands and the ability of sexual information to influence brand perceptions. This review has applicability to advertising and marketing research and practice, as well as to any area that employs sexual information for persuasive purposes (e.g., safer-sex social marketing campaigns). In addition, it is hoped that sex researchers will recognize and elaborate on the role of sexual response identified in this research to further inform advertising theory and effects research.

  2. The Content and Quality of Health Information on the Internet for Patients and Families on Adult Kidney Cancer.

    Science.gov (United States)

    Alsaiari, Ahmed; Joury, Abdulaziz; Aljuaid, Mossab; Wazzan, Mohammed; Pines, Jesse M

    2017-12-01

    The Internet is one of the major sources for health information for patients and their families, particularly when patients face serious life-threatening conditions such as kidney cancer in adults. In this study, we evaluate the content and quality of health information on adult kidney cancer using several validated instruments. We accessed the three most popular search engines (Google, Yahoo, Bing), using two terms: "kidney cancer" and "renal cell carcinoma," and reviewed the top 30 hits. After exclusion of duplicated websites, websites targeting health care professionals, and unrelated websites, 35 websites were included. Content was assessed using a 22-item checklist adapted from the American Cancer Society. We assessed website quality using the DISCERN questionnaire, HONcode and JAMA benchmark criteria, readability using three readability scores, and ALEXA for global traffic ranking systems. The average website had 16 of 22 content items while 6 websites fulfilled all 22 items. Among all websites, the average DISCERN quality score was 42 out of 80, 15 (42.8 %) of websites had HONcode certification, and only 3 (8.5 %) fulfilled all JAMA benchmark criteria. The average website readability was at the ninth grade reading level. The content and quality of health-related information on the Internet for adult kidney cancer are variable in comprehensiveness and quality. Many websites are difficult to read without a high school education. A standardized approach to presenting cancer information on the Internet for patients and families may be warranted.
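
    The readability assessment relies on grade-level indices; one common choice is the Flesch-Kincaid grade level, sketched below with a crude syllable-counting heuristic. The study's exact indices and tools are not specified here, so this is an illustrative approximation only.

    # Rough Flesch-Kincaid grade-level sketch (heuristic syllable counter).
    import re

    def count_syllables(word):
        groups = re.findall(r"[aeiouy]+", word.lower())
        return max(1, len(groups))

    def fk_grade(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59

    sample = "Renal cell carcinoma is the most common type of kidney cancer in adults."
    print(round(fk_grade(sample), 1))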

  3. Library and Information Science Research Areas: A Content Analysis of Articles from the Top 10 Journals 2007-8

    Science.gov (United States)

    Aharony, Noa

    2012-01-01

    The current study seeks to describe and analyze journal research publications in the top 10 Library and Information Science journals from 2007-8. The paper presents a statistical descriptive analysis of authorship patterns (geographical distribution and affiliation) and keywords. Furthermore, it displays a thorough content analysis of keywords and…

  4. What Health-Related Information Flows through You Every Day? A Content Analysis of Microblog Messages on Air Pollution

    Science.gov (United States)

    Yang, Qinghua; Yang, Fan; Zhou, Chun

    2015-01-01

    Purpose: The purpose of this paper is to investigate how the information about haze, a term used in China to describe the air pollution problem, is portrayed on Chinese social media by different types of organizations using the theoretical framework of the health belief model (HBM). Design/methodology/approach: A content analysis was conducted…

  5. The Significance of Content Knowledge for Informal Reasoning regarding Socioscientific Issues: Applying Genetics Knowledge to Genetic Engineering Issues

    Science.gov (United States)

    Sadler, Troy D.; Zeidler, Dana L.

    2005-01-01

    This study focused on informal reasoning regarding socioscientific issues. It sought to explore how content knowledge influenced the negotiation and resolution of contentious and complex scenarios based on genetic engineering. Two hundred and sixty-nine students drawn from undergraduate natural science and nonnatural science courses completed a…

  6. Learning Styles, Online Content Usage and Exam Performance in a Mixed-Format Introductory Computer Information Systems Course

    Science.gov (United States)

    Lang, Guido; O'Connell, Stephen D.

    2015-01-01

    We investigate the relationship between learning styles, online content usage and exam performance in an undergraduate introductory Computer Information Systems class comprised of both online video tutorials and in-person classes. Our findings suggest that, across students, (1) traditional learning style classification methodologies do not predict…

  7. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    OpenAIRE

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2012-01-01

    Displaying both the strategy that information analysis automation employs to makes its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noi...

  8. Information Seeking in Social Media: A Review of YouTube for Sedentary Behavior Content.

    Science.gov (United States)

    Knight, Emily; Intzandt, Brittany; MacDougall, Alicia; Saunders, Travis J

    2015-01-20

    The global prevalence of sedentary lifestyles is of grave concern for public health around the world. Moreover, the health risk of sedentary behaviors is of growing interest for researchers, clinicians, and the general public as evidence demonstrates that prolonged amounts of sedentary time increases risk for lifestyle-related diseases. There is a growing trend in the literature that reports how social media can facilitate knowledge sharing and collaboration. Social sites like YouTube facilitate the sharing of media content between users. The purpose of this project was to identify sedentary behavior content on YouTube and describe features of this content that may impact the effectiveness of YouTube for knowledge translation. YouTube was searched on a single day by 3 independent reviewers for evidence-based sedentary behavior content. Subjective data (eg, video purpose, source, and activity type portrayed) and objective data (eg, number of views, comments, shares, and length of the video) were collected from video. In total, 106 videos met inclusion criteria. Videos were uploaded from 13 countries around the globe (ie, Australia, Barbados, Belgium, Canada, Colombia, Kenya, New Zealand, Russia, South Africa, Spain, Ukraine, United Kingdom, United States). The median video length was 3:00 minutes: interquartile range (IQR) 1:44-5:40. On average, videos had been on YouTube for 15.0 months (IQR 6.0-27.5) and had been viewed 239.0 times (IQR 44.5-917.5). Videos had remarkably low numbers of shares (median 0) and comments (median 1). Only 37.7% (40/106) of videos portrayed content on sedentary behaviors, while the remaining 66 videos portrayed physical activity or a mix of behaviors. Academic/health organizations (39.6%, 42/106) and individuals (38.7%, 41/106) were the most prevalent source of videos, and most videos (67.0%, 71/106) aimed to educate viewers about the topic. This study explored sedentary behavior content available on YouTube. Findings demonstrate that

  9. Maximum Entropy Fundamentals

    Directory of Open Access Journals (Sweden)

    F. Topsøe

    2001-09-01

    Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also serve here as an important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
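
    For the Mean Energy Model, maximizing entropy under a mean-"energy" constraint yields the Gibbs form p_i proportional to exp(-beta*E_i), with beta chosen so the constraint is met. The sketch below solves for beta numerically; the energies and target mean are placeholders, not values from the paper.

    # Maximum entropy under a moment constraint <E> = target_mean.
    import numpy as np
    from scipy.optimize import brentq

    E = np.array([0.0, 1.0, 2.0, 3.0])     # "energy" of each state (placeholder)
    target_mean = 1.2                      # moment constraint (placeholder)

    def mean_energy(beta):
        w = np.exp(-beta * E)
        p = w / w.sum()
        return p @ E

    beta = brentq(lambda b: mean_energy(b) - target_mean, -50.0, 50.0)
    p = np.exp(-beta * E)
    p /= p.sum()
    print("beta =", round(beta, 4), "max-entropy distribution:", np.round(p, 4))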

  10. TRX-LOGOS - a graphical tool to demonstrate DNA information content dependent upon backbone dynamics in addition to base sequence.

    Science.gov (United States)

    Fortin, Connor H; Schulze, Katharina V; Babbitt, Gregory A

    2015-01-01

    It is now widely-accepted that DNA sequences defining DNA-protein interactions functionally depend upon local biophysical features of DNA backbone that are important in defining sites of binding interaction in the genome (e.g. DNA shape, charge and intrinsic dynamics). However, these physical features of DNA polymer are not directly apparent when analyzing and viewing Shannon information content calculated at single nucleobases in a traditional sequence logo plot. Thus, sequence logo plots are severely limited in that they convey no explicit information regarding the structural dynamics of DNA backbone, a feature often critical to binding specificity. We present TRX-LOGOS, an R software package and Perl wrapper code that interfaces the JASPAR database for computational regulatory genomics. TRX-LOGOS extends the traditional sequence logo plot to include Shannon information content calculated with regard to the dinucleotide-based BI-BII conformation shifts in phosphate linkages on the DNA backbone, thereby adding a visual measure of intrinsic DNA flexibility that can be critical for many DNA-protein interactions. TRX-LOGOS is available as an R graphics module, offered both at SourceForge and as a download supplement to this journal. To demonstrate the general utility of TRX logo plots, we first calculated the information content for 416 Saccharomyces cerevisiae transcription factor binding sites functionally confirmed in the Yeastract database and matched to previously published yeast genomic alignments. We discovered that flanking regions contain significantly higher information content at phosphate linkages than can be observed at nucleobases. We also examined broader transcription factor classifications defined by the JASPAR database, and discovered that many general signatures of transcription factor binding are locally more information rich at the level of DNA backbone dynamics than nucleobase sequence. We used TRX-LOGOS in combination with MEGA 6.0 software
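
    The classical per-base quantity that such logos build on is the information content R_i = 2 - H_i bits at each column of aligned binding sites; the sketch below computes it for toy sites, omitting the small-sample correction. TRX-LOGOS additionally computes the analogous quantity over dinucleotide BI/BII backbone states, which is not shown here.

    # Per-position information content of a nucleotide logo, R_i = 2 - H_i bits.
    import numpy as np

    sites = ["TATAAT", "TATGAT", "TACAAT", "TATACT"]   # toy aligned binding sites
    bases = "ACGT"

    for i in range(len(sites[0])):
        column = [s[i] for s in sites]
        freqs = np.array([column.count(b) for b in bases], float) / len(column)
        entropy = -np.sum([f * np.log2(f) for f in freqs if f > 0.0])
        print(i, round(2.0 - entropy, 3), "bits")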

  11. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.
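
    The MP score that such formulations seek to minimize can be computed exactly on a fixed small tree with Fitch's algorithm; the sketch below scores one character on a toy four-taxon topology. This only illustrates the parsimony criterion itself, not the Steiner-tree approximation algorithms described in the paper.

    # Fitch's small-parsimony scoring for one character on a fixed binary tree.
    def fitch(node, states):
        """node: leaf name or (left, right) tuple; returns (state set, changes)."""
        if isinstance(node, str):
            return {states[node]}, 0
        (sl, cl), (sr, cr) = fitch(node[0], states), fitch(node[1], states)
        inter = sl & sr
        if inter:
            return inter, cl + cr
        return sl | sr, cl + cr + 1        # empty intersection forces one change

    tree = (("A", "B"), ("C", "D"))        # toy topology
    leaf_states = {"A": "G", "B": "G", "C": "T", "D": "G"}
    _, score = fitch(tree, leaf_states)
    print("parsimony score:", score)       # -> 1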

  12. Maximum permissible dose

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    This chapter presents a historical overview of the establishment of radiation guidelines by various national and international agencies. The use of maximum permissible dose and maximum permissible body burden limits to derive working standards is discussed

  13. Diabetes prevention information in Japanese magazines with the largest print runs. Content analysis using clinical guidelines as a standard.

    Science.gov (United States)

    Noda, Emi; Mifune, Taka; Nakayama, Takeo

    2013-01-01

    To characterize information on diabetes prevention appearing in Japanese general health magazines and to examine the agreement of the content with that in clinical practice guidelines for the treatment of diabetes in Japan. We used the Japanese magazines' databases provided by the Media Research Center and selected magazines with large print runs published in 2006. Two medical professionals independently conducted content analysis based on items in the diabetes prevention guidelines. The number of pages for each item and agreement with the information in the guidelines were determined. We found 63 issues of magazines amounting to 8,982 pages; 484 pages included diabetes prevention related content. For 23 items included in the diabetes prevention guidelines, overall agreement of information printed in the magazines with that in the guidelines was 64.5% (471 out of 730). The number of times these items were referred to in the magazines varied widely, from 247 times for food items to 0 times for items on screening for pregnancy-induced diabetes, dyslipidemia, and hypertension. Among the 20 items that were referred to at least once, 18 items showed more than 90% agreement with the guidelines. However, there was poor agreement for information on vegetable oil (2/14, 14%) and for specific foods (5/247, 2%). For the fatty acids category, "fat" was not mentioned in the guidelines; however, the term frequently appeared in magazines. "Uncertainty" was never mentioned in magazines for specific food items. The diabetes prevention related content in the health magazines differed from that defined in clinical practice guidelines. Most information in the magazines agreed with the guidelines, however some items were referred to inappropriately. To disseminate correct information to the public on diabetes prevention, health professionals and the media must collaborate.

  14. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    Science.gov (United States)

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  15. Informing consumers about 'hidden' advertising. A literature review of the effects of disclosing sponsored content

    NARCIS (Netherlands)

    Boerman, S.C.; van Reijmersdal, E.A.; De Pelsmacker, P.

    2016-01-01

    This chapter provides an overview of what is currently known in the scientific literature about the effects of disclosures of sponsored content on consumers' responses. Methodology We provide a qualitative literature review of 21 empirical studies. Findings Awareness of disclosures is rather low,

  16. AN EMPIRICAL EXAMINATION OF THE DIVIDEND INFORMATION CONTENTS IN THE BALANCE SHEET: A Signaling Approach*

    Directory of Open Access Journals (Sweden)

    R. Agus Sartono

    2013-09-01

    finding shows that in Indonesia, the market reactions to dividend announcements depend on the role of the dividend signal, whether it is confirmatory, clarificatory, or unclear. The other finding shows that this market is more concerned with the expected favorableness of the content than with the dividend sign.

  17. Content validity of governing in Building Information Modelling (BIM) implementation assessment instrument

    Science.gov (United States)

    Hadzaman, N. A. H.; Takim, R.; Nawawi, A. H.; Mohamad Yusuwan, N.

    2018-04-01

    The BIM governance assessment instrument analyses the importance of developing a BIM governance solution to tackle existing problems in team collaboration on BIM-based projects. Despite the deployment of integrative technologies such as BIM in the construction industry, their uptake is still insufficient compared to other sectors. Several studies have established the requirements of BIM implementation concerning both technical and non-technical BIM adoption issues. However, the available data are regarded as inadequate to develop a BIM governance framework. Hence, the objective of the paper is to evaluate the content validity of the BIM governance instrument prior to the main data collection. Two methods were employed: a literature review and a questionnaire survey. Based on the literature review, 273 items under six main constructs are suggested for incorporation in the BIM governance instrument. The Content Validity Ratio (CVR) scores revealed that 202 out of 273 items are considered the most critical by the content experts. The findings for the Item-Level Content Validity Index (I-CVI) and Modified Kappa Coefficient, however, revealed that 257 items in the BIM governance instrument are appropriate and excellent. The instrument is highly reliable for future strategies and the development of BIM projects in Malaysia.
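
    The screening indices named here follow standard content-validity formulas: Lawshe's CVR = (n_essential - N/2)/(N/2), I-CVI = relevant ratings/experts, and the modified kappa of Polit and Beck, which corrects I-CVI for chance agreement. The sketch below uses hypothetical expert counts, not the study's panel data.

    # Standard content-validity computations on hypothetical expert counts.
    from math import comb

    def cvr(n_essential, n_experts):
        return (n_essential - n_experts / 2.0) / (n_experts / 2.0)

    def i_cvi(n_relevant, n_experts):
        return n_relevant / n_experts

    def modified_kappa(n_relevant, n_experts):
        pc = comb(n_experts, n_relevant) * 0.5 ** n_experts   # chance agreement
        icvi = i_cvi(n_relevant, n_experts)
        return (icvi - pc) / (1.0 - pc)

    print(cvr(9, 10), i_cvi(9, 10), round(modified_kappa(9, 10), 3))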

  18. A fuzzy model of a European index based on automatically extracted content information

    NARCIS (Netherlands)

    Milea, D.V.; Almeida, R.J.; Kaymak, U.; Frasincar, F.

    2011-01-01

    In this paper we build on previous work related to predicting the MSCI EURO index based on content analysis of ECB statements. Our focus is on reducing the number of features employed for prediction through feature selection. For this purpose we rely on two methodologies: (stepwise) linear

  19. Sexuality Information Needs of Latino and African American Ninth Graders: A Content Analysis of Anonymous Questions

    Science.gov (United States)

    Angulo-Olaiz, Francisca; Goldfarb, Eva S.; Constantine, Norman A.

    2014-01-01

    This study used qualitative content analysis to examine anonymous questions about sex and sexuality submitted by Latino and African American adolescents in Los Angeles, California, classrooms. The majority of questions asked about sexuality and sexual behavior, or anatomy and physiology, with fewer questions about pregnancy and pregnancy…

  20. Informed choice in direct-to-consumer genetic testing (DTCGT) websites: a content analysis of benefits, risks, and limitations.

    Science.gov (United States)

    Singleton, Amanda; Erby, Lori Hamby; Foisie, Kathryn V; Kaphingst, Kimberly A

    2012-06-01

    An informed choice about health-related direct-to-consumer genetic testing (DTCGT) requires knowledge of potential benefits, risks, and limitations. To understand the information that potential consumers of DTCGT services are exposed to on company websites, we conducted a content analysis of 23 health-related DTCGT websites. Results revealed that benefit statements outweighed risk and limitation statements 6 to 1. The most frequently described benefits were: 1) disease prevention, 2) consumer education, 3) personalized medical recommendations, and 4) the ability to make health decisions. Thirty-five percent of websites also presented at least one risk of testing. Seventy-eight percent of websites mentioned at least one limitation of testing. Based on this information, potential consumers might get an inaccurate picture of genetic testing which could impact their ability to make an informed decision. Practices that enhance the presentation of balanced information on DTCGT company websites should be encouraged.

  1. A Study of Information Content in the U.S. Television Commercials: Has It Become Less Informative but More Creative?

    Science.gov (United States)

    Ng, Daniel; Supaporn, Potibut

    A study investigated the trend of current U.S. television commercial informativeness by comparing the results with Alan Resnik and Bruce Stern's previous benchmark study conducted in 1977. A systematic random sampling procedure was used to select viewing dates and times of commercials from the three national networks. Ultimately, a total of 550…

  2. WWW mesothelioma information: Surfing on unreliable waters. A cross-sectional study into the content and quality of online informational resources for mesothelioma patients.

    Science.gov (United States)

    Soloukey Tbalvandany, S Sadaf; Maat, A Alexander P W M; Cornelissen, R Robin; Nuyttens, J Joost J M E; Takkenberg, J Johanna J M

    2018-06-01

    Malignant Mesothelioma (MM) is a rare asbestos related disease mostly diagnosed in low-skilled patients. The decision-making process for MM treatment is complicated, making an adequate provision of information necessary. The objective of this study is to assess the content and quality of online informational resources available for Dutch MM patients. The first 100 hits of a Google search were studied using the JAMA benchmarks, the Modified Information Score (MIS) and the International Patient Decision Aid Standard Scoring (IPDAS). A total of 37 sources were included. Six of the 37 resources were published by hospitals. On average, the informational resources scored 37 points on the MIS (scale 0-100). The resources from a (bio)medical sources scored the best on this scale. However, on the domain of use of language, these resources scored the worst. The current level of medical content and quality of online informational resources for patient with MM is below average and cannot be used as decision-aids for patients. The criteria used in this article could be used for future improvements of online informational resources for patients, both online, offline and through health education in the care path. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Exploiting the information content of hydrological "outliers" for goodness-of-fit testing

    Directory of Open Access Journals (Sweden)

    F. Laio

    2010-10-01

    Full Text Available Validation of probabilistic models based on goodness-of-fit tests is an essential step for the frequency analysis of extreme events. The outcome of standard testing techniques, however, is mainly determined by the behavior of the hypothetical model, FX(x), in the central part of the distribution, while the behavior in the tails of the distribution, which is indeed very relevant in hydrological applications, is relatively unimportant for the results of the tests. The maximum-value test, originally proposed as a technique for outlier detection, is a suitable, but seldom applied, technique that addresses this problem. The test is specifically targeted to verify if the maximum (or minimum) values in the sample are consistent with the hypothesis that the distribution FX(x) is the real parent distribution. The application of this test is hindered by the fact that the critical values for the test should be numerically obtained when the parameters of FX(x) are estimated on the same sample used for verification, which is the standard situation in hydrological applications. We propose here a simple, analytically explicit, technique to suitably account for this effect, based on the application of censored L-moments estimators of the parameters. We demonstrate, with an application that uses artificially generated samples, the superiority of this modified maximum-value test with respect to the standard version of the test. We also show that the test has comparable or larger power with respect to other goodness-of-fit tests (e.g., chi-squared test, Anderson-Darling test, Fung and Paul test), in particular when dealing with small samples (sample size lower than 20–25) and when the parent distribution is similar to the distribution being tested.
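
    As a rough illustration of the basic idea behind the maximum-value test (a sketch only, not the authors' censored L-moment procedure; the Gumbel family and the maximum-likelihood fit are arbitrary choices made for the example):

    ```python
    import numpy as np
    from scipy import stats

    def max_value_pvalue(sample, dist=stats.gumbel_r):
        """Basic (uncorrected) maximum-value check.

        Under H0 that `dist`, fitted to the sample, is the true parent distribution,
        F(x_max)**n is Uniform(0, 1), so a very small 1 - F(x_max)**n flags a maximum
        that is suspiciously large for the hypothesized model.  Fitting and testing on
        the same sample biases this p-value; the censored L-moment correction proposed
        in the paper (or Monte Carlo critical values) addresses exactly that issue.
        """
        x = np.asarray(sample, dtype=float)
        params = dist.fit(x)                       # illustrative ML fit on the same sample
        u_max = dist.cdf(x.max(), *params) ** len(x)
        return 1.0 - u_max                         # one-sided p-value for the sample maximum
    ```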

  4. Information-seeking strategies and science content understandings of sixth-grade students using on-line learning environments

    Science.gov (United States)

    Hoffman, Joseph Loris

    1999-11-01

    This study examined the information-seeking strategies and science content understandings learners developed as a result of using on-line resources in the University of Michigan Digital Library and on the World Wide Web. Eight pairs of sixth grade students from two teachers' classrooms were observed during inquiries for astronomy, ecology, geology, and weather, and a final transfer task assessed learners' capabilities at the end of the school year. Data included video recordings of students' screen activity and conversations, journals and completed activity sheets, final artifacts, and semi-structured interviews. Learners' information-seeking strategies included activities related to asking, planning, tool usage, searching, assessing, synthesizing, writing, and creating. Analysis of data found a majority of learners posed meaningful, open-ended questions, used technological tools appropriately, developed pertinent search topics, were thoughtful in queries to the digital library, browsed sites purposefully to locate information, and constructed artifacts with novel formats. Students faced challenges when planning activities, assessing resources, and synthesizing information. Possible explanations were posed linking pedagogical practices with learners' growth and use of inquiry strategies. Data from classroom-lab video and teacher interviews showed varying degrees of student scaffolding: development and critique of initial questions, utilization of search tools, use of journals for reflection on activities, and requirements for final artifacts. Science content understandings included recalling information, offering explanations, articulating relationships, and extending explanations. A majority of learners constructed partial understandings limited to information recall and simple explanations, and these occasionally contained inaccurate conceptualizations. Web site design features had some influence on the construction of learners' content understandings. Analysis of

  5. Embedding Sustainability Instruction across Content Areas: best Classroom Practices from Informal Environmental Education

    Science.gov (United States)

    Clary, R. M.; Walker, R. M.; Wissehr, C.

    2017-12-01

    Environmental education (EE) facilitates students' scientific and environmental literacy, and addresses content areas including sustainability, ecology, and civic responsibility. However, U.S. science content compartmentalization and EE's interdisciplinary nature historically made it a fragmented curriculum within U.S. schools. To gain a better understanding of effective EE instruction that can be transferred to traditional K-12 classrooms, we researched the interactions between a recognized environmental residential camp and students and teachers from six participating schools using grounded theory methodology. Our research identified the residential learning center's objectives, methods of instruction, and objectives' alignment to the delivered curricula. Data generated included lesson plans, survey responses, and interviews. Students (n = 215) identified wilderness and geology activities as the activities they wanted to experience more; they also identified developing curiosity and a sense of discovery as the most meaningful. Whereas most student-identified meaningful experiences aligned with the center's curricular objectives within the optional units, categories emerged that were not explicitly targeted in the unit activities but were embedded throughout the curriculum in sustainable practices, data collection, and reflections. We propose that embedded activities and implicit instruction can be included across content areas within K-12 classrooms. Teacher modeling and implicit instruction will require minimal classroom time, and facilitate students' scientific and environmental literacy in topics such as sustainability and citizen responsibility.

  6. Creating Social Reality: Informational Social Influence and the Content of Stereotypic Beliefs.

    Science.gov (United States)

    Wittenbrink, Bernd; Henly, Julia R.

    1996-01-01

    Three experiments tested the hypothesis that comparison information about other people's stereotypic beliefs is used to validate personal beliefs about a target group. A simple manipulation of questionnaire items and their response scales, presented as part of a political opinion survey, served as social comparison information regarding beliefs…

  7. Using measures of information content and complexity of time series as hydrologic metrics

    Science.gov (United States)

    Information theory has previously been used to develop metrics that allow temporal patterns in soil moisture dynamics to be characterized, and the performance of soil water flow models to be evaluated and compared. The objective of this study was to apply information and complexity measures to characte...

  8. An international course on strategic information management for medical informatics students: aim, content, structure, and experiences

    NARCIS (Netherlands)

    Haux, R.; Ammenwerth, E.; ter Burg, W. J.; Pilz, J.; Jaspers, M. W. M.

    2004-01-01

    We report on a course for medical informatics students on hospital information systems, especially on its strategic information management. Starting as a course at the Medical Informatics Program of the University of Heidelberg/University of Applied Sciences Heilbronn, it is now organized as

  9. Information matching the content of visual working memory is prioritized for conscious access.

    NARCIS (Netherlands)

    Gayet, S.; Paffen, C.L.E.; van der Stigchel, S.

    2013-01-01

    Visual working memory (VWM) is used to retain relevant information for imminent goal-directed behavior. In the experiments reported here, we found that VWM helps to prioritize relevant information that is not yet available for conscious experience. In five experiments, we demonstrated that

  10. Information Security Trends and Issues in the Moodle E-Learning Platform: An Ethnographic Content Analysis

    Science.gov (United States)

    Schultz, Christopher

    2012-01-01

    Empirical research on information security trends and practices in e-learning is scarce. Many articles that have been published apply basic information security concepts to e-learning and list potential threats or propose frameworks for classifying threats. The purpose of this research is to identify, categorize and understand trends and issues in…

  11. A Qualitative Content Analysis of Information Dissemination during the Consolidation of Two Technical Colleges

    Science.gov (United States)

    Drinnon, Charles

    2017-01-01

    This research examined the ways and means administration used to disseminate information concerning the consolidation of two technical colleges in the State of Georgia, USA. These means of information dissemination included press releases by the consolidating colleges and the Technical College System of Georgia (the system which the consolidating…

  12. Review of English Language Library and Information Science Weblogs: Analyzing the Link between Weblog Types and Their Technical /Content Structure

    Directory of Open Access Journals (Sweden)

    Tahereh Karami

    2010-09-01

    Full Text Available Weblogs have become well established as one of the Web 2.0 products. Given the essential nature of their job, librarians and information professionals can use weblogs as a quick and easy means for information and knowledge sharing. The present study reviews some 150 LIS weblogs in order to examine and analyze the link between weblog types (personal, library-owned, or group-operated) and their content and technical structure. Webometric methods were deployed for selection of the sample. The findings indicated that there is a significant correlation between the weblog types and their update frequency. The same holds between the weblog types and their content. But no such significance was observed with respect to the weblog publishing tools. The investigators believe that the links uncovered could also hold true for Iranian LIS weblogs.

  13. Integrating Information Literacy and Evidence-Based Medicine Content within a New School of Medicine Curriculum: Process and Outcome.

    Science.gov (United States)

    Muellenbach, Joanne M; Houk, Kathryn M; E Thimons, Dana; Rodriguez, Bredny

    2018-01-01

    This column describes a process for integrating information literacy (IL) and evidence-based medicine (EBM) content within a new school of medicine curriculum. The project was a collaborative effort among health sciences librarians, curriculum deans, directors, and faculty. The health sciences librarians became members of the curriculum committees, developed a successful proposal for IL and EBM content within the curriculum, and were invited to become course instructors for Analytics in Medicine. As course instructors, the librarians worked with the other faculty instructors to design and deliver active learning class sessions based on a flipped classroom approach using a proprietary Information Mastery curriculum. Results of this collaboration may add to the knowledge base of attitudes and skills needed to practice as full faculty partners in curricular design and instruction.

  14. STRUCTURAL AND FUNCTIONAL CONTENT OF INFORMATION AND COMMUNICATION COMPETENCE AS A PART OF THE PROSPECTIVE MUSIC TEACHERS’ PROFESSIONAL COMPETENCE

    Directory of Open Access Journals (Sweden)

    Lyudmila Gavrilova

    2016-04-01

    Full Text Available The article addresses a current problem in modern art education, particularly music education: the analysis of the structure and content of the professional competence of future music teachers. Drawing on the basic categories of the competence approach and on the research of domestic and foreign scholars, the author offers her own interpretation of the term "professional competence of the future music teacher". A systemic analysis of competence as a specific integral ability that ensures the effectiveness of music pedagogy made it possible to define the professional competence of future music teachers, in the context of informatization, as a complex dynamic unity of three segments (pedagogy, musical proficiency, and the use of information and communication technologies), each comprising cognitive, practical, emotive, and evaluative spheres of personal development. Special emphasis is placed on the structure and content of information and communication competence, the importance of which is confirmed by numerous studies. The author identifies the following components of future music teachers' information and communication competence: a cognitive component (the necessary body of theoretical knowledge in the area of information and communication technologies, including multimedia); practical skills for working in an information and communication pedagogical environment, including the ability to use multimedia educational tools to solve professional tasks (both ready-made electronic manuals and independently developed computer aids) and skills of online communication; and interest in and a positive attitude toward the use of computer technology in professional musical and educational activities.

  15. Solar maximum mission

    International Nuclear Information System (INIS)

    Ryan, J.

    1981-01-01

    By understanding the sun, astrophysicists hope to expand this knowledge to understanding other stars. To study the sun, NASA launched a satellite on February 14, 1980. The project is named the Solar Maximum Mission (SMM). The satellite conducted detailed observations of the sun in collaboration with other satellites and ground-based optical and radio observations until its failure 10 months into the mission. The main objective of the SMM was to investigate one aspect of solar activity: solar flares. A brief description of the flare mechanism is given. The SMM satellite was valuable in providing information on where and how a solar flare occurs. A sequence of photographs of a solar flare taken from the SMM satellite shows how a solar flare develops in a particular layer of the solar atmosphere. Two flares especially suitable for detailed observations by a joint effort occurred on April 30 and May 21 of 1980. These flares and observations of the flares are discussed. Also discussed are significant discoveries made by individual experiments.

  16. 75 FR 16785 - Agency Information Collection Activities; Proposed Collection; Comment Request for Sulfur Content...

    Science.gov (United States)

    2010-04-02

    ... additional information about EPA's public docket visit the EPA Docket Center homepage at http://www.epa.gov... flooding and reopened in the EPA Headquarters Library, Infoterra Room (Room 3334), in the EPA West Building...

  17. Social determinants of content selection in the age of (mis)information

    OpenAIRE

    Bessi, Alessandro; Caldarelli, Guido; Del Vicario, Michela; Scala, Antonio; Quattrociocchi, Walter

    2014-01-01

    Despite the enthusiastic rhetoric about the so-called "collective intelligence", conspiracy theories -- e.g. global warming induced by chemtrails or the link between vaccines and autism -- find on the Web a natural medium for their dissemination. Users preferentially consume information according to their system of beliefs and the strife within users of opposite narratives may result in heated debates. In this work we provide a genuine example of information consumption from a sample of ...

  18. Factors influencing perceptions of domestic energy information: Content, source and process

    International Nuclear Information System (INIS)

    Simcock, Neil; MacGregor, Sherilyn; Catney, Philip; Dobson, Andrew; Ormerod, Mark; Robinson, Zoe; Ross, Simon; Royston, Sarah; Marie Hall, Sarah

    2014-01-01

    Reducing household energy consumption is an essential element of the UK Government's carbon reduction strategy. Whilst increased knowledge alone will not necessarily lead to tangible actions on the part of consumers, knowledge of various kinds is, we argue, still important if domestic energy usage is to be reduced. In an attempt to ‘educate’ the public, governments have typically resorted to ‘mass information’ campaigns that have been considered largely unsuccessful. Yet understanding what alternative forms of learning could be cultivated has been limited by the dearth of research that explores whether and why people consider information about energy and energy saving to be useful. By exploring this, we can move towards an understanding of how knowledge about energy saving can be better shared and communicated, enabling more meaningful learning to take place. Drawing on in-depth qualitative data with fifty-five participants, this paper highlights a range of factors that affect perceptions of energy information. It argues that these factors are not discrete, but are interlinked. A fundamentally different model of knowledge exchange is needed for more effective learning about energy saving to occur. A number of implications for policy are proposed in our conclusions. - Highlights: • A range of factors influence perceptions of energy information. These factors are interlinked. • Energy information perceived as more relevant when it could be ‘anchored’ to everyday frames of understanding. • Both qualified ‘experts’ and peers with personal experience valued as potential information sources. • ‘One-way’ information communication perceived negatively. Two-way information exchange built trust and a sense of control. • Participants’ active information assessment very different to the passive consumer assumed by knowledge-deficit model

  19. Kinesiology taping and the world wide web: a quality and content analysis of internet-based information.

    Science.gov (United States)

    Beutel, Bryan G; Cardone, Dennis A

    2014-10-01

    Due to limited regulation of websites, the quality and content of online health-related information has been questioned as prior studies have shown that websites often misrepresent orthopaedic conditions and treatments. Kinesio tape has gained popularity among athletes and the general public despite limited evidence supporting its efficacy. The primary objective of this study was to assess the quality and content of Internet-based information on Kinesio taping. An Internet search using the terms "Kinesio tape" and "kinesiology tape" was performed using the Google search engine. Websites returned within the first two pages of results, as well as hyperlinks embedded within these sites, were included in the study. These sites were subsequently classified by type. The quality of the website was determined by the Health On the Net (HON) score, an objective metric based upon recommendations from the United Nations for the ethical representation of health information. A content analysis was performed by noting specific misleading versus balanced features in each website. A total of 31 unique websites were identified. The majority of the websites (71%) were commercial. Out of a total possible 16 points, the mean HON score among the websites was 8.9 points (SD 2.2 points). The number of misleading features was significantly higher than the balanced features (p < 0.001). Fifty-eight percent of sites used anecdotal testimonials to promote the product. Only small percentages of websites discussed complications, alternatives, or provided accurate medical outcomes. Overall, commercial sites had a greater number of misleading features compared to non-commercial sites (p = 0.01). Websites discussing Kinesio tape are predominantly of poor quality and present misleading, imbalanced information. It is of ever-increasing importance that healthcare providers work to ensure that reliable, balanced, and accurate information be available to Internet users. Level of evidence: IV.

  20. Analysis of the Relevance of Information Content of the Value Added Statement in the Brazilian Capital Markets

    Directory of Open Access Journals (Sweden)

    Márcio André Veras Machado

    2015-04-01

    Full Text Available The usefulness of financial statements depends, fundamentally, on the degree of relevance of the information they disclose to users. Thus, studies that measure the relevance of accounting information to the users of financial statements are of some importance. One line of research within this subject is in ascertaining the relevance and importance of accounting information for the capital markets: if a particular item of accounting information is minimally reflected in the price of a share, it is because this information has relevance, at least at a certain level of significance, for investors and analysts of the capital markets. This present study aims to analyze the relevance, in the Brazilian capital markets, of the information content of the Value Added Statement (or VAS, referred to in Brazil as the Demonstração do Valor Adicionado, or DVA). It analyzed the ratio between stock price and Wealth created per share (WCPS), using linear regressions, for the period 2005-2011, for non-financial listed companies included in Melhores & Maiores ('Biggest & Best'), an annual listing published by Exame Magazine in Brazil. As a secondary objective, this article seeks to establish whether WCPS represents a better indication of a company's result than Net profit per share (in this study, referred to as NPPS). The empirical evidence that was found supports the concept that the VAS has relevant information content, because it shows a capacity to explain a variation in the share price of the companies studied. Additionally, the relationship between WCPS and the stock price was shown to be significant, even after the inclusion of the control variables Stockholders' equity per share (which we abbreviate in this study to SEPS) and NPPS. Finally, the evidence found indicates that the market reacts more to WCPS (Wealth created per share) than to NPPS. Thus, the results obtained give some indication that, for the Brazilian capital markets, WCPS may be a better proxy

  1. Cosmetic tourism: public opinion and analysis of information and content available on the Internet.

    Science.gov (United States)

    Nassab, Reza; Hamnett, Nathan; Nelson, Kate; Kaur, Simranjit; Greensill, Beverley; Dhital, Sanjiv; Juma, Ali

    2010-01-01

    The medical tourism market is a rapidly growing sector fueled by increasing health care costs, longer domestic waiting times, economic recession, and cheaper air travel. The authors investigate public opinion on undergoing cosmetic surgery abroad and then explore the information patients are likely to encounter on the Internet when searching for such services. A poll of 197 members of the general public was conducted in the United Kingdom. An Internet search including the terms plastic surgery abroad was conducted, and the first 100 relevant sites were reviewed. Of the 197 respondents, 47% had considered having some form of cosmetic surgery. Most (97%) would consider going abroad for their procedure. The Internet was a source of information for 70%. The review of the first 100 sites under "plastic surgery abroad" revealed that most centers were located in Eastern Europe (26%), South America (14%), and the Far East (11%). Exploring the information provided on the Web sites, we found 37% contained no information regarding procedures. Only 10% of sites contained any information about potential complications. Even less frequently mentioned (4%) were details of aftercare or follow-up procedures. The authors found that the overwhelming majority of respondents considering plastic surgery would also consider seeking cosmetic surgical treatment abroad. The Internet sites that appear most prominently in an online search contained a distinct lack of information for potential patients, particularly with regard to complications and aftercare. There is, therefore, a need for improved public awareness and education about the considerations inherent in medical tourism. The introduction of more stringent regulations for international centers providing such services should also be considered to help safeguard patients.

  2. Digital educational contents that promote the effective integration of information and communication technologies

    Directory of Open Access Journals (Sweden)

    Micaela Manso

    2011-07-01

    Full Text Available This qualitative research study explores the relationship between the quality of curriculum designs that integrate ICTs and the quality of teachers’ actual implementation of these designs. To analyze them, we selected 10 qualities that build on TPACK (Technological Pedagogical Content Knowledge and the Teaching for Understanding framework (TfU. We selected three curriculum designs that integrate ICTs and conducted in-depth interviews with 6 secondary teachers, 34 students and 3 curriculum designers in Argentina, Mexico and Colombia. When the majority of the qualities were present in the curriculum designs, the majority of the qualities were also present in the teachers’ implementations. High quality curriculum designs that integrate ICTs tended to promote high quality teacher practices.

  3. An evaluation of multimedia and online support groups (OSG) contents and application of information by infertile patients: Mixed method study

    Science.gov (United States)

    Wiweko, Budi; Narasati, Shabrina; Agung, Prince Gusti; Zesario, Aulia; Wibawa, Yohanes Satrya; Maidarti, Mila; Harzif, Achmad Kemal; Pratama, Gita; Sumapradja, Kanadi; Muharam, Raden; Hestiantoro, Andon

    2018-02-01

    Background: The presence of Online Support Groups (OSG) is expected to empower patients with infertility, allowing patients to be the focus of healthcare services. This study evaluates multimedia content, OSG, and the use of information for decision-making by patients using infertility services. It is a mixed method study conducted from January to June 2016 at the Yasmin IVF Clinic, Dr. Cipto Mangunkusumo General Hospital, and the SMART IVF Clinic, Jakarta. The subjects were patients with infertility who sought treatment at the clinics. Data were collected through a structured interview in the form of a questionnaire. Informed consent was obtained from all individual participants included in the study, and all study procedures conformed to institutional ethical standards. Results from 72 respondents showed that the quantitative analysis did not reveal any association between multimedia and OSG information sources and patient knowledge regarding infertility management. However, the qualitative analysis highlighted issues around the information regarding infertility services in the available multimedia and the OSG, and the use of the available information by patients when deciding to use infertility services. Respondents' awareness of searching for infertility information on the clinic website was still limited, largely because most patients were unaware that the clinic website providing this information existed. The clinic website therefore needs to be promoted so that its use will increase in the future.

  4. Readability and Content Assessment of Informed Consent Forms for Phase II-IV Clinical Trials in China.

    Science.gov (United States)

    Wen, Gaiyan; Liu, Xinchun; Huang, Lihua; Shu, Jingxian; Xu, Nana; Chen, Ruifang; Huang, Zhijun; Yang, Guoping; Wang, Xiaomin; Xiang, Yuxia; Lu, Yao; Yuan, Hong

    2016-01-01

    To explore the readability and content integrity of informed consent forms (ICFs) used in China and to compare the quality of Chinese local ICFs with that of international ICFs. The length, readability and content of 155 consent documents from phase II-IV drug clinical trials from the Third Xiangya Hospital Ethics Committee from November 2009 to January 2015 were evaluated. Reading difficulty was tested using a readability formula adapted for the Chinese language. An ICF checklist containing 27 required elements was successfully constructed to evaluate content integrity. The description of alternatives to participation was assessed. The quality of ICFs from different sponsorships was also compared. Among the 155 evaluable trials, the ICFs had a median length of 5286 words, corresponding to 7 pages. The median readability score was 4.31 (4.02-4.41), with 63.9% at the 2nd level and 36.1% at the 3rd level. Five of the 27 elements were frequently neglected. The average score for the description of alternatives to participation was 1.06, and 27.7% of the ICFs did not mention any alternatives. Compared with Chinese local ICFs, international ICFs were longer, were more readable, and contained more of the required elements, demonstrating better readability and content integrity overall. More efforts should thus be made to improve the quality of consent documents in China.

  5. Generic Schemes for Single-Molecule Kinetics. 2: Information Content of the Poisson Indicator.

    Science.gov (United States)

    Avila, Thomas R; Piephoff, D Evan; Cao, Jianshu

    2017-08-24

    Recently, we described a pathway analysis technique (paper 1) for analyzing generic schemes for single-molecule kinetics based upon the first-passage time distribution. Here, we employ this method to derive expressions for the Poisson indicator, a normalized measure of stochastic variation (essentially equivalent to the Fano factor and Mandel's Q parameter), for various renewal (i.e., memoryless) enzymatic reactions. We examine its dependence on substrate concentration, without assuming all steps follow Poissonian kinetics. Based upon fitting to the functional forms of the first two waiting time moments, we show that, to second order, the non-Poissonian kinetics are generally underdetermined but can be specified in certain scenarios. For an enzymatic reaction with an arbitrary intermediate topology, we identify a generic minimum of the Poisson indicator as a function of substrate concentration, which can be used to tune substrate concentration to the stochastic fluctuations and to estimate the largest number of underlying consecutive links in a turnover cycle. We identify a local maximum of the Poisson indicator (with respect to substrate concentration) for a renewal process as a signature of competitive binding, either between a substrate and an inhibitor or between multiple substrates. Our analysis explores the rich connections between Poisson indicator measurements and microscopic kinetic mechanisms.
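
    For reference, one common convention (an assumption here, since the abstract does not reproduce the formalism) writes the Poisson indicator of a renewal process in terms of the first two waiting-time moments, consistent with the remark that it is essentially Mandel's Q, i.e. the Fano factor minus one:

    $$ P \;=\; \frac{\langle t^{2}\rangle - 2\langle t\rangle^{2}}{\langle t\rangle^{2}}, $$

    so that P = 0 for a single exponential (Poissonian) step, P < 0 for chains of consecutive sub-steps, and P > 0 for branching or dynamic disorder, which is why its minimum over substrate concentration can bound the number of consecutive links in the turnover cycle.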

  6. Content-based retrieval of brain tumor in contrast-enhanced MRI images using tumor margin information and learned distance metric.

    Science.gov (United States)

    Yang, Wei; Feng, Qianjin; Yu, Mei; Lu, Zhentai; Gao, Yang; Xu, Yikai; Chen, Wufan

    2012-11-01

    A content-based image retrieval (CBIR) method for T1-weighted contrast-enhanced MRI (CE-MRI) images of brain tumors is presented for diagnosis aid. The method is thoroughly evaluated on a large image dataset. Using the tumor region as a query, the authors' CBIR system attempts to retrieve tumors of the same pathological category. Aside from commonly used features such as intensity, texture, and shape features, the authors use a margin information descriptor (MID), which is capable of describing the characteristics of tissue surrounding a tumor, for representing image contents. In addition, the authors designed a distance metric learning algorithm called Maximum mean average Precision Projection (MPP) to maximize the smooth approximated mean average precision (mAP) to optimize retrieval performance. The effectiveness of MID and MPP algorithms was evaluated using a brain CE-MRI dataset consisting of 3108 2D scans acquired from 235 patients with three categories of brain tumors (meningioma, glioma, and pituitary tumor). By combining MID and other features, the mAP of retrieval increased by more than 6% with the learned distance metrics. The distance metric learned by MPP significantly outperformed the other two existing distance metric learning methods in terms of mAP. The CBIR system using the proposed strategies achieved a mAP of 87.3% and a precision of 89.3% when top 10 images were returned by the system. Compared with scale-invariant feature transform, the MID, which uses the intensity profile as descriptor, achieves better retrieval performance. Incorporating tumor margin information represented by MID with the distance metric learned by the MPP algorithm can substantially improve the retrieval performance for brain tumors in CE-MRI.
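
    Mean average precision (mAP), the quantity that the MPP metric-learning step optimizes via a smooth approximation, can be computed from ranked retrieval lists as in the following minimal sketch; the helper functions are illustrative, not the authors' implementation:

    ```python
    def average_precision(relevant, ranked):
        """Average precision (AP) of one ranked retrieval list.

        relevant: set of item ids judged to share the query's pathological category
        ranked:   list of item ids ordered by decreasing similarity to the query
        """
        hits, precisions = 0, []
        for rank, item in enumerate(ranked, start=1):
            if item in relevant:
                hits += 1
                precisions.append(hits / rank)     # precision at each recall point
        return sum(precisions) / max(len(relevant), 1)

    def mean_average_precision(queries):
        """Mean of AP over (relevant_set, ranked_list) pairs, one pair per query."""
        return sum(average_precision(r, k) for r, k in queries) / len(queries)

    # e.g. mean_average_precision([({"a", "b"}, ["a", "c", "b"]), ({"d"}, ["d", "e"])])
    ```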

  7. With task experience students learn to ignore the content, not just the location of irrelevant information

    NARCIS (Netherlands)

    Rop, Gertjan; Verkoeijen, Peter P J L; van Gog, Tamara

    2017-01-01

    Presentation of irrelevant additional information hampers learning. However, using a word-learning task, recent research demonstrated that an initial negative effect of mismatching pictures on learning no longer occurred once learners gained task experience. It is unclear, however, whether learners

  8. Using crowdsourced web content for informing water systems operations in snow-dominated catchments

    Science.gov (United States)

    Giuliani, Matteo; Castelletti, Andrea; Fedorov, Roman; Fraternali, Piero

    2016-12-01

    Snow is a key component of the hydrologic cycle in many regions of the world. Despite recent advances in environmental monitoring that are making a wide range of data available, continuous snow monitoring systems that can collect data at high spatial and temporal resolution are not well established yet, especially in inaccessible high-latitude or mountainous regions. The unprecedented availability of user-generated data on the web is opening new opportunities for enhancing real-time monitoring and modeling of environmental systems based on data that are public, low-cost, and spatiotemporally dense. In this paper, we contribute a novel crowdsourcing procedure for extracting snow-related information from public web images, either produced by users or generated by touristic webcams. A fully automated process fetches mountain images from multiple sources, identifies the peaks present therein, and estimates virtual snow indexes representing a proxy of the snow-covered area. Our procedure has the potential for complementing traditional snow-related information, minimizing costs and efforts for obtaining the virtual snow indexes and, at the same time, maximizing the portability of the procedure to several locations where such public images are available. The operational value of the obtained virtual snow indexes is assessed for a real-world water-management problem, the regulation of Lake Como, where we use these indexes for informing the daily operations of the lake. Numerical results show that such information is effective in extending the anticipation capacity of the lake operations, ultimately improving the system performance.
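
    The abstract does not give the authors' exact definition of the virtual snow index, so the following is only a naive stand-in for that kind of proxy: the fraction of bright, low-saturation pixels inside a (hypothetical) mountain mask extracted from a public webcam frame.

    ```python
    import numpy as np
    from PIL import Image

    def virtual_snow_index(image_path, mountain_mask):
        """Naive proxy for the snow-covered fraction of a mountain region.

        image_path:    path to an RGB webcam or user-generated photo
        mountain_mask: boolean H x W array selecting the mountain area, e.g. obtained
                       from a peak-identification step (a hypothetical input here)
        """
        rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=float) / 255.0
        brightness = rgb.mean(axis=2)                     # snow pixels are bright...
        saturation = rgb.max(axis=2) - rgb.min(axis=2)    # ...and nearly colorless
        snow_like = (brightness > 0.7) & (saturation < 0.15) & mountain_mask
        return snow_like.sum() / max(mountain_mask.sum(), 1)
    ```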

  9. Didactic Content of Constructively-Projective Function of Students Learning: The Extrapolation in Information Technology

    Science.gov (United States)

    Kutuev, Ruslan A.; Nuriyeva, Elvira N.; Safiullina, Tatyana R.; Kryukova, Nina I.; Tagirova, Nataliya P.; Karpenko, Galina V.

    2016-01-01

    The relevance of the study stems from the radical impact of information technology on the university learning process, which has started a new phase in its transformation. According to experts, at the present time the main factor in the efficiency of a university's activity is the expansion of students' learning activities, realized on the…

  10. Content-based organization of the information space in multi-database networks

    NARCIS (Netherlands)

    Papazoglou, M.; Milliner, S.

    1998-01-01

    Rapid growth in the volume of network-available data, together with the complexity, diversity and terminological fluctuations of different data sources, renders effective access to network-accessible information increasingly difficult to achieve. The situation is particularly cumbersome for users of multi-database systems who

  11. A Hybrid Approach to Finding Relevant Social Media Content for Complex Domain Specific Information Needs.

    Science.gov (United States)

    Cameron, Delroy; Sheth, Amit P; Jaykumar, Nishita; Thirunarayan, Krishnaprasad; Anand, Gaurish; Smith, Gary A

    2014-12-01

    While contemporary semantic search systems offer to improve classical keyword-based search, they are not always adequate for complex domain specific information needs. The domain of prescription drug abuse, for example, requires knowledge of both ontological concepts and "intelligible constructs" not typically modeled in ontologies. These intelligible constructs convey essential information that include notions of intensity, frequency, interval, dosage and sentiments, which could be important to the holistic needs of the information seeker. In this paper, we present a hybrid approach to domain specific information retrieval that integrates ontology-driven query interpretation with synonym-based query expansion and domain specific rules, to facilitate search in social media on prescription drug abuse. Our framework is based on a context-free grammar (CFG) that defines the query language of constructs interpretable by the search system. The grammar provides two levels of semantic interpretation: 1) a top-level CFG that facilitates retrieval of diverse textual patterns, which belong to broad templates and 2) a low-level CFG that enables interpretation of specific expressions belonging to such textual patterns. These low-level expressions occur as concepts from four different categories of data: 1) ontological concepts, 2) concepts in lexicons (such as emotions and sentiments), 3) concepts in lexicons with only partial ontology representation, called lexico-ontology concepts (such as side effects and routes of administration (ROA)), and 4) domain specific expressions (such as date, time, interval, frequency and dosage) derived solely through rules. Our approach is embodied in a novel Semantic Web platform called PREDOSE, which provides search support for complex domain specific information needs in prescription drug abuse epidemiology. When applied to a corpus of over 1 million drug abuse-related web forum posts, our search framework proved effective in retrieving
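
    As a toy illustration of such a two-level grammar (not PREDOSE's actual grammar; the rule names, the example drug and route terms, and the use of the `lark` parsing library are all assumptions made for the sketch):

    ```python
    from lark import Lark

    # Top level: a template DRUG ROA? FREQUENCY?; low level: lexical categories that
    # would normally be filled from ontologies, lexicons, and rules.
    grammar = r"""
        query: drug roa? freq?              // top-level textual pattern
        drug:  "suboxone" | "loperamide"    // ontological concepts (illustrative)
        roa:   "oral" | "insufflation"      // lexico-ontology concepts (illustrative)
        freq:  NUMBER "times" "a" "day"     // rule-derived expression
        %import common.NUMBER
        %import common.WS
        %ignore WS
    """

    parser = Lark(grammar, start="query")
    print(parser.parse("suboxone oral 3 times a day").pretty())
    ```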

  12. Mapping query terms to data and schema using content based similarity search in clinical information systems.

    Science.gov (United States)

    Safari, Leila; Patrick, Jon D

    2013-01-01

    This paper reports on the issues in mapping the terms of a query to the field names of the schema of an Entity Relationship (ER) model or to the data part of the Entity Attribute Value (EAV) model using a similarity-based Top-K algorithm in a clinical information system, together with an extension of EAV mapping for medication names. In addition, the details of the mapping algorithm and the required pre-processing, including NLP (Natural Language Processing) tasks to prepare resources for mapping, are explained. The experimental results on an example clinical information system demonstrate more than 84 per cent accuracy in mapping. The results will be integrated into our proposed Clinical Data Analytics Language (CliniDAL) to automate the mapping process in CliniDAL.
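
    CliniDAL's Top-K algorithm is not specified in this abstract; a minimal stand-in for similarity-based ranking of candidate schema field names against a query term could look like this (purely illustrative):

    ```python
    import difflib

    def top_k_field_matches(term, field_names, k=3):
        """Rank schema field names by string similarity to a query term."""
        scored = [(difflib.SequenceMatcher(None, term.lower(), name.lower()).ratio(), name)
                  for name in field_names]
        return sorted(scored, reverse=True)[:k]

    # e.g. top_k_field_matches("heart rate", ["HeartRate", "RespRate", "AdmissionDate"])
    ```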

  13. Long-term potentiation expands information content of hippocampal dentate gyrus synapses.

    Science.gov (United States)

    Bromer, Cailey; Bartol, Thomas M; Bowden, Jared B; Hubbard, Dusten D; Hanka, Dakota C; Gonzalez, Paola V; Kuwajima, Masaaki; Mendenhall, John M; Parker, Patrick H; Abraham, Wickliffe C; Sejnowski, Terrence J; Harris, Kristen M

    2018-03-06

    An approach combining signal detection theory and precise 3D reconstructions from serial section electron microscopy (3DEM) was used to investigate synaptic plasticity and information storage capacity at medial perforant path synapses in adult hippocampal dentate gyrus in vivo. Induction of long-term potentiation (LTP) markedly increased the frequencies of both small and large spines measured 30 minutes later. This bidirectional expansion resulted in heterosynaptic counterbalancing of total synaptic area per unit length of granule cell dendrite. Control hemispheres exhibited 6.5 distinct spine sizes for 2.7 bits of storage capacity while LTP resulted in 12.9 distinct spine sizes (3.7 bits). In contrast, control hippocampal CA1 synapses exhibited 4.7 bits with much greater synaptic precision than either control or potentiated dentate gyrus synapses. Thus, synaptic plasticity altered total capacity, yet hippocampal subregions differed dramatically in their synaptic information storage capacity, reflecting their diverse functions and activation histories.
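
    The storage-capacity figures are consistent with taking the base-2 logarithm of the number of distinguishable spine sizes:

    $$ \text{bits} = \log_{2}(\text{distinguishable sizes}): \qquad \log_{2} 6.5 \approx 2.7, \qquad \log_{2} 12.9 \approx 3.7, $$

    and, by the same arithmetic, the 4.7 bits reported for CA1 synapses corresponds to roughly 2^{4.7} ≈ 26 distinguishable sizes.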

  14. Firm Size and the Information Content of Over-the-Counter Common Stock Offerings

    OpenAIRE

    Robert M. Hull; George E. Pinches

    1995-01-01

    We examine the announcement period of stock returns for 179 over-the-counter (OTC) firms that issue common stock to reduce nonconvertible debt. We find that small OTC firms experience returns that are significantly more negative than large OTC firms. Regression tests reveal that firm size is a significant factor in accounting for stock returns. Other tests establish as firm size a dominant effect. Our support for a firm size effect is consistent with a differential information effect given th...

  15. Apparatus and method for gaining the whole information content of radiographic pictures

    International Nuclear Information System (INIS)

    Sasdi, A.

    1978-01-01

    Methods for depth size determination of welding errors from radiographic films are reviewed. Based on information theory the processes of exposure, development and evaluation were studied. The density function of the radiographic film is considered to be the critical filter of the system. A model of a high resolution electronic density meter is introduced. The nonlinearity of the density function was compensated by electronic filtering. The development of a new type radiographic instrument using video technique is proposed. (R.J.)

  16. Ontology lexicalization: Relationship between content and meaning in the context of Information Retrieval

    Directory of Open Access Journals (Sweden)

    Marcelo SCHIESSL

    Full Text Available The proposal presented in this study seeks to properly map natural language to ontologies and vice-versa. Therefore, the semi-automatic creation of a lexical database in Brazilian Portuguese containing morphological, syntactic, and semantic information that can be read by machines was proposed, allowing the link between structured and unstructured data and its integration into an information retrieval model to improve precision. The results obtained demonstrated that the methodology can be used in the risco financeiro (financial risk) domain in Portuguese for the construction of an ontology and the lexical-semantic database, and for the proposal of a semantic information retrieval model. In order to evaluate the performance of the proposed model, documents containing the main definitions of the financial risk domain were selected and indexed with and without semantic annotation. To enable the comparison between the approaches, two databases were created: the first represents the traditional search, and the second contains the index built from the texts with semantic annotations, representing the semantic search. The evaluation of the proposal was based on recall and precision. The queries submitted to the model showed that the semantic search outperforms the traditional search and validates the methodology used. Although more complex, the proposed procedure can be used in all kinds of domains.

  17. Maximum Acceleration Recording Circuit

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1995-01-01

    Coarsely digitized maximum levels recorded in blown fuses. Circuit feeds power to accelerometer and makes nonvolatile record of maximum level to which output of accelerometer rises during measurement interval. In comparison with inertia-type single-preset-trip-point mechanical maximum-acceleration-recording devices, circuit weighs less, occupies less space, and records accelerations within narrower bands of uncertainty. In comparison with prior electronic data-acquisition systems designed for same purpose, circuit simpler, less bulky, consumes less power, costs less, and avoids playback and analysis of data recorded in magnetic or electronic memory devices. Circuit used, for example, to record accelerations to which commodities subjected during transportation on trucks.

  18. A Qualitative Study to Inform the Design, Content and Structure of an Interactive SMS Messaging Service in Chikwawa, Malawi.

    Directory of Open Access Journals (Sweden)

    Rebecca Laidlaw

    2015-10-01

    Five themes were identified encapsulating the opinions and beliefs of the residents in Chimoto and Sikenala: Current Health Education Practices, Message Content, Mobile Phone Access, Trust in the SCHI, and Sustainability. Current Health Education Practices refers to the current availability of health education, the access to such information, and the self-reported need for further information in more accessible forms. Message Content depicts participants’ need for practical application of the messages they receive and adequate information for the participant to make an informed choice about their own health. In terms of the SMS messaging service, participants had no preference as to frequency or volume of messages but stated they would prefer to receive messages outside school hours. Mobile Phone Access represents the participants’ fears around accessibility of the service to those without mobile devices; the sharing of messages and mobile phones with friends and family was discussed as a potential method to overcome this barrier. Trust in the SCHI depicts the residents’ positive views of the project and their belief in the content of the messages because they trust the SCHI as the source, especially if they recognised a designated project mobile number. This was affirmed by their declaration to share messages with those without phone access. Wariness of the service was identified only in terms of cost, because of negative experiences with other subscription services. Finally, Sustainability encapsulates the participants’ views on the long-term aims of the messaging service and their request for the project to follow up with visits and services in addition to the messages, particularly emphasising a need for face-to-face communication. Conclusions: From this analysis it appears that the sampled participants are on board with the messaging service, and have provided in-depth, detailed examples of the type of information they require, specifically

  19. How do field of view and resolution affect the information content of panoramic scenes for visual navigation? A computational investigation.

    Science.gov (United States)

    Wystrach, Antoine; Dewar, Alex; Philippides, Andrew; Graham, Paul

    2016-02-01

    The visual systems of animals have to provide information to guide behaviour and the informational requirements of an animal's behavioural repertoire are often reflected in its sensory system. For insects, this is often evident in the optical array of the compound eye. One behaviour that insects share with many animals is the use of learnt visual information for navigation. As ants are expert visual navigators it may be that their vision is optimised for navigation. Here we take a computational approach in asking how the details of the optical array influence the informational content of scenes used in simple view matching strategies for orientation. We find that robust orientation is best achieved with low-resolution visual information and a large field of view, similar to the optical properties seen for many ant species. A lower resolution allows for a trade-off between specificity and generalisation for stored views. Additionally, our simulations show that orientation performance increases if different portions of the visual field are considered as discrete visual sensors, each giving an independent directional estimate. This suggests that ants might benefit by processing information from their two eyes independently.
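
    A minimal sketch of the kind of view-matching computation such simulations rely on, namely a rotational image difference over a stored panoramic snapshot (the array layout and the mean-squared-error mismatch measure are assumptions made for the example):

    ```python
    import numpy as np

    def best_heading(current_view, stored_view):
        """Estimate orientation by rotating a panoramic view column-wise and
        comparing it with a stored snapshot.

        Both views are 2D arrays (elevation x azimuth) of pixel intensities,
        e.g. low-resolution panoramas; columns correspond to azimuth.
        """
        n_cols = current_view.shape[1]
        errors = [np.mean((np.roll(current_view, s, axis=1) - stored_view) ** 2)
                  for s in range(n_cols)]           # try every azimuthal rotation
        best = int(np.argmin(errors))               # rotation minimising the mismatch
        return best * 360.0 / n_cols, errors        # heading estimate in degrees
    ```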

  20. Determining the Content of a Pediatric Asthma Website from Parents’ Perspective: The Internet Use and Information Needs

    Directory of Open Access Journals (Sweden)

    Rezvan Ansari

    2017-06-01

    Full Text Available Background: The acquisition of knowledge by parents of children with asthma plays an important role in the treatment of children. Thus, it is important to understand their needs and provide this information through available methods such as a website. The aim of this study was to determine the content of a pediatric asthma website based on an evaluation of parents' information needs. Materials and Methods: This cross-sectional study was conducted using a descriptive-analytical approach in Kerman, Iran. Data were collected using a semi-structured questionnaire. The questionnaire was distributed among a sample of 300 parents visiting allergy and asthma specialists’ offices. Three experts confirmed the validity of the questionnaire. The reliability of the questionnaire was confirmed using the test-retest method on 40 participants (r = 0.82). Data were analyzed using descriptive and analytical statistics with SPSS version 20.0 software. Results: Participants demanded information concerning asthma nutrition (79.0%), prevention (78.1%), treatment (77.1%), and medications (72.4%), as well as general information (71.4%) and information about the etiology of the disease (70.5%), respectively. The results showed that fathers use the Internet significantly more than mothers (p = 0.0001). There was also a statistically significant relationship between participants’ educational level and the type of resources they use to obtain information.

  1. Exploring the effectiveness of obstetrics and gynecology information systems in hospitals of a developing country: A qualitative content analysis

    Directory of Open Access Journals (Sweden)

    Hassan Babamohamadi

    2016-07-01

    Full Text Available Obstetrics and gynecology information systems are designed to replace paper charts, interact with other clinical wards of the hospital, and support better patient care. This qualitative study was performed to explore midwives' perceptions of the effectiveness of these information systems. Data were collected through semi-structured, in-depth interviews and analyzed using content analysis and the constant comparison method. Participants were 15 midwives from obstetrics and gynecology units of hospitals affiliated with Semnan University of Medical Sciences, Iran. Purposeful sampling was used and continued until data saturation. The themes that emerged from the interviews were divided into strengths and weaknesses. The strengths included facilitating the recording of information and reducing costs and time; the weaknesses were repetition of tasks, low computer literacy of the staff, system restrictions on recording and editing, unavailability of the system, and a reduced role for midwives in patient care. Midwives faced challenges in using the information systems, indicating shortcomings in the quality of the systems. Reinforcing the strengths and resolving hardware and software problems could increase obstetrics and gynecology staff's acceptance of the information system and reduce their cultural resistance toward it.

  2. Shared vision, shared vulnerability: A content analysis of corporate social responsibility information on tobacco industry websites.

    Science.gov (United States)

    McDaniel, Patricia A; Cadman, Brie; Malone, Ruth E

    2016-08-01

    Tobacco companies rely on corporate social responsibility (CSR) initiatives to improve their public image and advance their political objectives, which include thwarting or undermining tobacco control policies. For these reasons, implementation guidelines for the World Health Organization's Framework Convention on Tobacco Control (FCTC) recommend curtailing or prohibiting tobacco industry CSR. To understand how and where major tobacco companies focus their CSR resources, we explored CSR-related content on 4 US and 4 multinational tobacco company websites in February 2014. The websites described a range of CSR-related activities, many common across all companies, and no programs were unique to a particular company. The websites mentioned CSR activities in 58 countries, representing nearly every region of the world. Tobacco companies appear to have a shared vision about what constitutes CSR, due perhaps to shared vulnerabilities. Most countries that host tobacco company CSR programs are parties to the FCTC, highlighting the need for full implementation of the treaty, and for funding to monitor CSR activity, replace industry philanthropy, and enforce existing bans. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Quantal basis of vesicle growth and information content, a unified approach.

    Science.gov (United States)

    Nitzany, Eyal; Hammel, Ilan; Meilijson, Isaac

    2010-09-07

    Secretory vesicles express a periodic multimodal size distribution. The successive modes are integral multiples of the smallest mode (G(1)). The vesicle content ranges from macromolecules (proteins, mucopolysaccharides and hormones) to low molecular weight molecules (neurotransmitters). A steady-state model has been developed to emulate a mechanism for the introduction of vesicles of monomer size, which grow by a unit addition mechanism, G(1)+G(n)-->G(n+1) which, at a later stage are eliminated from the system. We describe a model of growth and elimination transition rates which adequately illustrates the distributions of vesicle population size at steady-state and upon elimination. Consequently, prediction of normal behavior and pathological perturbations is feasible. Careful analysis of spontaneous secretion, as compared to short burst-induced secretion, suggests that the basic character-code for reliable communication should be within a range of only 8-10 vesicles' burst which may serve as a yes/no message. Copyright 2010 Elsevier Ltd. All rights reserved.
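
    A toy stochastic sketch of such a unit-addition scheme (entry of G1 vesicles, fusion G1 + Gn -> Gn+1, and random elimination; the event probabilities are illustrative, not the authors' fitted rates):

    ```python
    import random
    from collections import Counter

    def simulate_vesicle_modes(steps=200_000, p_new=0.45, p_grow=0.45, seed=1):
        """Return counts per quantal mode G1, G2, ... after `steps` random events."""
        random.seed(seed)
        sizes = [1]                                      # vesicle sizes in G1 units
        for _ in range(steps):
            r = random.random()
            monomers = [i for i, s in enumerate(sizes) if s == 1]
            if r < p_new:
                sizes.append(1)                          # a new G1 vesicle enters
            elif r < p_new + p_grow and monomers and len(sizes) > 1:
                donor = random.choice(monomers)          # a G1 is consumed...
                acceptor = random.randrange(len(sizes))
                while acceptor == donor:
                    acceptor = random.randrange(len(sizes))
                sizes[acceptor] += 1                     # ...and Gn becomes G(n+1)
                sizes.pop(donor)
            elif sizes:
                sizes.pop(random.randrange(len(sizes)))  # elimination from the system
        return Counter(sizes)

    print(sorted(simulate_vesicle_modes().items()))
    ```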

  4. Informing Estimates of Program Effects for Studies of Mathematics Professional Development Using Teacher Content Knowledge Outcomes.

    Science.gov (United States)

    Phelps, Geoffrey; Kelcey, Benjamin; Jones, Nathan; Liu, Shuangshuang

    2016-10-03

    Mathematics professional development is widely offered, typically with the goal of improving teachers' content knowledge, the quality of teaching, and ultimately students' achievement. Recently, new assessments focused on mathematical knowledge for teaching (MKT) have been developed to assist in the evaluation and improvement of mathematics professional development. This study presents empirical estimates of average program change in MKT and its variation with the goal of supporting the design of experimental trials that are adequately powered to detect a specified program effect. The study drew on a large database representing five different assessments of MKT and collectively 326 professional development programs and 9,365 teachers. Results from cross-classified hierarchical growth models found that standardized average change estimates across the five assessments ranged from a low of 0.16 standard deviations (SDs) to a high of 0.26 SDs. Power analyses using the estimated pre- and posttest change estimates indicated that hundreds of teachers are needed to detect changes in knowledge at the lower end of the distribution. Even studies powered to detect effects at the higher end of the distribution will require substantial resources to conduct rigorous experimental trials. Empirical benchmarks that describe average program change and its variation provide a useful preliminary resource for interpreting the relative magnitude of effect sizes associated with professional development programs and for designing adequately powered trials. © The Author(s) 2016.
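
    For a sense of scale, a naive power calculation for a paired pre/post design (one that ignores the clustering of teachers within programs handled by the study's cross-classified models) reproduces the "hundreds of teachers" conclusion; the alpha and power values below are conventional assumptions, not figures taken from the paper:

    ```python
    from statsmodels.stats.power import TTestPower

    # Paired (one-sample) t-test on pre/post gain scores, two-sided alpha = 0.05, 80% power.
    analysis = TTestPower()
    for d in (0.16, 0.26):    # standardized change estimates reported across assessments
        n = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80)
        print(f"effect size {d:.2f}: about {n:.0f} teachers")
    # roughly 300+ teachers at d = 0.16 and about 120 at d = 0.26
    ```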

  5. Rapid Ethical Assessment on Informed Consent Content and Procedure in Hintalo-Wajirat, Northern Ethiopia: A Qualitative Study.

    Directory of Open Access Journals (Sweden)

    Serebe Abay

    Full Text Available Informed consent is a key component of bio-medical research involving human participants. However, obtaining informed consent is challenging in low literacy and resource limited settings. Rapid Ethical Assessment (REA can be used to contextualize and simplify consent information within a given study community. The current study aimed to explore the effects of social, cultural, and religious factors during informed consent process on a proposed HPV-serotype prevalence study.A qualitative community-based REA was conducted in Adigudom and Mynebri Kebeles, Northern Ethiopia, from July to August 2013. Data were collected by a multi-disciplinary team using open ended questions concerning informed consent components in relation to the parent study. The team conducted one-to-one In-Depth Interviews (IDI and Focus Group Discussions (FGDs with key informants and community members to collect data based on the themes of the study. Tape recorded data were transcribed in Tigrigna and then translated into English. Data were categorized and thematically analyzed using open coding and content analysis based on pre-defined themes.The REA study revealed a number of socio-cultural issues relevant to the proposed study. Low community awareness about health research, participant rights and cervical cancer were documented. Giving a vaginal sample for testing was considered to be highly embarrassing, whereas giving a blood sample made participants worry that they might be given a result without the possibility of treatment. Verbal consent was preferred to written consent for the proposed study.This rapid ethical assessment disclosed important socio-cultural issues which might act as barriers to informed decision making. The findings were important for contextual modification of the Information Sheet, and to guide the best consent process for the proposed study. Both are likely to have enabled participants to understand the informed consent better and consequently to

  6. THE Ĝ SEARCH FOR EXTRATERRESTRIAL CIVILIZATIONS WITH LARGE ENERGY SUPPLIES. IV. THE SIGNATURES AND INFORMATION CONTENT OF TRANSITING MEGASTRUCTURES

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Jason T.; Cartier, Kimberly M. S.; Zhao, Ming; Jontof-Hutter, Daniel; Ford, Eric B. [Department of Astronomy and Astrophysics, and Center for Exoplanets and Habitable Worlds, 525 Davey Lab, The Pennsylvania State University, University Park, PA, 16802 (United States)

    2016-01-01

    Arnold, Forgan, and Korpela et al. noted that planet-sized artificial structures could be discovered with Kepler as they transit their host star. We present a general discussion of transiting megastructures, and enumerate 10 potential ways their anomalous silhouettes, orbits, and transmission properties would distinguish them from exoplanets. We also enumerate the natural sources of such signatures. Several anomalous objects, such as KIC 12557548 and CoRoT-29, have variability in depth consistent with Arnold’s prediction and/or an asymmetric shape consistent with Forgan’s model. Since well-motivated physical models have so far provided natural explanations for these signals, the ETI hypothesis is not warranted for these objects, but they still serve as useful examples of how non-standard transit signatures might be identified and interpreted in a SETI context. Boyajian et al. recently announced KIC 8462852, an object with a bizarre light curve consistent with a “swarm” of megastructures. We suggest that this is an outstanding SETI target. We develop the normalized information content statistic M to quantify the information content in a signal embedded in a discrete series of bounded measurements, such as variable transit depths, and show that it can be used to distinguish among constant sources, interstellar beacons, and naturally stochastic or artificial, information-rich signals. We apply this formalism to KIC 12557548 and a specific form of beacon suggested by Arnold to illustrate its utility.
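
    The statistic M itself is defined in the paper relative to the measurement uncertainties; the sketch below is only a loose analogue, a normalized Shannon entropy of binned transit depths, meant to illustrate how a bounded, discrete series can be scored between a constant source (near 0) and a maximally varied, information-rich one (near 1). The depth series are synthetic.

```python
import numpy as np

def normalized_information(depths, n_bins=16):
    """Shannon entropy of a binned, bounded series, normalized to [0, 1].
    A constant source scores 0; a maximally varied series approaches 1."""
    depths = np.asarray(depths, dtype=float)
    lo, hi = depths.min(), depths.max()
    if hi == lo:
        return 0.0                      # constant source carries no information
    hist, _ = np.histogram(depths, bins=n_bins, range=(lo, hi))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(n_bins))

rng = np.random.default_rng(0)
constant = np.full(500, 0.01)
stochastic = rng.uniform(0.0, 0.02, 500)   # e.g., dusty, variable-depth transits
print(normalized_information(constant), normalized_information(stochastic))
```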

  7. Text analysis of radiation information in newspaper articles headlines and internet contents after the Fukushima Nuclear Power Plant accident

    International Nuclear Information System (INIS)

    Kanda, Reiko; Tsuji, Satsuki; Yonehara, Hidenori

    2014-01-01

    In general, the press is considered to have amplified the public's anxiety and perception of risk. In the present study, we analyzed newspaper article headlines and Internet contents that were released from March 11, 2011 to January 31, 2012 using text mining techniques. The aim is to reveal the particular characteristics of the information propagated regarding the Fukushima NPP Accident. The article headlines of the newspapers with the largest circulation were chosen for analysis, and Internet media contents were chosen based on the number of times they were linked or retweeted. According to our text mining analysis, newspapers frequently reported the 'measurement, investigation and examination' of radiation/radioactive materials caused by the Fukushima Accident, and this information might have spread selectively via social media. On the other hand, words related to the health effects of radiation exposure (i. e., cancer, hereditary effects) were rare in newspaper headlines. Instead, words like 'anxiety' and 'safe' were often used to convey the degree of health effects. Particularly in March of 2011, the concept of 'danger' was used frequently in newspaper headlines. These indirect characterizations of the situation may have contributed to some extent to the misunderstanding of the health effects and to the enhanced perception of risk felt by the public. In conclusion, no evidence was found to suggest that newspaper or Internet media users released sensational information that increased the health anxiety of readers throughout the period of analysis. (author)
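
    A minimal sketch of the term-frequency step in such a text-mining analysis is shown below; the headlines are invented English stand-ins (the study analyzed Japanese-language headlines and posts), and the tokenizer and stop-word list are generic scikit-learn defaults rather than the tools used by the authors.

```python
from sklearn.feature_extraction.text import CountVectorizer

# Hypothetical English stand-ins for newspaper headlines.
headlines = [
    "Radiation measurement survey begins near the plant",
    "Investigation of radioactive materials in food continues",
    "Residents voice anxiety over radiation safety",
    "Government says tap water is safe after examination",
]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(headlines)
totals = counts.sum(axis=0).A1                 # total frequency of each term
vocab = vectorizer.get_feature_names_out()
for term, freq in sorted(zip(vocab, totals), key=lambda x: -x[1])[:8]:
    print(term, freq)
```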

  8. The Search for Extraterrestrial Civilizations with Large Energy Supplies. IV. The Signatures and Information Content of Transiting Megastructures

    Science.gov (United States)

    Wright, Jason T.; Cartier, Kimberly M. S.; Zhao, Ming; Jontof-Hutter, Daniel; Ford, Eric B.

    2016-01-01

    Arnold, Forgan, and Korpela et al. noted that planet-sized artificial structures could be discovered with Kepler as they transit their host star. We present a general discussion of transiting megastructures, and enumerate 10 potential ways their anomalous silhouettes, orbits, and transmission properties would distinguish them from exoplanets. We also enumerate the natural sources of such signatures. Several anomalous objects, such as KIC 12557548 and CoRoT-29, have variability in depth consistent with Arnold’s prediction and/or an asymmetric shape consistent with Forgan’s model. Since well-motivated physical models have so far provided natural explanations for these signals, the ETI hypothesis is not warranted for these objects, but they still serve as useful examples of how non-standard transit signatures might be identified and interpreted in a SETI context. Boyajian et al. recently announced KIC 8462852, an object with a bizarre light curve consistent with a “swarm” of megastructures. We suggest that this is an outstanding SETI target. We develop the normalized information content statistic M to quantify the information content in a signal embedded in a discrete series of bounded measurements, such as variable transit depths, and show that it can be used to distinguish among constant sources, interstellar beacons, and naturally stochastic or artificial, information-rich signals. We apply this formalism to KIC 12557548 and a specific form of beacon suggested by Arnold to illustrate its utility.

  9. Pharmaceutical companies and their drugs on social media: a content analysis of drug information on popular social media sites.

    Science.gov (United States)

    Tyrawski, Jennifer; DeAndrea, David C

    2015-06-01

    Many concerns have been raised about pharmaceutical companies marketing their drugs directly to consumers on social media. This form of direct-to-consumer advertising (DTCA) can be interactive and, because it is largely unmonitored, the benefits of pharmaceutical treatment could easily be overemphasized compared to the risks. Additionally, nonexpert consumers can share their own drug product testimonials on social media and illegal online pharmacies can market their services on popular social media sites. There is great potential for the public to be exposed to misleading or dangerous information about pharmaceutical drugs on social media. Our central aim was to examine how pharmaceutical companies use social media to interact with the general public and market their drugs. We also sought to analyze the nature of information that appears in search results for widely used pharmaceutical drugs in the United States on Facebook, Twitter, and YouTube with a particular emphasis on the presence of illegal pharmacies. Content analyses were performed on (1) social media content on the Facebook, Twitter, and YouTube accounts of the top 15 pharmaceutical companies in the world and (2) the content that appears when searching on Facebook, Twitter, and YouTube for the top 20 pharmaceutical drugs purchased in the United States. Notably, for the company-specific analysis, we examined the presence of information similar to various forms of DTCA, the audience reach of company postings, and the quantity and quality of company-consumer interaction. For the drug-specific analysis, we documented the presence of illegal pharmacies, personal testimonials, and drug efficacy claims. From the company-specific analysis, we found information similar to help-seeking DTCA in 40.7% (301/740) of pharmaceutical companies' social media posts. Drug product claims were present in only 1.6% (12/740) of posts. Overall, there was a substantial amount of consumers who interacted with pharmaceutical

  10. Pharmaceutical Companies and Their Drugs on Social Media: A Content Analysis of Drug Information on Popular Social Media Sites

    Science.gov (United States)

    2015-01-01

    Background Many concerns have been raised about pharmaceutical companies marketing their drugs directly to consumers on social media. This form of direct-to-consumer advertising (DTCA) can be interactive and, because it is largely unmonitored, the benefits of pharmaceutical treatment could easily be overemphasized compared to the risks. Additionally, nonexpert consumers can share their own drug product testimonials on social media and illegal online pharmacies can market their services on popular social media sites. There is great potential for the public to be exposed to misleading or dangerous information about pharmaceutical drugs on social media. Objective Our central aim was to examine how pharmaceutical companies use social media to interact with the general public and market their drugs. We also sought to analyze the nature of information that appears in search results for widely used pharmaceutical drugs in the United States on Facebook, Twitter, and YouTube with a particular emphasis on the presence of illegal pharmacies. Methods Content analyses were performed on (1) social media content on the Facebook, Twitter, and YouTube accounts of the top 15 pharmaceutical companies in the world and (2) the content that appears when searching on Facebook, Twitter, and YouTube for the top 20 pharmaceutical drugs purchased in the United States. Notably, for the company-specific analysis, we examined the presence of information similar to various forms of DTCA, the audience reach of company postings, and the quantity and quality of company-consumer interaction. For the drug-specific analysis, we documented the presence of illegal pharmacies, personal testimonials, and drug efficacy claims. Results From the company-specific analysis, we found information similar to help-seeking DTCA in 40.7% (301/740) of pharmaceutical companies’ social media posts. Drug product claims were present in only 1.6% (12/740) of posts. Overall, there was a substantial amount of consumers

  11. The Information Content of Discrete Functions and Their Application in Genetic Data Analysis.

    Science.gov (United States)

    Sakhanenko, Nikita A; Kunert-Graf, James; Galas, David J

    2017-12-01

    The complex of central problems in data analysis consists of three components: (1) detecting the dependence of variables using quantitative measures, (2) defining the significance of these dependence measures, and (3) inferring the functional relationships among dependent variables. We have argued previously that an information theory approach allows separation of the detection problem from the inference of functional form problem. We approach here the third component of inferring functional forms based on information encoded in the functions. We present here a direct method for classifying the functional forms of discrete functions of three variables represented in data sets. Discrete variables are frequently encountered in data analysis, both as the result of inherently categorical variables and from the binning of continuous numerical variables into discrete alphabets of values. The fundamental question of how much information is contained in a given function is answered for these discrete functions, and their surprisingly complex relationships are illustrated. The all-important effect of noise on the inference of function classes is found to be highly heterogeneous and reveals some unexpected patterns. We apply this classification approach to an important area of biological data analysis-that of inference of genetic interactions. Genetic analysis provides a rich source of real and complex biological data analysis problems, and our general methods provide an analytical basis and tools for characterizing genetic problems and for analyzing genetic data. We illustrate the functional description and the classes of a number of common genetic interaction modes and also show how different modes vary widely in their sensitivity to noise.
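
    A minimal sketch of the underlying quantity, the Shannon entropy of a discrete function's output when its three inputs are uniform and independent, is given below. It is not the paper's full classification scheme, only an illustration of how balanced and unbalanced functions of three discrete variables carry different amounts of information.

```python
import itertools
import math
from collections import Counter

def function_entropy(f, alphabet=(0, 1, 2)):
    """Shannon entropy (bits) of the output of a discrete function of three
    variables, assuming independent, uniform inputs on `alphabet`."""
    outputs = Counter(f(x, y, z) for x, y, z in itertools.product(alphabet, repeat=3))
    n = sum(outputs.values())
    return -sum((c / n) * math.log2(c / n) for c in outputs.values())

# Two example functions over a ternary alphabet
xor_like = lambda x, y, z: (x + y + z) % 3       # balanced: maximal output entropy
threshold = lambda x, y, z: int(x + y + z >= 4)  # unbalanced binary output

print(function_entropy(xor_like))    # log2(3), about 1.585 bits
print(function_entropy(threshold))   # less than 1 bit
```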

  12. The incremental information content of earnings, working capital from operations, and cash flows

    Directory of Open Access Journals (Sweden)

    Simin Banifatemi Kashi

    2015-09-01

    Full Text Available This paper presents an empirical study to determine the effects of different factors, including present profit, depreciation, working capital, operating cash flow and other accruals, on future earnings stability. The study uses information on 124 firms selected from the Tehran Stock Exchange over the period 2006-2012. Using two regression analyses, the study has determined that as the fluctuation of profit increases, the profitability increases too. In addition, the study has concluded that firms with minimum fluctuations preserve more stable profitability. Moreover, firms with higher fluctuation in profitability maintain more volatile profitability for the next consecutive period.

  13. [Information content of immunologic parameters in the evaluation of the effects of hazardous substances].

    Science.gov (United States)

    Litovskaia, A V; Sadovskiĭ, V V; Vifleemskiĭ, A B

    1995-01-01

    Clinical and immunologic examination including level 1 and level 2 tests covered 429 staffers of chemical enterprises and 1122 workers engaged in microbiological synthesis of proteins, both groups being exposed to irritating gases and isocyanates. Using Kullback's criterion, the study selected informative parameters to diagnose immune disturbances caused by occupational hazards. For an integral evaluation of immune state, the authors applied a general immunologic parameter, the values of which can serve as criteria for early diagnosis of various immune disorders and for the definition of risk groups among industrial workers exposed to occupational biologic and chemical hazards.

  14. Maximum Quantum Entropy Method

    OpenAIRE

    Sim, Jae-Hoon; Han, Myung Joon

    2018-01-01

    Maximum entropy method for analytic continuation is extended by introducing quantum relative entropy. This new method is formulated in terms of matrix-valued functions and therefore invariant under arbitrary unitary transformation of input matrix. As a result, the continuation of off-diagonal elements becomes straightforward. Without introducing any further ambiguity, the Bayesian probabilistic interpretation is maintained just as in the conventional maximum entropy method. The applications o...
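
    The abstract is truncated, but the central quantity it introduces, the quantum relative entropy S(ρ||σ) = Tr[ρ(log ρ − log σ)], is easy to illustrate. The sketch below evaluates it for two hypothetical 2×2 density matrices; it is not an analytic-continuation code, just the entropy measure itself.

```python
import numpy as np
from scipy.linalg import logm

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)] in nats."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Two example 2x2 density matrices (Hermitian, positive, unit trace)
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])    # maximally mixed state

print(quantum_relative_entropy(rho, sigma))   # non-negative, zero iff rho == sigma
```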

  15. Maximum power demand cost

    International Nuclear Information System (INIS)

    Biondi, L.

    1998-01-01

    The charging for a service is a supplier's remuneration for the expenses incurred in providing it. There are currently two charges for electricity: consumption and maximum demand. While no problem arises about the former, the issue is more complicated for the latter, and the analysis in this article tends to show that the annual charge for maximum demand arbitrarily discriminates among consumer groups, to the disadvantage of some.

  16. Annual summary of the Oak Ridge Environmental Information System 1994 data base contents

    International Nuclear Information System (INIS)

    James, T.L.; Zygmunt, B.C.; Hines, J.F.

    1995-04-01

    The environmental measurements and geographic data bases of the Oak Ridge Environmental Information System (OREIS) contain data of known quality that can be accessed by OREIS users. The data within OREIS include environmental measurements data from the following environmental media: groundwater, surface water, sediment, soils, air, and biota. The types of environmental data within OREIS include but are not limited to chemical, biological, ecological, radiological, geophysical, and lithological data. Coordinate data within the environmental measurements data base provide the spatial context of the measurements data and are used to link the measurements data to the geographic data base. Descriptive and qualifier metadata are also part of the data bases. As of 30 September 1994, the OREIS environmental measurements data base consisted of approximately 380,000 rows associated with data generated by environmental restoration projects. The data base also contained 3,400 supporting codes and other reference data rows. Geographic data included the S-16A base map for the Oak Ridge Reservation, boundaries for operable units and ORNL waste area groupings, boundaries of groundwater coordination areas, contours generated as a result of the gamma radiation survey, representations of the environmentally sensitive areas, information received as part of the remedial investigation of East Fork Poplar Creek, high resolution background raster images for the three ORR installations, and locations of wells and other point features generated from ORACLE tables

  17. The electron localization as the information content of the conditional pair density

    Energy Technology Data Exchange (ETDEWEB)

    Urbina, Andres S.; Torres, F. Javier [Universidad San Francisco de Quito (USFQ), Grupo de Química Computacional y Teórica (QCT-USFQ), Departamento de Química e Ingeniería Química, Diego de Robles y Via Interoceanica, Quito 17-1200-841 (Ecuador); Universidad San Francisco de Quito (USFQ), Instituto de Simulación Computacional (ISC-USFQ), Diego de Robles y Via Interoceanica, Quito 17-1200-841 (Ecuador); Rincon, Luis, E-mail: lrincon@usfq.edu.ec, E-mail: lrincon@ula.ve [Universidad San Francisco de Quito (USFQ), Grupo de Química Computacional y Teórica (QCT-USFQ), Departamento de Química e Ingeniería Química, Diego de Robles y Via Interoceanica, Quito 17-1200-841 (Ecuador); Universidad San Francisco de Quito (USFQ), Instituto de Simulación Computacional (ISC-USFQ), Diego de Robles y Via Interoceanica, Quito 17-1200-841 (Ecuador); Departamento de Química, Facultad de Ciencias, Universidad de Los Andes (ULA), La Hechicera, Mérida-5101 (Venezuela, Bolivarian Republic of)

    2016-06-28

    In the present work, the information gained by an electron for “knowing” about the position of another electron with the same spin is calculated using the Kullback-Leibler divergence (D_KL) between the same-spin conditional pair probability density and the marginal probability. D_KL is proposed as an electron localization measure, based on the observation that regions of space with high information gain can be associated with strongly correlated, localized electrons. Taking into consideration the scaling of D_KL with the number of σ-spin electrons of a system (N^σ), the quantity χ = (N^σ − 1) D_KL f_cut is introduced as a general descriptor that allows the quantification of electron localization in space. f_cut is defined such that it goes smoothly to zero for negligible densities. χ is computed for a selection of atomic and molecular systems in order to test its capability to determine the regions in space where electrons are localized. As a general conclusion, χ is able to explain the electron structure of molecules on chemical grounds with a high degree of success and to produce a clear differentiation of the localization of electrons that can be traced to the fluctuation in the average number of electrons in these regions.
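
    A minimal numerical sketch of the central quantity, the Kullback-Leibler divergence between a conditional and a marginal probability density, is given below. The one-dimensional Gaussian densities are synthetic stand-ins for the same-spin conditional pair density and the marginal density, chosen only to show that a sharply localized conditional density yields a large information gain.

```python
import numpy as np

def kl_divergence(p, q, dx):
    """D_KL(p || q) = integral of p ln(p/q) dx, approximated on a uniform grid."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx)

# Synthetic 1-D stand-ins: a sharply localized conditional density versus a
# broad marginal density (both normalized on the grid).
x = np.linspace(-5, 5, 2001)
dx = x[1] - x[0]
conditional = np.exp(-0.5 * ((x - 1.0) / 0.3) ** 2)
marginal = np.exp(-0.5 * (x / 1.5) ** 2)
conditional /= conditional.sum() * dx
marginal /= marginal.sum() * dx

# Large values signal a high information gain, i.e. strong localization.
print(kl_divergence(conditional, marginal, dx))
```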

  18. Risk communication and informed consent in the medical tourism industry: a thematic content analysis of Canadian broker websites.

    Science.gov (United States)

    Penney, Kali; Snyder, Jeremy; Crooks, Valorie A; Johnston, Rory

    2011-09-26

    Medical tourism, thought of as patients seeking non-emergency medical care outside of their home countries, is a growing industry worldwide. Canadians are amongst those engaging in medical tourism, and many are helped in the process of accessing care abroad by medical tourism brokers - agents who specialize in making international medical care arrangements for patients. As a key source of information for these patients, brokers are likely to play an important role in communicating the risks and benefits of undergoing surgery or other procedures abroad to their clientele. This raises important ethical concerns regarding processes such as informed consent and the liability of brokers in the event that complications arise from procedures. The purpose of this article is to examine the language, information, and online marketing of Canadian medical tourism brokers' websites in light of such ethical concerns. An exhaustive online search using multiple search engines and keywords was performed to compile a comprehensive directory of English-language Canadian medical tourism brokerage websites. These websites were examined using thematic content analysis, which included identifying informational themes, generating frequency counts of these themes, and comparing trends in these counts to the established literature. Seventeen websites were identified for inclusion in this study. It was found that Canadian medical tourism broker websites varied widely in scope, content, professionalism and depth of information. Three themes emerged from the thematic content analysis: training and accreditation, risk communication, and business dimensions. Third party accreditation bodies of debatable regulatory value were regularly mentioned on the reviewed websites, and discussion of surgical risk was absent on 47% of the websites reviewed, with limited discussion of risk on the remaining ones. Terminology describing brokers' roles was somewhat inconsistent across the websites. Finally

  19. Risk communication and informed consent in the medical tourism industry: A thematic content analysis of canadian broker websites

    Science.gov (United States)

    2011-01-01

    Background Medical tourism, thought of as patients seeking non-emergency medical care outside of their home countries, is a growing industry worldwide. Canadians are amongst those engaging in medical tourism, and many are helped in the process of accessing care abroad by medical tourism brokers - agents who specialize in making international medical care arrangements for patients. As a key source of information for these patients, brokers are likely to play an important role in communicating the risks and benefits of undergoing surgery or other procedures abroad to their clientele. This raises important ethical concerns regarding processes such as informed consent and the liability of brokers in the event that complications arise from procedures. The purpose of this article is to examine the language, information, and online marketing of Canadian medical tourism brokers' websites in light of such ethical concerns. Methods An exhaustive online search using multiple search engines and keywords was performed to compile a comprehensive directory of English-language Canadian medical tourism brokerage websites. These websites were examined using thematic content analysis, which included identifying informational themes, generating frequency counts of these themes, and comparing trends in these counts to the established literature. Results Seventeen websites were identified for inclusion in this study. It was found that Canadian medical tourism broker websites varied widely in scope, content, professionalism and depth of information. Three themes emerged from the thematic content analysis: training and accreditation, risk communication, and business dimensions. Third party accreditation bodies of debatable regulatory value were regularly mentioned on the reviewed websites, and discussion of surgical risk was absent on 47% of the websites reviewed, with limited discussion of risk on the remaining ones. Terminology describing brokers' roles was somewhat inconsistent across

  20. Risk communication and informed consent in the medical tourism industry: A thematic content analysis of canadian broker websites

    Directory of Open Access Journals (Sweden)

    Crooks Valorie A

    2011-09-01

    Full Text Available Abstract Background Medical tourism, thought of as patients seeking non-emergency medical care outside of their home countries, is a growing industry worldwide. Canadians are amongst those engaging in medical tourism, and many are helped in the process of accessing care abroad by medical tourism brokers - agents who specialize in making international medical care arrangements for patients. As a key source of information for these patients, brokers are likely to play an important role in communicating the risks and benefits of undergoing surgery or other procedures abroad to their clientele. This raises important ethical concerns regarding processes such as informed consent and the liability of brokers in the event that complications arise from procedures. The purpose of this article is to examine the language, information, and online marketing of Canadian medical tourism brokers' websites in light of such ethical concerns. Methods An exhaustive online search using multiple search engines and keywords was performed to compile a comprehensive directory of English-language Canadian medical tourism brokerage websites. These websites were examined using thematic content analysis, which included identifying informational themes, generating frequency counts of these themes, and comparing trends in these counts to the established literature. Results Seventeen websites were identified for inclusion in this study. It was found that Canadian medical tourism broker websites varied widely in scope, content, professionalism and depth of information. Three themes emerged from the thematic content analysis: training and accreditation, risk communication, and business dimensions. Third party accreditation bodies of debatable regulatory value were regularly mentioned on the reviewed websites, and discussion of surgical risk was absent on 47% of the websites reviewed, with limited discussion of risk on the remaining ones. Terminology describing brokers' roles was

  1. Maximum Entropy in Drug Discovery

    Directory of Open Access Journals (Sweden)

    Chih-Yuan Tseng

    2014-07-01

    Full Text Available Drug discovery applies multidisciplinary approaches either experimentally, computationally or both ways to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA)-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its root in thermodynamics, yet since Jaynes’ pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process information in hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.
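
    As a self-contained illustration of the principle itself (not of any particular drug-discovery pipeline), the sketch below finds the least-biased distribution over a discrete set of scores subject to a single mean constraint by numerically maximizing the Shannon entropy; the score range and target mean are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize

# Maximum entropy distribution on scores 0..10 constrained to have mean 3.0.
values = np.arange(11)
target_mean = 3.0

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))

constraints = (
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    {"type": "eq", "fun": lambda p: p @ values - target_mean},
)
p0 = np.full(values.size, 1.0 / values.size)
result = minimize(neg_entropy, p0, constraints=constraints,
                  bounds=[(0.0, 1.0)] * values.size)

# The maximum entropy solution has the Gibbs form p_i proportional to exp(lambda * v_i).
print(np.round(result.x, 4), result.x @ values)
```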

  2. Stem Cell Therapy on the Internet: Information Quality and Content Analysis of English Language Web Pages Returned by Google

    Directory of Open Access Journals (Sweden)

    Douglas Meehan

    2017-12-01

    Full Text Available There are expectations that stem cell therapy (SCT) will treat many currently untreatable diseases. The Internet is widely used by patients seeking information about new treatments, and hence analyzing websites provides a representative sample of the information available to the public. Our aim was to understand what information the public would find when searching for information on SCT on Google, as this would inform us on how lay people form their knowledge about SCT. We analyzed the content and information quality of the first 200 websites returned by a Google.com search on SCT. Most websites returned were from treatment centers (TCs; 44%), followed by news and medical professional websites. The specialty most mentioned in non-TC websites was “neurological” (67%), followed by “cardiovascular” (42%), while the most frequent indication for which SCT is offered by TCs was musculoskeletal (89%), followed by neurological (47%). 45% of the centers specialized in treating 1 specialty, 10% offered 2, and 45% offered between 3 and 18 different specialties. Of the 78 TCs, 65% were in the USA, 23% in Asia, and 8% in Latin America. None of the centers offered SCT based on embryonic cells. Health information quality (JAMA score, measuring trustworthiness) was lowest for TCs and commercial websites and highest for scientific journals and health portals. This study shows a disconnection between information about SCT and what is actually offered by TCs. The study also shows that TCs, potentially acting in a regulatory gray area, have a high visibility on the Internet.

  3. A Study on Gendered Portrayals in Children's Informational Books with Scientific Content

    Directory of Open Access Journals (Sweden)

    Patricia R. Ladd

    2012-12-01

    Full Text Available This study analyzes gender bias in children's informational books about science and science careers to determine how these early resources are affecting the disparity between males and females in science and engineering fields. The study focused on the number of male and female scientists both in pictures and text, and how much space was devoted to discussion of scientists of each gender. Overall, the findings of the study show that only 18% of the pictured scientists were female as well as only 16% of the scientists discussed in the text. These numbers are below current industry data that puts the number of females working in science and engineering fields at 26%.

  4. Annual summary of the contents of the Oak Ridge Environmental Information System (OREIS) 1993 data base

    International Nuclear Information System (INIS)

    McCord, R.A.; Herr, D.D.; Durfee, R.C.; Land, M.L.; Monroe, F.E.; Olson, R.J.; Thomas, J.K.; Tinnel, E.P.

    1994-06-01

    The data base of the Oak Ridge Environmental Information System (OREIS) contains data of known quality that can be accessed by OREIS users. OREIS meets data management/access requirements for environmental data as specified in the Federal Facility Agreement for the Oak Ridge Reservation and the State Oversight Agreement between the State of Tennessee and the Department of Energy. The types of environmental data within OREIS include measurement data from the following environmental disciplines: groundwater, surface water, sediment, soils, air, and biota. In addition to measurement data, the OREIS data base contains extensive descriptive and qualifier metadata to help define data quality and to enable end users to analyze the appropriateness of data for their purposes. Another important aspect of measurement data is their spatial context; OREIS maintains a comprehensive library of geographic data and tools to analyze and display spatial relationships of the data. As of November 1993, the OREIS data base consists of approximately 100,000 records associated with three environmental restoration projects along with coordinate data and background map data. The data base also contains 2,700 supporting codes and other reference data records. Geographic data include the S-16A base map for the Oak Ridge Reservation, boundaries for operable units, and high-resolution raster images for each of the sites

  5. Measuring what matters to patients: Using goal content to inform measure choice and development.

    Science.gov (United States)

    Jacob, Jenna; Edbrooke-Childs, Julian; Law, Duncan; Wolpert, Miranda

    2017-04-01

    Personalised care requires personalised outcomes and ways of feeding back clinically useful information to clinicians and practitioners, but it is not clear how to best personalise outcome measurement and feedback using existing standardised outcome measures. The constant comparison method of grounded theory was used to compare goal themes derived from goals set at the outset of therapy for 180 children aged between 4 and 17 years, visiting eight child and adolescent mental health services, to existing standardised outcome measures used as part of common national datasets. In all, 20 out of 27 goal themes corresponded to items on at least one commonly used outcome measure. Consideration of goal themes helped to identify potential relevant outcome measures. However, there were several goal themes that were not captured by items on standardised outcome measures. These seemed to be related to existential factors such as understanding, thinking about oneself and future planning. This presents a powerful framework for how clinicians can use goals to help select a standardised outcome measure (where this is helpful) in addition to the use of a goal-based outcome measure and personalise choices. There may be areas not captured by standardised outcome measures that may be important for children and young people and which may only be currently captured in goal measurement. There is an indication that we may not be measuring what is important to children and young people. We may need to develop or look for new measures that capture these areas.

  6. Trans fatty acid content in Serbian margarines: Urgent need for legislative changes and consumer information.

    Science.gov (United States)

    Vučić, Vesna; Arsić, Aleksandra; Petrović, Snježana; Milanović, Sandra; Gurinović, Mirjana; Glibetić, Maria

    2015-10-15

    This study examined the fatty acid (FA) composition of 13 (7 soft and 6 hard) Serbian margarines. Significantly higher amounts of trans fatty acids (TFA) were found in hard margarines (up to 28.84% of total FA), than in soft ones (0.17-6.89%). Saturated FA (SFA) were present with 22.76-51.17%. Oleic acid ranged from 26.78% to 43.78%. The proportion of polyunsaturated FA (PUFA) was 22.15-49.29% in soft margarines, but only 8.02-15.28% in hard margarines, probably due to the hydrogenisation process. The atherogenic and thrombogenic indexes (AI and TI, respectively) in soft margarines were relatively low (AI 0.23-0.63 and TI 0.44-0.97), but in hard margarines AI and particularly TI were high (1.03-1.67 and 1.96-3.04, respectively). These findings suggest that FA composition of Serbian margarines should be improved by replacing atherogenic TFA and SFA with beneficial ones, in order to avoid adverse effects on health. Therefore legislative changes and consumer information are urgently needed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Informational and emotional elements in online support groups: a Bayesian approach to large-scale content analysis.

    Science.gov (United States)

    Deetjen, Ulrike; Powell, John A

    2016-05-01

    This research examines the extent to which informational and emotional elements are employed in online support forums for 14 purposively sampled chronic medical conditions and the factors that influence whether posts are of a more informational or emotional nature. Large-scale qualitative data were obtained from Dailystrength.org. Based on a hand-coded training dataset, all posts were classified into informational or emotional using a Bayesian classification algorithm to generalize the findings. Posts that could not be classified with a probability of at least 75% were excluded. The overall tendency toward emotional posts differs by condition: mental health (depression, schizophrenia) and Alzheimer's disease consist of more emotional posts, while informational posts relate more to nonterminal physical conditions (irritable bowel syndrome, diabetes, asthma). There is no gender difference across conditions, although prostate cancer forums are oriented toward informational support, whereas breast cancer forums rather feature emotional support. Across diseases, the best predictors for emotional content are lower age and a higher number of overall posts by the support group member. The results are in line with previous empirical research and unify empirical findings from single/2-condition research. Limitations include the analytical restriction to predefined categories (informational, emotional) through the chosen machine-learning approach. Our findings provide an empirical foundation for building theory on informational versus emotional support across conditions, give insights for practitioners to better understand the role of online support groups for different patients, and show the usefulness of machine-learning approaches to analyze large-scale qualitative health data from online settings. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
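
    The sketch below illustrates the general approach described above, a Naive Bayes text classifier trained on hand-labelled posts, with predictions kept only when the class probability reaches 75%. The posts, labels and library choices (scikit-learn's CountVectorizer and MultinomialNB) are illustrative assumptions, not the study's actual data or implementation.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny hand-labelled training set standing in for the hand-coded posts.
posts = [
    "Which dosage worked best for you and how long until it helped?",
    "Has anyone compared side effects of the two treatments?",
    "I feel so alone tonight, just need someone to listen.",
    "Sending hugs, you are not alone in this fight.",
]
labels = ["informational", "informational", "emotional", "emotional"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(posts, labels)

new_posts = ["What blood tests should I ask my doctor for?",
             "Thank you all, your support means the world to me."]
probs = model.predict_proba(new_posts)
for post, p in zip(new_posts, probs):
    best = p.max()
    # Mirror the study's rule: discard posts classified below 75% probability.
    label = model.classes_[p.argmax()] if best >= 0.75 else "unclassified"
    print(f"{label:>13} ({best:.2f}): {post}")
```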

  8. Sedative and Analgesic Drugs Online: A Content Analysis of the Supply and Demand Information Available in Thailand.

    Science.gov (United States)

    Pinyopornpanish, Kanokporn; Jiraporncharoen, Wichuda; Thaikla, Kanittha; Yoonut, Kulyapa; Angkurawaranon, Chaisiri

    2018-03-21

    Evidence from other countries has suggested that many controlled drugs are also offered online, even though it is illegal to sell these drugs without a license. To evaluate the current contents related to the supply and demand of sedatives and analgesic drugs available online in Thailand, with a particular focus on Facebook. A team of reviewers manually searched for data by entering keywords related to analgesic drugs and sedatives. The contents of the website were screened for supply and demand-related information. A total of 5,352 websites were found publicly available. The number of websites and Facebook pages containing the information potentially related to the supply and demand of analgesic drugs and sedatives was limited. Nine websites sold sedatives, and six websites sold analgesics directly. Fourteen Facebook pages were found, including 7 sedative pages and 7 analgesic pages. Within one year, the three remaining active pages multiplied in the number of followers by three- to nine-fold. The most popular Facebook page had over 2,900 followers. Both the internet and social media contain sites and pages where sedatives and analgesics are illegally advertised. These websites are searchable through common search engines. Although the number of websites is limited, the number of followers on these Facebook pages does suggest a growing number of people who are interested in such pages. Our study emphasized the importance of monitoring and developing potential plans relative to the online marketing of prescription drugs in Thailand.

  9. RETRIEVAL OF AEROSOL MICROPHYSICAL PROPERTIES BASED ON THE OPTIMAL ESTIMATION METHOD: INFORMATION CONTENT ANALYSIS FOR SATELLITE POLARIMETRIC REMOTE SENSING MEASUREMENTS

    Directory of Open Access Journals (Sweden)

    W. Z. Hou

    2018-04-01

    Full Text Available This paper evaluates the information content for the retrieval of key aerosol microphysical and surface properties for multispectral single-viewing satellite polarimetric measurements centered at 410, 443, 555, 670, 865, 1610 and 2250 nm over bright land. To conduct the information content analysis, the synthetic data are simulated by the Unified Linearized Vector Radiative Transfer Model (UNLVTM) with the intensity and polarization together over bare soil surface for various scenarios. Following the optimal estimation theory, a principal component analysis method is employed to reconstruct the multispectral surface reflectance from 410 nm to 2250 nm, and then integrated with a linear one-parametric BPDF model to represent the contribution of polarized surface reflectance, thus further decoupling the surface-atmosphere contribution from the TOA measurements. Focusing on two different aerosol models with the aerosol optical depth equal to 0.8 at 550 nm, the total DFS and the DFS component of each retrieved aerosol and surface parameter are analysed. The DFS results show that the key aerosol microphysical properties, such as the fine- and coarse-mode columnar volume concentration, the effective radius and the real part of the complex refractive index at 550 nm, could be well retrieved together with the surface parameters over the bare soil surface type. The findings of this study can provide guidance for inversion algorithm development over bright land surfaces by making full use of single-viewing satellite polarimetric measurements.
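
    In the optimal estimation framework the DFS quoted above is the trace of the averaging kernel. The sketch below computes it from a Jacobian K, a measurement-error covariance and an a priori covariance following the standard Rodgers formulation; the matrices are random placeholders with illustrative dimensions, not UNLVTM output.

```python
import numpy as np

def dfs(K, S_eps, S_a):
    """Degrees of freedom for signal: trace of the averaging kernel
    A = (K^T S_eps^-1 K + S_a^-1)^-1 K^T S_eps^-1 K  (Rodgers, 2000)."""
    Si = np.linalg.inv(S_eps)
    Sai = np.linalg.inv(S_a)
    G = np.linalg.inv(K.T @ Si @ K + Sai) @ K.T @ Si   # gain matrix
    A = G @ K                                          # averaging kernel
    return float(np.trace(A)), A

rng = np.random.default_rng(42)
n_meas, n_state = 14, 6            # e.g., 7 bands x (intensity, polarization); 6 parameters
K = rng.normal(size=(n_meas, n_state))          # placeholder Jacobian
S_eps = np.diag(np.full(n_meas, 0.02 ** 2))     # measurement-noise covariance
S_a = np.diag(np.full(n_state, 1.0))            # a priori covariance

total_dfs, A = dfs(K, S_eps, S_a)
print(total_dfs, np.round(np.diag(A), 3))       # total DFS and per-parameter DFS
```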

  10. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design

    Science.gov (United States)

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-01-01

    Background: Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. Objectives: This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Materials and Methods: Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Results: Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. Conclusions: The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking. PMID:26835469

  11. Retrieval of Aerosol Microphysical Properties Based on the Optimal Estimation Method: Information Content Analysis for Satellite Polarimetric Remote Sensing Measurements

    Science.gov (United States)

    Hou, W. Z.; Li, Z. Q.; Zheng, F. X.; Qie, L. L.

    2018-04-01

    This paper evaluates the information content for the retrieval of key aerosol microphysical and surface properties for multispectral single-viewing satellite polarimetric measurements centered at 410, 443, 555, 670, 865, 1610 and 2250 nm over bright land. To conduct the information content analysis, the synthetic data are simulated by the Unified Linearized Vector Radiative Transfer Model (UNLVTM) with the intensity and polarization together over bare soil surface for various scenarios. Following the optimal estimation theory, a principal component analysis method is employed to reconstruct the multispectral surface reflectance from 410 nm to 2250 nm, and then integrated with a linear one-parametric BPDF model to represent the contribution of polarized surface reflectance, thus further decoupling the surface-atmosphere contribution from the TOA measurements. Focusing on two different aerosol models with the aerosol optical depth equal to 0.8 at 550 nm, the total DFS and the DFS component of each retrieved aerosol and surface parameter are analysed. The DFS results show that the key aerosol microphysical properties, such as the fine- and coarse-mode columnar volume concentration, the effective radius and the real part of the complex refractive index at 550 nm, could be well retrieved together with the surface parameters over the bare soil surface type. The findings of this study can provide guidance for inversion algorithm development over bright land surfaces by making full use of single-viewing satellite polarimetric measurements.

  12. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design.

    Science.gov (United States)

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-12-01

    Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking.

  13. Advertising Content

    OpenAIRE

    Simon P. Anderson; Régis Renault

    2002-01-01

    Empirical evidence suggests that most advertisements contain little direct information. Many do not mention prices. We analyze a firm's choice of advertising content and the information disclosed to consumers. A firm advertises only product information, price information, or both; and prefers to convey only limited product information if possible. Extending the "persuasion" game, we show that quality information takes precedence over price information and horizontal product information.T...

  14. THE INFORMATION CONTENT IN ANALYTIC SPOT MODELS OF BROADBAND PRECISION LIGHT CURVES

    Energy Technology Data Exchange (ETDEWEB)

    Walkowicz, Lucianne M. [Department of Astrophysical Sciences, Princeton University, Peyton Hall, 4 Ivy Lane, Princeton, NJ 08534 (United States); Basri, Gibor [Astronomy Department, University of California at Berkeley, Hearst Field Annex, Berkeley, CA 94720 (United States); Valenti, Jeff A. [Space Telescope Science Institute, 3700 San Martin Dr., Baltimore, MD 21218 (United States)

    2013-04-01

    We present the results of numerical experiments to assess degeneracies in light curve models of starspots. Using synthetic light curves generated with the Cheetah starspot modeling code, we explore the extent to which photometric light curves constrain spot model parameters, including spot latitudes and stellar inclination. We also investigate the effects of spot parameters and differential rotation on one's ability to correctly recover rotation periods and differential rotation in the Kepler light curves. We confirm that in the absence of additional constraints on the stellar inclination, such as spectroscopic measurements of vsin i or occultations of starspots by planetary transits, the spot latitude and stellar inclination are difficult to determine uniquely from the photometry alone. We find that for models with no differential rotation, spots that appear on opposite hemispheres of the star may cause one to interpret the rotation period to be half of the true period. When differential rotation is included, the changing longitude separation between spots breaks the symmetry of the hemispheres and the correct rotation period is more likely to be found. The dominant period found via periodogram analysis is typically that of the largest spot. Even when multiple spots with periods representative of the star's differential rotation exist, if one spot dominates the light curve the signal of differential rotation may not be detectable from the periodogram alone. Starspot modeling is applicable to stars with a wider range of rotation rates than other surface imaging techniques (such as Doppler imaging), allows subtle signatures of differential rotation to be measured, and may provide valuable information on the distribution of stellar spots. However, given the inherent degeneracies and uncertainty present in starspot models, caution should be exercised in their interpretation.
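
    A small sketch of the period-recovery step is given below: a synthetic light curve modulated by two unequal spots (sinusoidal placeholders, not Cheetah output) is analysed with a Lomb-Scargle periodogram, whose dominant peak tracks the larger spot, as the abstract describes.

```python
import numpy as np
from astropy.timeseries import LombScargle

# Synthetic spot-modulated light curve: 10-day rotation, two unequal spots on
# opposite hemispheres (placeholder amplitudes and noise level).
rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0, 90, 2000))                       # days
period = 10.0
flux = (1.0
        - 0.010 * (1 + np.cos(2 * np.pi * t / period)) / 2          # larger spot
        - 0.004 * (1 + np.cos(2 * np.pi * t / period + np.pi)) / 2  # smaller spot
        + rng.normal(0, 5e-4, t.size))

frequency, power = LombScargle(t, flux).autopower(maximum_frequency=2.0)
best_period = 1.0 / frequency[np.argmax(power)]
print(f"dominant period ~ {best_period:.2f} d")   # tracks the larger spot (~10 d)
```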

  15. THE INFORMATION CONTENT IN ANALYTIC SPOT MODELS OF BROADBAND PRECISION LIGHT CURVES

    International Nuclear Information System (INIS)

    Walkowicz, Lucianne M.; Basri, Gibor; Valenti, Jeff A.

    2013-01-01

    We present the results of numerical experiments to assess degeneracies in light curve models of starspots. Using synthetic light curves generated with the Cheetah starspot modeling code, we explore the extent to which photometric light curves constrain spot model parameters, including spot latitudes and stellar inclination. We also investigate the effects of spot parameters and differential rotation on one's ability to correctly recover rotation periods and differential rotation in the Kepler light curves. We confirm that in the absence of additional constraints on the stellar inclination, such as spectroscopic measurements of vsin i or occultations of starspots by planetary transits, the spot latitude and stellar inclination are difficult to determine uniquely from the photometry alone. We find that for models with no differential rotation, spots that appear on opposite hemispheres of the star may cause one to interpret the rotation period to be half of the true period. When differential rotation is included, the changing longitude separation between spots breaks the symmetry of the hemispheres and the correct rotation period is more likely to be found. The dominant period found via periodogram analysis is typically that of the largest spot. Even when multiple spots with periods representative of the star's differential rotation exist, if one spot dominates the light curve the signal of differential rotation may not be detectable from the periodogram alone. Starspot modeling is applicable to stars with a wider range of rotation rates than other surface imaging techniques (such as Doppler imaging), allows subtle signatures of differential rotation to be measured, and may provide valuable information on the distribution of stellar spots. However, given the inherent degeneracies and uncertainty present in starspot models, caution should be exercised in their interpretation.

  16. Informed consent and placebo effects: a content analysis of information leaflets to identify what clinical trial participants are told about placebos.

    Directory of Open Access Journals (Sweden)

    Felicity L Bishop

    Full Text Available Placebo groups are used in randomised clinical trials (RCTs to control for placebo effects, which can be large. Participants in trials can misunderstand written information particularly regarding technical aspects of trial design such as randomisation; the adequacy of written information about placebos has not been explored. We aimed to identify what participants in major RCTs in the UK are told about placebos and their effects.We conducted a content analysis of 45 Participant Information Leaflets (PILs using quantitative and qualitative methodologies. PILs were obtained from trials on a major registry of current UK clinical trials (the UKCRN database. Eligible leaflets were received from 44 non-commercial trials but only 1 commercial trial. The main limitation is the low response rate (13.5%, but characteristics of included trials were broadly representative of all non-commercial trials on the database. 84% of PILs were for trials with 50:50 randomisation ratios yet in almost every comparison the target treatments were prioritized over the placebos. Placebos were referred to significantly less frequently than target treatments (7 vs. 27 mentions, p<001 and were significantly less likely than target treatments to be described as triggering either beneficial effects (1 vs. 45, p<001 or adverse effects (4 vs. 39, p<001. 8 PILs (18% explicitly stated that the placebo treatment was either undesirable or ineffective.PILs from recent high quality clinical trials emphasise the benefits and adverse effects of the target treatment, while largely ignoring the possible effects of the placebo. Thus they provide incomplete and at times inaccurate information about placebos. Trial participants should be more fully informed about the health changes that they might experience from a placebo. To do otherwise jeopardises informed consent and is inconsistent with not only the science of placebos but also the fundamental rationale underpinning placebo controlled

  17. Information content and sensitivity of the 3β + 2α lidar measurement system for aerosol microphysical retrievals

    Science.gov (United States)

    Burton, Sharon P.; Chemyakin, Eduard; Liu, Xu; Knobelspiesse, Kirk; Stamnes, Snorre; Sawamura, Patricia; Moore, Richard H.; Hostetler, Chris A.; Ferrare, Richard A.

    2016-11-01

    There is considerable interest in retrieving profiles of aerosol effective radius, total number concentration, and complex refractive index from lidar measurements of extinction and backscatter at several wavelengths. The combination of three backscatter channels plus two extinction channels (3β + 2α) is particularly important since it is believed to be the minimum configuration necessary for the retrieval of aerosol microphysical properties and because the technological readiness of lidar systems permits this configuration on both an airborne and future spaceborne instrument. The second-generation NASA Langley airborne High Spectral Resolution Lidar (HSRL-2) has been making 3β + 2α measurements since 2012. The planned NASA Aerosol/Clouds/Ecosystems (ACE) satellite mission also recommends the 3β + 2α combination.Here we develop a deeper understanding of the information content and sensitivities of the 3β + 2α system in terms of aerosol microphysical parameters of interest. We use a retrieval-free methodology to determine the basic sensitivities of the measurements independent of retrieval assumptions and constraints. We calculate information content and uncertainty metrics using tools borrowed from the optimal estimation methodology based on Bayes' theorem, using a simplified forward model look-up table, with no explicit inversion. The forward model is simplified to represent spherical particles, monomodal log-normal size distributions, and wavelength-independent refractive indices. Since we only use the forward model with no retrieval, the given simplified aerosol scenario is applicable as a best case for all existing retrievals in the absence of additional constraints. Retrieval-dependent errors due to mismatch between retrieval assumptions and true atmospheric aerosols are not included in this sensitivity study, and neither are retrieval errors that may be introduced in the inversion process. The choice of a simplified model adds clarity to the
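
    Alongside the degrees of freedom for signal, a common scalar summary in this optimal estimation setting is the Shannon information content, H = 0.5 ln det(S_a S_hat^-1), the log of the factor by which the measurement shrinks the prior state-space volume. The sketch below evaluates it for a placeholder 5-measurement (3β + 2α) by 4-parameter Jacobian with diagonal covariances; none of the numbers come from HSRL-2.

```python
import numpy as np

def shannon_information_content(K, S_eps, S_a):
    """H = 0.5 * ln det(S_a @ inv(S_hat)) in nats, where S_hat is the posterior
    covariance of an optimal-estimation retrieval."""
    S_hat = np.linalg.inv(K.T @ np.linalg.inv(S_eps) @ K + np.linalg.inv(S_a))
    _, logdet = np.linalg.slogdet(S_a @ np.linalg.inv(S_hat))
    return 0.5 * logdet

rng = np.random.default_rng(3)
K = rng.normal(size=(5, 4))                    # placeholder 3b+2a Jacobian (5 x 4)
S_eps = np.diag(np.full(5, 0.05 ** 2))         # measurement-error covariance
S_a = np.diag(np.full(4, 1.0))                 # a priori covariance
print(shannon_information_content(K, S_eps, S_a))
```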

  18. Maximum likely scale estimation

    DEFF Research Database (Denmark)

    Loog, Marco; Pedersen, Kim Steenstrup; Markussen, Bo

    2005-01-01

    A maximum likelihood local scale estimation principle is presented. An actual implementation of the estimation principle uses second order moments of multiple measurements at a fixed location in the image. These measurements consist of Gaussian derivatives possibly taken at several scales and/or ...

  19. Robust Maximum Association Estimators

    NARCIS (Netherlands)

    A. Alfons (Andreas); C. Croux (Christophe); P. Filzmoser (Peter)

    2017-01-01

    textabstractThe maximum association between two multivariate variables X and Y is defined as the maximal value that a bivariate association measure between one-dimensional projections αX and αY can attain. Taking the Pearson correlation as projection index results in the first canonical correlation

  20. Population distribution of flexible molecules from maximum entropy analysis using different priors as background information: application to the Φ, Ψ-conformational space of the α-(1-->2)-linked mannose disaccharide present in N- and O-linked glycoproteins.

    Science.gov (United States)

    Säwén, Elin; Massad, Tariq; Landersjö, Clas; Damberg, Peter; Widmalm, Göran

    2010-08-21

The conformational space available to the flexible molecule α-D-Manp-(1-->2)-α-D-Manp-OMe, a model for the α-(1-->2)-linked mannose disaccharide in N- or O-linked glycoproteins, is determined using experimental data and molecular simulation combined with a maximum entropy approach that leads to a converged population distribution utilizing different input information. A database survey of the Protein Data Bank where structures having the constituent disaccharide were retrieved resulted in an ensemble with >200 structures. Subsequent filtering removed erroneous structures and gave the database (DB) ensemble having three classes of mannose-containing compounds, viz., N- and O-linked structures, and ligands to proteins. A molecular dynamics (MD) simulation of the disaccharide revealed a two-state equilibrium with a major and a minor conformational state, i.e., the MD ensemble. These two different conformation ensembles of the disaccharide were compared to measured experimental spectroscopic data for the molecule in water solution. However, neither of the two populations was compatible with experimental data from optical rotation, NMR ¹H,¹H cross-relaxation rates as well as homo- and heteronuclear ³J couplings. The conformational distributions were subsequently used as background information to generate priors that were used in a maximum entropy analysis. The resulting posteriors, i.e., the population distributions after the application of the maximum entropy analysis, still showed notable deviations that were not anticipated based on the prior information. Therefore, reparameterization of homo- and heteronuclear Karplus relationships for the glycosidic torsion angles Φ and Ψ was carried out, in which the importance of electronegative substituents on the coupling pathway was deemed essential, resulting in four derived equations, two ³J(COCC) and two ³J(COCH), being different for the Φ and Ψ torsions, respectively. These Karplus relationships are denoted

  1. "Wow! Look at That!": Discourse as a Means to Improve Teachers' Science Content Learning in Informal Science Institutions

    Science.gov (United States)

    Holliday, Gary M.; Lederman, Judith S.; Lederman, Norman G.

    2014-12-01

Currently, it is not clear whether professional development staff at Informal Science Institutions (ISIs) are considering the way exhibits contribute to the social aspects of learning as described by the contextual model of learning (CML) (Falk & Dierking in The museum experience. Whalesback, Washington, 1992; Learning from museums: visitor experiences and the making of meaning. Altamira Press, New York, 2000) and recommended in the reform documents (see Cox-Peterson et al. in Journal of Research in Science Teaching 40:200-218, 2003). Preparing science teachers for field trips is necessary, but to move beyond that alone it is also important to understand the role exhibits play in influencing teachers' content-related social interactions while they are engaged in ISI professional development. This study looked at a life science course that was offered at and taught by education staff of a large science and technology museum located in the Midwest, USA. The course was offered to three sections of teachers throughout the school year and met six times for a full day. The courses met approximately once a month from September through the beginning of June and provided 42 contact hours overall. Elementary and middle school teachers (n = 94) were audio- and videotaped while participating in the content courses and interacting with the museum's exhibits. When considered against the two factors within the sociocultural context of CML (within-group sociocultural mediation and facilitated mediation by others), the use of exhibits during both courses generally did not take these elements fully into account. In this study, it seemed that teachers' talk always had a purpose but it is argued that it did not always have a direction or connection to the desired content or exhibit. When freely exploring the museum, teachers often purely reacted to the display itself or the novelty of it. However, when PD staff made explicit connections between exhibits, content, and activities, participants were

  2. Characterizing the information content of cloud thermodynamic phase retrievals from the notional PACE OCI shortwave reflectance measurements

    Science.gov (United States)

    Coddington, O. M.; Vukicevic, T.; Schmidt, K. S.; Platnick, S.

    2017-08-01

    We rigorously quantify the probability of liquid or ice thermodynamic phase using only shortwave spectral channels specific to the National Aeronautics and Space Administration's Moderate Resolution Imaging Spectroradiometer, Visible Infrared Imaging Radiometer Suite, and the notional future Plankton, Aerosol, Cloud, ocean Ecosystem imager. The results show that two shortwave-infrared channels (2135 and 2250 nm) provide more information on cloud thermodynamic phase than either channel alone; in one case, the probability of ice phase retrieval increases from 65 to 82% by combining 2135 and 2250 nm channels. The analysis is performed with a nonlinear statistical estimation approach, the GEneralized Nonlinear Retrieval Analysis (GENRA). The GENRA technique has previously been used to quantify the retrieval of cloud optical properties from passive shortwave observations, for an assumed thermodynamic phase. Here we present the methodology needed to extend the utility of GENRA to a binary thermodynamic phase space (i.e., liquid or ice). We apply formal information content metrics to quantify our results; two of these (mutual and conditional information) have not previously been used in the field of cloud studies.
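For a binary liquid/ice phase variable, the mutual and conditional information metrics referred to above reduce to sums over a small joint probability table. The toy sketch below (with invented probabilities, not GENRA output) shows how mutual information between phase and a discretized two-channel reflectance observation could be evaluated.

```python
import numpy as np

# Toy joint probability table P(phase, observation bin): rows = {liquid, ice},
# columns = discretized bins of the (2135 nm, 2250 nm) reflectance pair.
# Values are invented for illustration; they sum to 1.
P = np.array([[0.20, 0.15, 0.05, 0.05],
              [0.03, 0.07, 0.20, 0.25]])

P_phase = P.sum(axis=1)   # marginal over observations
P_obs = P.sum(axis=0)     # marginal over phase

# Mutual information I(phase; obs) = sum p(x,y) log2[p(x,y) / (p(x) p(y))]
mask = P > 0
MI = np.sum(P[mask] * np.log2(P[mask] / np.outer(P_phase, P_obs)[mask]))

# Conditional entropy H(phase | obs) = H(phase) - I(phase; obs)
H_phase = -np.sum(P_phase * np.log2(P_phase))
print(f"I(phase; obs) = {MI:.3f} bits, H(phase|obs) = {H_phase - MI:.3f} bits")
```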

  3. Maximum Path Information and Fokker Planck Equation

    Science.gov (United States)

    Li, Wei; Wang A., Q.; LeMehaute, A.

    2008-04-01

We present a rigorous method to derive the nonlinear Fokker-Planck (FP) equation of anomalous diffusion directly from a generalization of the principle of least action of Maupertuis proposed by Wang [Chaos, Solitons & Fractals 23 (2005) 1253] for smooth or quasi-smooth irregular dynamics evolving in a Markovian process. The FP equation obtained may take two different but equivalent forms. It was also found that the diffusion constant may depend on both q (the index of the Tsallis entropy [J. Stat. Phys. 52 (1988) 479]) and the time t.

  4. High-performance information search filters for acute kidney injury content in PubMed, Ovid Medline and Embase.

    Science.gov (United States)

    Hildebrand, Ainslie M; Iansavichus, Arthur V; Haynes, R Brian; Wilczynski, Nancy L; Mehta, Ravindra L; Parikh, Chirag R; Garg, Amit X

    2014-04-01

    We frequently fail to identify articles relevant to the subject of acute kidney injury (AKI) when searching the large bibliographic databases such as PubMed, Ovid Medline or Embase. To address this issue, we used computer automation to create information search filters to better identify articles relevant to AKI in these databases. We first manually reviewed a sample of 22 992 full-text articles and used prespecified criteria to determine whether each article contained AKI content or not. In the development phase (two-thirds of the sample), we developed and tested the performance of >1.3-million unique filters. Filters with high sensitivity and high specificity for the identification of AKI articles were then retested in the validation phase (remaining third of the sample). We succeeded in developing and validating high-performance AKI search filters for each bibliographic database with sensitivities and specificities in excess of 90%. Filters optimized for sensitivity reached at least 97.2% sensitivity, and filters optimized for specificity reached at least 99.5% specificity. The filters were complex; for example one PubMed filter included >140 terms used in combination, including 'acute kidney injury', 'tubular necrosis', 'azotemia' and 'ischemic injury'. In proof-of-concept searches, physicians found more articles relevant to topics in AKI with the use of the filters. PubMed, Ovid Medline and Embase can be filtered for articles relevant to AKI in a reliable manner. These high-performance information filters are now available online and can be used to better identify AKI content in large bibliographic databases.
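The diagnostic-style figures quoted for these filters follow the usual 2x2 contingency definitions of sensitivity and specificity. Below is a minimal sketch of how a candidate Boolean filter could be scored against a manually labelled reference set; the filter terms and records are invented examples, not the validated filters described in the article.

```python
# Score a candidate keyword filter against manually labelled records.
# Records and filter terms below are invented for illustration only.
records = [
    ("Acute kidney injury after cardiac surgery", True),
    ("Ischemic injury and acute tubular necrosis in mice", True),
    ("Azotemia in hospitalized patients", True),
    ("Chronic heart failure management", False),
    ("Hypertension treatment guidelines", False),
]
filter_terms = ["acute kidney injury", "tubular necrosis", "azotemia", "ischemic injury"]

def filter_hits(title: str) -> bool:
    """Simple OR-combination of filter terms, case-insensitive."""
    t = title.lower()
    return any(term in t for term in filter_terms)

tp = sum(1 for title, rel in records if rel and filter_hits(title))
fn = sum(1 for title, rel in records if rel and not filter_hits(title))
tn = sum(1 for title, rel in records if not rel and not filter_hits(title))
fp = sum(1 for title, rel in records if not rel and filter_hits(title))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```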

  5. Maximum power point tracking

    International Nuclear Information System (INIS)

    Enslin, J.H.R.

    1990-01-01

A well engineered renewable remote energy system, utilizing the principle of Maximum Power Point Tracking, can be more cost effective, has a higher reliability and can improve the quality of life in remote areas. This paper reports that a high-efficiency power electronic converter, for converting the output voltage of a solar panel or wind generator to the required DC battery bus voltage, has been realized. The converter is controlled to track the maximum power point of the input source under varying input and output parameters. Maximum power point tracking for relatively small systems is achieved by maximization of the output current in a battery charging regulator, using an optimized hill-climbing, inexpensive microprocessor-based algorithm. Through practical field measurements it is shown that a minimum input source saving of 15% on 3-5 kWh/day systems can easily be achieved. A total cost saving of at least 10-15% on the capital cost of these systems is achievable for relatively small-rating Remote Area Power Supply systems. The advantages at larger temperature variations and for larger power-rated systems are much higher. Other advantages include optimal sizing and system monitoring and control
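The "hill-climbing" maximization of charging current described above is commonly implemented as a perturb-and-observe loop. The sketch below is a generic illustration of that control idea under simplifying assumptions (a known static power curve standing in for the measured charging current); it is not the optimized microprocessor algorithm of the paper.

```python
def measured_power(duty):
    """Stand-in for the measured charging power at a given converter duty cycle.
    A concave curve with its maximum at duty = 0.62 (illustrative only)."""
    return max(0.0, 100.0 - 400.0 * (duty - 0.62) ** 2)

def perturb_and_observe(duty=0.5, step=0.01, iterations=200):
    """Hill-climbing MPPT: keep stepping in the direction that increased power."""
    p_prev = measured_power(duty)
    for _ in range(iterations):
        duty = min(max(duty + step, 0.0), 1.0)   # apply the perturbation
        p_now = measured_power(duty)
        if p_now < p_prev:                       # power dropped: reverse direction
            step = -step
        p_prev = p_now
    return duty, p_prev

duty, power = perturb_and_observe()
print(f"operating point settles near duty = {duty:.2f}, power = {power:.1f} W")
```

In a real regulator the perturbed quantity would be the converter duty cycle or reference voltage and the observed quantity the battery charging current, sampled by the microcontroller each control period.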

  6. A Content Analysis of Health and Safety Communications Among Internet-Based Sex Work Advertisements: Important Information for Public Health.

    Science.gov (United States)

    Kille, Julie; Bungay, Vicky; Oliffe, John; Atchison, Chris

    2017-04-13

    The capacity to advertise via the Internet continues to contribute to the shifting dynamics in adult commercial sex work. eHealth interventions have shown promise to promote Internet-based sex workers' health and safety internationally, yet minimal attention has been paid in Canada to developing such interventions. Understanding the information communicated in Internet-based sex work advertisements is a critical step in knowledge development to inform such interventions. The purpose of this content analysis was to increase our understanding of the health and safety information within the Internet advertisements among women, men, and transgender sex workers and to describe how this information may be utilized to inform eHealth service development for this population. A total of 75 Internet-based sex worker advertisements (45 women, 24 men, and 6 transgender persons) were purposefully selected from 226 advertisements collected as part of a larger study in Western Canada. Content analysis was employed to guide data extraction about demographic characteristics, sexual services provided, service restrictions, health practices and concerns, safety and security, and business practices. Frequencies for each variable were calculated and further classified by gender. Thematic analysis was then undertaken to situate the communications within the social and commercialized contexts of the sex industry. Four communications themes were identified: (1) demographic characteristics; (2) sexual services; (3) health; and (4) safety and security. White was the most common ethnicity (46/75, 61%) of advertisements. It was found that 20-29 years of age accounted for 32 of the 51 advertisements that provided age. Escort, the only legal business title, was the most common role title used (48/75, 64%). In total, 85% (64/75) of advertisements detailed lists of sexual services provided and 41% (31/75) of advertisements noted never offering uncovered services (ie, no condom). Gender and the

  7. A Content Analysis of Health and Safety Communications Among Internet-Based Sex Work Advertisements: Important Information for Public Health

    Science.gov (United States)

    Atchison, Chris

    2017-01-01

    Background The capacity to advertise via the Internet continues to contribute to the shifting dynamics in adult commercial sex work. eHealth interventions have shown promise to promote Internet-based sex workers’ health and safety internationally, yet minimal attention has been paid in Canada to developing such interventions. Understanding the information communicated in Internet-based sex work advertisements is a critical step in knowledge development to inform such interventions. Objective The purpose of this content analysis was to increase our understanding of the health and safety information within the Internet advertisements among women, men, and transgender sex workers and to describe how this information may be utilized to inform eHealth service development for this population. Methods A total of 75 Internet-based sex worker advertisements (45 women, 24 men, and 6 transgender persons) were purposefully selected from 226 advertisements collected as part of a larger study in Western Canada. Content analysis was employed to guide data extraction about demographic characteristics, sexual services provided, service restrictions, health practices and concerns, safety and security, and business practices. Frequencies for each variable were calculated and further classified by gender. Thematic analysis was then undertaken to situate the communications within the social and commercialized contexts of the sex industry. Results Four communications themes were identified: (1) demographic characteristics; (2) sexual services; (3) health; and (4) safety and security. White was the most common ethnicity (46/75, 61%) of advertisements. It was found that 20-29 years of age accounted for 32 of the 51 advertisements that provided age. Escort, the only legal business title, was the most common role title used (48/75, 64%). In total, 85% (64/75) of advertisements detailed lists of sexual services provided and 41% (31/75) of advertisements noted never offering uncovered

  8. Information content in frequency-dependent, multi-offset GPR data for layered media reconstruction using full-wave inversion

    Science.gov (United States)

    De Coster, Albéric; Phuong Tran, Anh; Lambot, Sébastien

    2014-05-01

    Water lost through leaks can represent high percentages of the total production in water supply systems and constitutes an important issue. Leak detection can be tackled with various techniques such as the ground-penetrating radar (GPR). Based on this technology, various procedures have been elaborated to characterize a leak and its evolution. In this study, we focus on a new full-wave radar modelling approach for near-field conditions, which takes into account the antenna effects as well as the interactions between the antenna(s) and the medium through frequency-dependent global transmission and reflection coefficients. This approach is applied to layered media for which 3-D Green's functions can be calculated. The model allows for a quantitative estimation of the properties of multilayered media by using full-wave inversion. This method, however, proves to be limited to provide users with an on-demand assessment as it is generally computationally demanding and time consuming, depending on the medium configuration as well as the number of unknown parameters to retrieve. In that respect, we propose two leads in order to enhance the parameter retrieval step. The first one consists in analyzing the impact of the reduction of the number of frequencies on the information content. For both numerical and laboratory experiments, this operation has been achieved by investigating the response surface topography of objective functions arising from the comparison between measured and modelled data. The second one involves the numerical implementation of multistatic antenna configurations with constant and variable offsets in the model. These two kinds of analyses are then combined in numerical experiments to observe the conjugated effect of the number of frequencies and the offset configuration. To perform the numerical analyses, synthetic Green's functions were simulated for different multilayered medium configurations. The results show that an antenna offset increase leads

  9. High-performance information search filters for CKD content in PubMed, Ovid MEDLINE, and EMBASE.

    Science.gov (United States)

    Iansavichus, Arthur V; Hildebrand, Ainslie M; Haynes, R Brian; Wilczynski, Nancy L; Levin, Adeera; Hemmelgarn, Brenda R; Tu, Karen; Nesrallah, Gihad E; Nash, Danielle M; Garg, Amit X

    2015-01-01

    Finding relevant articles in large bibliographic databases such as PubMed, Ovid MEDLINE, and EMBASE to inform care and future research is challenging. Articles relevant to chronic kidney disease (CKD) are particularly difficult to find because they are often published under different terminology and are found across a wide range of journal types. We used computer automation within a diagnostic test assessment framework to develop and validate information search filters to identify CKD articles in large bibliographic databases. 22,992 full-text articles in PubMed, Ovid MEDLINE, or EMBASE. 1,374,148 unique search filters. We established the reference standard of article relevance to CKD by manual review of all full-text articles using prespecified criteria to determine whether each article contained CKD content or not. We then assessed filter performance by calculating sensitivity, specificity, and positive predictive value for the retrieval of CKD articles. Filters with high sensitivity and specificity for the identification of CKD articles in the development phase (two-thirds of the sample) were then retested in the validation phase (remaining one-third of the sample). We developed and validated high-performance CKD search filters for each bibliographic database. Filters optimized for sensitivity reached at least 99% sensitivity, and filters optimized for specificity reached at least 97% specificity. The filters were complex; for example, one PubMed filter included more than 89 terms used in combination, including "chronic kidney disease," "renal insufficiency," and "renal fibrosis." In proof-of-concept searches, physicians found more articles relevant to the topic of CKD with the use of these filters. As knowledge of the pathogenesis of CKD grows and definitions change, these filters will need to be updated to incorporate new terminology used to index relevant articles. PubMed, Ovid MEDLINE, and EMBASE can be filtered reliably for articles relevant to CKD. These

  10. Sexual and Reproductive Health Services and Related Health Information on Pregnancy Resource Center Websites: A Statewide Content Analysis.

    Science.gov (United States)

    Swartzendruber, Andrea; Newton-Levinson, Anna; Feuchs, Ashley E; Phillips, Ashley L; Hickey, Jennifer; Steiner, Riley J

    Pregnancy resource centers (PRCs) are nonprofit organizations with a primary mission of promoting childbirth among pregnant women. Given a new state grant program to publicly fund PRCs, we analyzed Georgia PRC websites to describe advertised services and related health information. We systematically identified all accessible Georgia PRC websites available from April to June 2016. Entire websites were obtained and coded using defined protocols. Of 64 reviewed websites, pregnancy tests and testing (98%) and options counseling (84%) were most frequently advertised. However, 58% of sites did not provide notice that PRCs do not provide or refer for abortion, and 53% included false or misleading statements regarding the need to make a decision about abortion or links between abortion and mental health problems or breast cancer. Advertised contraceptive services were limited to counseling about natural family planning (3%) and emergency contraception (14%). Most sites (89%) did not provide notice that PRCs do not provide or refer for contraceptives. Two sites (3%) advertised unproven "abortion reversal" services. Approximately 63% advertised ultrasound examinations, 22% sexually transmitted infection testing, and 5% sexually transmitted infection treatment. None promoted consistent and correct condom use; 78% with content about condoms included statements that seemed to be designed to undermine confidence in condom effectiveness. Approximately 84% advertised educational programs, and 61% material resources. Georgia PRC websites contain high levels of false and misleading health information; the advertised services do not seem to align with prevailing medical guidelines. Public funding for PRCs, an increasing national trend, should be rigorously examined. Increased regulation may be warranted to ensure quality health information and services. Copyright © 2017 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  11. A content analysis of the quantity and accuracy of dietary supplement information found in magazines with high adolescent readership.

    Science.gov (United States)

    Shaw, Patricia; Zhang, Vivien; Metallinos-Katsaras, Elizabeth

    2009-02-01

    The objective of this study was to examine the quantity and accuracy of dietary supplement (DS) information through magazines with high adolescent readership. Eight (8) magazines (3 teen and 5 adult with high teen readership) were selected. A content analysis for DS was conducted on advertisements and editorials (i.e., articles, advice columns, and bulletins). Noted claims/cautions regarding DS were evaluated for accuracy using Medlineplus.gov and Naturaldatabase.com. Claims for dietary supplements with three or more types of ingredients and those in advertisements were not evaluated. Advertisements were evaluated with respect to size, referenced research, testimonials, and Dietary Supplement Health and Education Act of 1994 (DSHEA) warning visibility. Eighty-eight (88) issues from eight magazines yielded 238 DS references. Fifty (50) issues from five magazines contained no DS reference. Among teen magazines, seven DS references were found: five in the editorials and two in advertisements. In adult magazines, 231 DS references were found: 139 in editorials and 92 in advertisements. Of the 88 claims evaluated, 15% were accurate, 23% were inconclusive, 3% were inaccurate, 5% were partially accurate, and 55% were unsubstantiated (i.e., not listed in reference databases). Of the 94 DS evaluated in advertisements, 43% were full page or more, 79% did not have a DSHEA warning visible, 46% referred to research, and 32% used testimonials. Teen magazines contain few references to DS, none accurate. Adult magazines that have a high teen readership contain a substantial amount of DS information with questionable accuracy, raising concerns that this information may increase the chances of inappropriate DS use by adolescents, thereby increasing the potential for unexpected effects or possible harm.

  12. The last glacial maximum

    Science.gov (United States)

    Clark, P.U.; Dyke, A.S.; Shakun, J.D.; Carlson, A.E.; Clark, J.; Wohlfarth, B.; Mitrovica, J.X.; Hostetler, S.W.; McCabe, A.M.

    2009-01-01

We used 5704 ¹⁴C, ¹⁰Be, and ³He ages that span the interval from 10,000 to 50,000 years ago (10 to 50 ka) to constrain the timing of the Last Glacial Maximum (LGM) in terms of global ice-sheet and mountain-glacier extent. Growth of the ice sheets to their maximum positions occurred between 33.0 and 26.5 ka in response to climate forcing from decreases in northern summer insolation, tropical Pacific sea surface temperatures, and atmospheric CO2. Nearly all ice sheets were at their LGM positions from 26.5 ka to 19 to 20 ka, corresponding to minima in these forcings. The onset of Northern Hemisphere deglaciation 19 to 20 ka was induced by an increase in northern summer insolation, providing the source for an abrupt rise in sea level. The onset of deglaciation of the West Antarctic Ice Sheet occurred between 14 and 15 ka, consistent with evidence that this was the primary source for an abrupt rise in sea level ~14.5 ka.

  13. Strengthening Health Information Services

    Science.gov (United States)

    Haro, A. S.

    1977-01-01

    Discusses the need to apply modern scientific management to health administration in order to effectively manage programs utilizing increased preventive and curative capabilities. The value of having maximum information in order to make decisions, and problems of determining information content are reviewed. For journal availability, see SO 506…

  14. Content-based image retrieval using spatial layout information in brain tumor T1-weighted contrast-enhanced MR images.

    Science.gov (United States)

    Huang, Meiyan; Yang, Wei; Wu, Yao; Jiang, Jun; Gao, Yang; Chen, Yang; Feng, Qianjin; Chen, Wufan; Lu, Zhentai

    2014-01-01

This study aims to develop a content-based image retrieval (CBIR) system for the retrieval of T1-weighted contrast-enhanced MR (CE-MR) images of brain tumors. When a tumor region is fed to the CBIR system as a query, the system attempts to retrieve tumors of the same pathological category. The bag-of-visual-words (BoVW) model with partition learning is incorporated into the system to extract informative features for representing the image contents. Furthermore, a distance metric learning algorithm called the Rank Error-based Metric Learning (REML) is proposed to reduce the semantic gap between low-level visual features and high-level semantic concepts. The effectiveness of the proposed method is evaluated on a brain T1-weighted CE-MR dataset with three types of brain tumors (i.e., meningioma, glioma, and pituitary tumor). Using the BoVW model with partition learning, the mean average precision (mAP) of retrieval increases by more than 4.6% with the learned distance metrics compared with the spatial pyramid BoVW method. The distance metric learned by REML significantly outperforms three other existing distance metric learning methods in terms of mAP. The mAP of the CBIR system is as high as 91.8% using the proposed method, and the precision can reach 93.1% when the top 10 images are returned by the system. These preliminary results demonstrate that the proposed method is effective and feasible for the retrieval of brain tumors in T1-weighted CE-MR images.
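In its basic form, the bag-of-visual-words representation referred to above quantizes local patch descriptors against a learned codebook and histograms the assignments. The sketch below shows that basic pipeline with a tiny NumPy k-means on random stand-in descriptors; the partition learning and REML metric learning of the paper are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, iters=20):
    """Minimal k-means returning the codebook (cluster centers)."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def bovw_histogram(descriptors, codebook):
    """Assign each local descriptor to its nearest visual word and histogram."""
    labels = np.argmin(((descriptors[:, None, :] - codebook[None]) ** 2).sum(-1), axis=1)
    hist = np.bincount(labels, minlength=len(codebook)).astype(float)
    return hist / hist.sum()            # normalized BoVW feature vector

# Invented stand-ins: 500 training descriptors and one query image's descriptors.
train_desc = rng.normal(size=(500, 16))
query_desc = rng.normal(size=(80, 16))

codebook = kmeans(train_desc, k=32)
feature = bovw_histogram(query_desc, codebook)
print(feature.shape, feature.sum())     # (32,) 1.0
```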

  15. Content-based image retrieval using spatial layout information in brain tumor T1-weighted contrast-enhanced MR images.

    Directory of Open Access Journals (Sweden)

    Meiyan Huang

Full Text Available This study aims to develop a content-based image retrieval (CBIR) system for the retrieval of T1-weighted contrast-enhanced MR (CE-MR) images of brain tumors. When a tumor region is fed to the CBIR system as a query, the system attempts to retrieve tumors of the same pathological category. The bag-of-visual-words (BoVW) model with partition learning is incorporated into the system to extract informative features for representing the image contents. Furthermore, a distance metric learning algorithm called the Rank Error-based Metric Learning (REML) is proposed to reduce the semantic gap between low-level visual features and high-level semantic concepts. The effectiveness of the proposed method is evaluated on a brain T1-weighted CE-MR dataset with three types of brain tumors (i.e., meningioma, glioma, and pituitary tumor). Using the BoVW model with partition learning, the mean average precision (mAP) of retrieval increases by more than 4.6% with the learned distance metrics compared with the spatial pyramid BoVW method. The distance metric learned by REML significantly outperforms three other existing distance metric learning methods in terms of mAP. The mAP of the CBIR system is as high as 91.8% using the proposed method, and the precision can reach 93.1% when the top 10 images are returned by the system. These preliminary results demonstrate that the proposed method is effective and feasible for the retrieval of brain tumors in T1-weighted CE-MR images.

  16. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  17. Introduction to maximum entropy

    International Nuclear Information System (INIS)

    Sivia, D.S.

    1988-01-01

    The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. We review the need for such methods in data analysis and show, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. We conclude with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution. 12 refs., 8 figs., 1 tab

  18. Solar maximum observatory

    International Nuclear Information System (INIS)

    Rust, D.M.

    1984-01-01

    The successful retrieval and repair of the Solar Maximum Mission (SMM) satellite by Shuttle astronauts in April 1984 permitted continuance of solar flare observations that began in 1980. The SMM carries a soft X ray polychromator, gamma ray, UV and hard X ray imaging spectrometers, a coronagraph/polarimeter and particle counters. The data gathered thus far indicated that electrical potentials of 25 MeV develop in flares within 2 sec of onset. X ray data show that flares are composed of compressed magnetic loops that have come too close together. Other data have been taken on mass ejection, impacts of electron beams and conduction fronts with the chromosphere and changes in the solar radiant flux due to sunspots. 13 references

  19. Introduction to maximum entropy

    International Nuclear Information System (INIS)

    Sivia, D.S.

    1989-01-01

    The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. The author reviews the need for such methods in data analysis and shows, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. He concludes with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution. 12 refs., 8 figs., 1 tab

  20. Functional Maximum Autocorrelation Factors

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Nielsen, Allan Aasbjerg

    2005-01-01

Purpose. We aim at data where samples of an underlying function are observed in a spatial or temporal layout. Examples of underlying functions are reflectance spectra and biological shapes. We apply functional models based on smoothing splines and generalize the functional PCA to functional maximum autocorrelation factors (MAF). We apply the method to biological shapes as well as reflectance spectra. Methods. MAF seeks linear combinations of the original variables that maximize autocorrelation between ... Conclusions. Functional MAF analysis is a useful method for extracting low-dimensional models of temporally or spatially ... MAF outperforms the functional PCA in concentrating the 'interesting' spectra/shape variation in one end of the eigenvalue spectrum and allows for easier interpretation of effects.
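In the classical (non-functional) setting, maximum autocorrelation factors are the solution of a generalized eigenvalue problem involving the covariance of the data and the covariance of its spatial/temporal increments. The rough sketch below, on a synthetic multichannel time series with equally spaced samples, illustrates that computation; it does not reproduce the smoothing-spline functional version of the paper.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)

# Synthetic multichannel signal: a smooth trend shared across channels plus noise.
t = np.linspace(0, 1, 300)
smooth = np.sin(2 * np.pi * t)
X = np.column_stack([smooth + 0.3 * rng.normal(size=t.size) for _ in range(5)])
X = X - X.mean(axis=0)

# Covariance of the data and of its one-step differences.
S = np.cov(X, rowvar=False)
D = np.diff(X, axis=0)
S_d = np.cov(D, rowvar=False)

# MAF directions minimize the difference variance relative to the total variance,
# i.e. solve the generalized eigenproblem S_d w = lambda S w (smallest lambda first).
eigvals, eigvecs = eigh(S_d, S)
maf1 = X @ eigvecs[:, 0]     # first MAF: the most autocorrelated linear combination

print("generalized eigenvalues (small = high autocorrelation):", np.round(eigvals, 3))
corr = np.corrcoef(maf1, smooth)[0, 1]
print(f"|correlation| of first MAF with the underlying smooth trend: {abs(corr):.2f}")
```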

  1. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yunji; Jing, Bing-Yi; Gao, Xin

    2015-01-01

In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing traditional loss functions are sensitive to the noisy and outlying labels of training samples, because the traditional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with traditional loss functions.

  2. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan

    2015-02-12

In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing traditional loss functions are sensitive to the noisy and outlying labels of training samples, because the traditional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with traditional loss functions.
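Correntropy between predicted and true labels is typically expressed with a Gaussian kernel of the prediction error, so that large (outlying) errors contribute little to the objective. The sketch below maximizes a regularized correntropy objective for a linear predictor by plain gradient ascent; it is a generic illustration of the criterion, not the alternating algorithm the abstracts describe.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic binary classification data with a few flipped (noisy) labels.
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = np.sign(X @ w_true)
flip = rng.choice(200, size=20, replace=False)
y[flip] *= -1                                  # label noise / outliers

sigma, lam, lr = 1.0, 0.01, 0.05
w = np.zeros(5)

for _ in range(500):
    err = y - X @ w                            # prediction errors
    kern = np.exp(-err**2 / (2 * sigma**2))    # Gaussian kernel: outliers downweighted
    # Gradient of (mean correntropy - L2 regularization) with respect to w.
    grad = (X * (kern * err / sigma**2)[:, None]).mean(axis=0) - 2 * lam * w
    w += lr * grad

accuracy = np.mean(np.sign(X @ w) == np.sign(X @ w_true))
print(f"agreement with noise-free labels: {accuracy:.1%}")
```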

  3. Evaluation of information-theoretic similarity measures for content-based retrieval and detection of masses in mammograms

    International Nuclear Information System (INIS)

    Tourassi, Georgia D.; Harrawood, Brian; Singh, Swatee; Lo, Joseph Y.; Floyd, Carey E.

    2007-01-01

    The purpose of this study was to evaluate image similarity measures employed in an information-theoretic computer-assisted detection (IT-CAD) scheme. The scheme was developed for content-based retrieval and detection of masses in screening mammograms. The study is aimed toward an interactive clinical paradigm where physicians query the proposed IT-CAD scheme on mammographic locations that are either visually suspicious or indicated as suspicious by other cuing CAD systems. The IT-CAD scheme provides an evidence-based, second opinion for query mammographic locations using a knowledge database of mass and normal cases. In this study, eight entropy-based similarity measures were compared with respect to retrieval precision and detection accuracy using a database of 1820 mammographic regions of interest. The IT-CAD scheme was then validated on a separate database for false positive reduction of progressively more challenging visual cues generated by an existing, in-house mass detection system. The study showed that the image similarity measures fall into one of two categories; one category is better suited to the retrieval of semantically similar cases while the second is more effective with knowledge-based decisions regarding the presence of a true mass in the query location. In addition, the IT-CAD scheme yielded a substantial reduction in false-positive detections while maintaining high detection rate for malignant masses
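One entropy-based similarity measure of the kind evaluated in such schemes is the mutual information between the grey-level distributions of a query region and a stored case, estimated from their joint histogram. The sketch below computes that estimate on two synthetic regions of interest; it is a generic illustration, not the specific measures or mammography database used in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def mutual_information(roi_a, roi_b, bins=32):
    """Mutual information between two equally sized image regions (in bits)."""
    joint, _, _ = np.histogram2d(roi_a.ravel(), roi_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Synthetic query ROI, a "similar" stored case (query plus mild noise), and an unrelated one.
query = rng.normal(size=(64, 64))
similar = query + 0.3 * rng.normal(size=(64, 64))
unrelated = rng.normal(size=(64, 64))

print("MI(query, similar)   =", round(mutual_information(query, similar), 3))
print("MI(query, unrelated) =", round(mutual_information(query, unrelated), 3))
```

A retrieval scheme of this type ranks knowledge-base cases by such a similarity score and bases its second opinion on the labels of the best-matching cases.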

  4. The Information Content of Visible Spectra of Extra Virgin Olive Oil in the Characterization of Its Origin

    International Nuclear Information System (INIS)

    Forina, M.; Boggia, R.; Casale, M.

    2007-01-01

The information content of visible spectra has been evaluated, by means of some selected chemometric techniques, for its ability to trace the geographical origin of extra virgin olive oils coming from several Mediterranean regions. Special attention was paid to extra virgin olive oil produced in West Liguria, a region of Northern Italy which overlooks the Mediterranean Sea and borders France. The peculiar organoleptic features of this niche product earned it the protected designation of origin Riviera Ligure-Riviera dei Fiori. Unfortunately, this expensive oil is often subjected to profitable adulteration, commonly involving the addition of other, cheaper Mediterranean oils. Using suitable transforms, such as profiles and derivatives, the visible spectra of extra virgin olive oils showed very considerable discriminant power with regard to the geographical characterization of the studied samples. In particular, the developed class models for West Liguria oils have 100% sensitivity and specificity. Moreover, even though this paper focuses on West Liguria oil, it is important to emphasize that a similar study, involving such a widespread and time-saving technique, could be developed analogously for all the other Mediterranean regions taken into account, and it could be used in other olive oil characterization problems.

  5. THE INFORMATION AND CONTENT SPECIFICITY OF THE TEACHING THE DISCIPLINE "MODERN UKRAINIAN LANGUAGE OF MEDIA" AT THE JOURNALISM FACULTIES

    Directory of Open Access Journals (Sweden)

    Monakhova T.

    2017-12-01

Full Text Available The article deals with the content planning of the discipline "The Modern Ukrainian Language of Mass Media". Its importance and place in the general educational process of the specialty "Journalism", in accordance with the curriculum of the specialty, are substantiated, and the key directions and problems that require the attention of future journalists are outlined. "The Modern Ukrainian Language of Mass Media" is a course that integrates linguistic and cognitive, communicative, semiotic, and other approaches to considering the functioning of the state language in the media. Such an approach involves consideration of a number of linguistic problems, in particular spelling (the spelling debate in Ukraine, peculiarities of the transliteration of foreign-language names, the types of journalistic text composition, longevity, tricksters, etc.), cognitive problems (for example, the language game in the media, the gender aspects of the language of the media, the problem of hate speech, etc.), as well as communicative approaches, in particular the theory of communicative acts, working with different types of information, fact checking, the linguistic specifics of social networks, etc. The offered academic discipline is simultaneously propaedeutic for further journalistic disciplines, as well as a summary for the "language block" of journalistic courses – the disciplines "The Practical Ukrainian Language", "The Stylistics and Culture of the Ukrainian Language", etc.

  6. DaGO-Fun: tool for Gene Ontology-based functional analysis using term information content measures.

    Science.gov (United States)

    Mazandu, Gaston K; Mulder, Nicola J

    2013-09-25

The use of Gene Ontology (GO) data in protein analyses has largely contributed to the improved outcomes of these analyses. Several GO semantic similarity measures have been proposed in recent years and provide tools that allow the integration of biological knowledge embedded in the GO structure into different biological analyses. There is a need for a unified tool that provides the scientific community with the opportunity to explore these different GO similarity measure approaches and their biological applications. We have developed DaGO-Fun, an online tool available at http://web.cbio.uct.ac.za/ITGOM, which incorporates many different GO similarity measures for exploring, analyzing and comparing GO terms and proteins within the context of GO. It uses GO data and UniProt proteins with their GO annotations as provided by the Gene Ontology Annotation (GOA) project to precompute GO term information content (IC), enabling rapid response to user queries. The DaGO-Fun online tool presents the advantage of integrating all the relevant IC-based GO similarity measures, including topology- and annotation-based approaches to facilitate effective exploration of these measures, thus enabling users to choose the most relevant approach for their application. Furthermore, this tool includes several biological applications related to GO semantic similarity scores, including the retrieval of genes based on their GO annotations, the clustering of functionally related genes within a set, and term enrichment analysis.
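The annotation-based information content of a GO term is conventionally the negative log of the term's annotation frequency, where a protein annotated to a term implicitly counts towards all of that term's ancestors. A minimal sketch over a toy ontology fragment is shown below; the term IDs and annotation counts are invented and do not reproduce the DaGO-Fun precomputed values.

```python
import math

# Toy GO fragment: child -> set of parents (invented IDs for illustration).
parents = {
    "GO:child_a": {"GO:mid"},
    "GO:child_b": {"GO:mid"},
    "GO:mid": {"GO:root"},
    "GO:root": set(),
}

# Direct annotation counts from a hypothetical corpus of annotated proteins.
direct_counts = {"GO:child_a": 5, "GO:child_b": 15, "GO:mid": 10, "GO:root": 70}

def ancestors(term):
    """All ancestors of a term, including itself."""
    seen, stack = {term}, [term]
    while stack:
        for p in parents[stack.pop()]:
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

# Cumulative counts: each annotation also counts for every ancestor term.
cumulative = {t: 0 for t in parents}
for term, n in direct_counts.items():
    for anc in ancestors(term):
        cumulative[anc] += n

total = cumulative["GO:root"]
ic = {t: -math.log(cumulative[t] / total) for t in parents if cumulative[t] > 0}
for term, value in sorted(ic.items(), key=lambda kv: -kv[1]):
    print(f"{term}: IC = {value:.2f}")
```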

  7. 20 CFR 416.919n - Informing the medical source of examination scheduling, report content, and signature requirements.

    Science.gov (United States)

    2010-04-01

    ... scheduling, report content, and signature requirements. 416.919n Section 416.919n Employees' Benefits SOCIAL... medical source of examination scheduling, report content, and signature requirements. The medical sources... report containing all of the elements in paragraph (c). (e) Signature requirements. All consultative...

  8. 20 CFR 404.1519n - Informing the medical source of examination scheduling, report content, and signature requirements.

    Science.gov (United States)

    2010-04-01

    ... scheduling, report content, and signature requirements. 404.1519n Section 404.1519n Employees' Benefits... medical source of examination scheduling, report content, and signature requirements. The medical sources... report containing all of the elements in paragraph (c). (e) Signature requirements. All consultative...

  9. Extreme Maximum Land Surface Temperatures.

    Science.gov (United States)

    Garratt, J. R.

    1992-09-01

There are numerous reports in the literature of observations of land surface temperatures. Some of these, almost all made in situ, reveal maximum values in the 50°-70°C range, with a few, made in desert regions, near 80°C. Consideration of a simplified form of the surface energy balance equation, utilizing likely upper values of absorbed shortwave flux (1000 W m⁻²) and screen air temperature (55°C), suggests that surface temperatures in the vicinity of 90°-100°C may occur for dry, darkish soils of low thermal conductivity (0.1-0.2 W m⁻¹ K⁻¹). Numerical simulations confirm this and suggest that temperature gradients in the first few centimeters of soil may reach 0.5°-1°C mm⁻¹ under these extreme conditions. The study bears upon the intrinsic interest of identifying extreme maximum temperatures and yields interesting information regarding the comfort zone of animals (including man).
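The simplified energy balance argument can be made concrete by solving, for the surface temperature, a balance between absorbed shortwave radiation and the sum of longwave emission, sensible heat loss, and heat conduction into the soil. The sketch below does this by bisection with round illustrative coefficients; the heat-transfer and conduction coefficients are assumptions for demonstration, not the paper's exact formulation or values.

```python
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4

def energy_imbalance(ts, absorbed_sw=1000.0, t_air=328.15,   # 55 C screen temperature
                     emissivity=0.95, sky_lw=350.0,
                     h_sensible=5.0, ground_cond=2.0):
    """Net flux into the surface (W m^-2) at surface temperature ts (K).
    Coefficients for sensible heat and ground conduction are illustrative."""
    lw_out = emissivity * SIGMA * ts**4 - emissivity * sky_lw
    sensible = h_sensible * (ts - t_air)
    ground = ground_cond * (ts - t_air)      # crude stand-in for soil heat flux
    return absorbed_sw - lw_out - sensible - ground

# Bisection for the equilibrium surface temperature.
lo, hi = 300.0, 450.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if energy_imbalance(mid) > 0:
        lo = mid        # surface still gaining energy: true equilibrium is warmer
    else:
        hi = mid

print(f"equilibrium surface temperature ≈ {mid - 273.15:.0f} °C")
```

With these illustrative coefficients the balance settles near the upper end of the 90°-100°C range discussed above; lowering the soil and air coupling terms pushes the equilibrium temperature higher.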

  10. 75 FR 43840 - Inflation Adjustment of the Ordinary Maximum and Aggravated Maximum Civil Monetary Penalties for...

    Science.gov (United States)

    2010-07-27

    ...-17530; Notice No. 2] RIN 2130-ZA03 Inflation Adjustment of the Ordinary Maximum and Aggravated Maximum... remains at $250. These adjustments are required by the Federal Civil Penalties Inflation Adjustment Act [email protected] . SUPPLEMENTARY INFORMATION: The Federal Civil Penalties Inflation Adjustment Act of 1990...

  11. Examining the Extent to Which Select Teacher Preparation Experiences Inform Technology and Engineering Educators’ Teaching of Science Content and Practices

    OpenAIRE

    Love, Tyler Scott

    2015-01-01

    With the recent release of the Next Generation Science Standards (NGSS) (NGSS Lead States, 2014b) science educators were expected to teach engineering content and practices within their curricula. However, technology and engineering (T&E) educators have been expected to teach content and practices from engineering and other disciplines since the release of the Standards for Technological Literacy (ITEA/ITEEA, 2000/2002/2007). Requisite to the preparation of globally competitive...

  12. Face-to-face and electronic communications in maintaining social networks : the influence of geographical and relational distance and of information content

    NARCIS (Netherlands)

    Tillema, Taede; Dijst, Martin; Schwanen, Tim

    Using data collected among 742 respondents, this article aims at gaining greater insight into (i) the interaction between face-to-face (F2F) and electronic contacts, (ii) the influence of information content and relational distance on the communication mode/service choice and (iii) the influence of

  13. Energy contents of frequently ordered restaurant meals and comparison with human energy requirements and US Department of Agriculture database information: a multisite randomized study

    Science.gov (United States)

    BACKGROUND: Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ~50% of US restaurants are individual or small-chain (non-chain) establishments that do not provide nutrition information. OBJECTIVE: To measure the energy content of frequently o...

  14. Effect of electrode contact area on the information content of the recorded electrogastrograms: An analysis based on Rényi entropy and Teager-Kaiser Energy

    Science.gov (United States)

    Alagumariappan, Paramasivam; Krishnamurthy, Kamalanand; Kandiah, Sundravadivelu; Ponnuswamy, Mannar Jawahar

    2017-06-01

Electrogastrograms (EGG) are electrical signals originating from the digestive system, which are closely correlated with its mechanical activity. Electrogastrography is an efficient non-invasive method for examining the physiological and pathological states of the human digestive system. There are several factors, such as fat conductivity, abdominal thickness, and changes in electrode surface area, which affect the quality of the recorded EGG signals. In this work, the effect of variations in the contact area of surface electrodes on the information content of the measured electrogastrograms is analyzed using Rényi entropy and Teager-Kaiser Energy (TKE). Two different circular cutaneous electrodes with approximate contact areas of 201.14 mm² and 283.64 mm² have been adopted, and EGG signals were acquired using the standard three-electrode protocol. Further, the information content of the measured EGG signals was analyzed using the computed values of entropy and energy. Results demonstrate that the information content of the measured EGG signals increases by 6.72% for an increase in the contact area of the surface electrode by 29.09%. Further, it was observed that the average energy increases with increase in the contact surface area. This work appears to be of high clinical significance since the accurate measurement of EGG signals without loss of information content is highly useful for the design of diagnostic assistance tools for automated diagnosis and mass screening of digestive disorders.
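Both quantities used in this analysis are straightforward to compute from a sampled signal: the Rényi entropy of order α from a normalized amplitude histogram, and the discrete Teager-Kaiser energy from three consecutive samples. The sketch below evaluates them on a synthetic slow-wave-like signal; the signal, sampling rate, and the choice α = 2 are illustrative assumptions, not the study's acquisition settings.

```python
import numpy as np

rng = np.random.default_rng(5)

def renyi_entropy(x, alpha=2, bins=64):
    """Rényi entropy of order alpha (alpha != 1) of the amplitude distribution of x."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return np.log(np.sum(p**alpha)) / (1.0 - alpha)

def teager_kaiser_energy(x):
    """Discrete Teager-Kaiser energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    return x[1:-1] ** 2 - x[:-2] * x[2:]

# Synthetic EGG-like slow wave (~3 cycles per minute) sampled at 2 Hz for 5 minutes.
fs, minutes = 2.0, 5
t = np.arange(0, minutes * 60, 1.0 / fs)
signal = np.sin(2 * np.pi * (3.0 / 60.0) * t) + 0.2 * rng.normal(size=t.size)

print("Renyi entropy (alpha=2):", round(renyi_entropy(signal), 3))
print("mean Teager-Kaiser energy:", round(float(teager_kaiser_energy(signal).mean()), 4))
```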

  15. An exploratory study of relative and incremental information content of two non-financial performance measures: Field study evidence on absence frequency and on-time delivery

    NARCIS (Netherlands)

    Wiersma, E.

    2008-01-01

    In this exploratory field study, I test the relative and incremental information content of two non-financial performance measures compared to financial performance measures for future financial performance. The proprietary database used is from the contracts of the managers of 27 responsibility

  16. Role of the Information Professional in the Development and Promotion of Digital Humanities Content for Research, Teaching, and Learning in the Modern Academic Library: An Irish Case Study

    Science.gov (United States)

    Burns, Jane A.

    2016-01-01

    The Internet has been the catalyst for the convergence of many subject areas and online platforms. Information professionals such as Archivists, IT developers and especially Librarians have been impacted in the development and promotion of digital humanities content for research, teaching, and learning in the modern academic library. In this case…

  17. The effects of problem content and scientific background on information search and the assessment and valuation of correlations.

    Science.gov (United States)

    Soffer, Shira; Kareev, Yaakov

    2011-01-01

    The effects of problem contents and one's scientific background on the detection of correlations and the assessment of their strength were studied using a task that required active data search, assessment of the strength of a correlation, and monetary valuation of the correlation's predictive utility. Participants (N = 72) who were trained either in the natural sciences or in the social sciences and humanities explored data sets differing in contents and actual strength of correlation. Data search was consistent across all variables: Participants drew relatively small samples whose relative sizes would favor the detection of a correlation, if one existed. In contrast, the assessment of the correlation strength and the valuation of its predictive utility were strongly related not only to its objective strength, but also to the correspondence between problem contents and one's scientific background: When the two matched, correlations were judged to be stronger and more valuable than when they did not.

  18. Discussion on the correlation between geophysical and remote sensing information. Primary study on information correlation of research content and concept of post-remote sensing application technology for uranium exploration

    International Nuclear Information System (INIS)

    Ye Fawang; Liu Dechang

    2005-01-01

Based on the research content of post-remote sensing application technology for uranium exploration, a preliminary discussion of the correlation between RS information and geophysical information from gravity, aero-magnetic and aero-radioactivity surveys is presented with respect to five aspects: physical meaning, depth of geological rule meaning, time and phase, planar pattern and interaction mechanism. It provides a good starting point for in-depth qualitative and quantitative study of the correlation between RS information from post-remote sensing application technology and other geological information. (authors)

19. The relationship between information content of depreciation and abnormal return and future benefits in manufacturing companies in Tehran Stock Exchange (TSE)

    Directory of Open Access Journals (Sweden)

    Reza Zare

    2013-01-01

Full Text Available In the present study, given the importance of financial statement contents and of non-cash items often ignored by traders, the relation of depreciation content with abnormal share returns and future benefits is examined, so that investors may take these items into account in their related decisions. To obtain the data necessary for the study, 94 companies were selected from the accessible universe over five years (2006-2010). Simple and multivariable regression techniques, together with the Chow and Hausman tests, were used to test the hypotheses, and significance tests of the models were conducted using the 'F' and 'T' statistics. The findings show that there is no relation between depreciation expense and abnormal returns, and that there is a significant positive relation between depreciation expense and future benefits. Key words: abnormal return, information content, depreciation.

  20. Credal Networks under Maximum Entropy

    OpenAIRE

    Lukasiewicz, Thomas

    2013-01-01

    We apply the principle of maximum entropy to select a unique joint probability distribution from the set of all joint probability distributions specified by a credal network. In detail, we start by showing that the unique joint distribution of a Bayesian tree coincides with the maximum entropy model of its conditional distributions. This result, however, does not hold anymore for general Bayesian networks. We thus present a new kind of maximum entropy models, which are computed sequentially. ...

1. Maximum entropy principle for transportation

    International Nuclear Information System (INIS)

    Bilich, F.; Da Silva, R.

    2008-01-01

    In this work we deal with modeling of the transportation phenomenon for use in the transportation planning process and policy-impact studies. The model developed is based on the dependence concept, i.e., the notion that the probability of a trip starting at origin i is dependent on the probability of a trip ending at destination j given that the factors (such as travel time, cost, etc.) which affect travel between origin i and destination j assume some specific values. The derivation of the solution of the model employs the maximum entropy principle combining a priori multinomial distribution with a trip utility concept. This model is utilized to forecast trip distributions under a variety of policy changes and scenarios. The dependence coefficients are obtained from a regression equation where the functional form is derived based on conditional probability and perception of factors from experimental psychology. The dependence coefficients encode all the information that was previously encoded in the form of constraints. In addition, the dependence coefficients encode information that cannot be expressed in the form of constraints for practical reasons, namely, computational tractability. The equivalence between the standard formulation (i.e., objective function with constraints) and the dependence formulation (i.e., without constraints) is demonstrated. The parameters of the dependence-based trip-distribution model are estimated, and the model is also validated using commercial air travel data in the U.S. In addition, policy impact analyses (such as allowance of supersonic flights inside the U.S. and user surcharge at noise-impacted airports) on air travel are performed.
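For the standard doubly constrained case, the maximum entropy trip distribution with a travel-cost deterrence term can be obtained by iterative proportional fitting of the balancing factors. The sketch below implements that textbook gravity-model calculation on invented zone totals and a cost matrix; it is a classical illustration of the maximum entropy formulation, not the dependence-coefficient model proposed in this paper.

```python
import numpy as np

# Invented example: 3 origin zones, 3 destination zones, generalized travel costs.
origins = np.array([400.0, 300.0, 300.0])       # trips produced per zone
destinations = np.array([350.0, 450.0, 200.0])  # trips attracted per zone
cost = np.array([[2.0, 5.0, 8.0],
                 [4.0, 2.0, 6.0],
                 [7.0, 3.0, 2.0]])
beta = 0.4                                       # deterrence parameter (assumed)

deterrence = np.exp(-beta * cost)
A = np.ones(3)                                   # origin balancing factors
B = np.ones(3)                                   # destination balancing factors

# Iterative proportional fitting: alternate updates until both constraint sets are met.
for _ in range(200):
    A = origins / (deterrence @ B)
    B = destinations / (deterrence.T @ A)

trips = np.outer(A, B) * deterrence              # maximum entropy trip matrix

print(np.round(trips, 1))
print("row sums:", np.round(trips.sum(axis=1), 1))   # ≈ origins
print("col sums:", np.round(trips.sum(axis=0), 1))   # ≈ destinations
```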

  2. Campus Health Centers' Lack of Information Regarding Providers: A Content Analysis of Division-I Campus Health Centers' Provider Websites.

    Science.gov (United States)

    Perrault, Evan K

    2018-07-01

    Campus health centers are a convenient, and usually affordable, location for college students to obtain health care. Staffed by licensed and trained professionals, these providers can generally offer similar levels of care that providers at off-campus clinics can deliver. Yet, previous research finds students may forgo this convenient, on-campus option partially because of a lack of knowledge regarding the quality of providers at these campus clinics. This study sought to examine where this information deficit may come from by analyzing campus health centers' online provider information. All Division-I colleges or universities with an on-campus health center, which had information on their websites about their providers (n = 294), had their providers' online information analyzed (n = 2,127 providers). Results revealed that schools commonly offer professional information (e.g., provider specialties, education), but very little about their providers outside of the medical context (e.g., hobbies) that would allow a prospective student patient to more easily relate. While 181 different kinds of credentials were provided next to providers' names (e.g., MD, PA-C, FNP-BC), only nine schools offered information to help students understand what these different credentials meant. Most schools had information about their providers within one-click of the homepage. Recommendations for improving online information about campus health center providers are offered.

  3. Pinterest as a Resource for Health Information on Chronic Obstructive Pulmonary Disease (COPD): A Social Media Content Analysis

    Science.gov (United States)

    Paige, Samantha R.; Stellefson, Michael; Chaney, Beth H.; Alber, Julia M.

    2015-01-01

    Purpose: The purpose of this study was to explore how Pinterest group pinboards are used to communicate health information on chronic obstructive pulmonary disease (COPD). Method: A nonprobability census sampling method retrieved 399 pins from the 10 most followed COPD group pinboards. Pins were coded according to COPD information categories,…

  4. Text de-identification for privacy protection: a study of its impact on clinical text information content.

    Science.gov (United States)

    Meystre, Stéphane M; Ferrández, Óscar; Friedlin, F Jeffrey; South, Brett R; Shen, Shuying; Samore, Matthew H

    2014-08-01

    As more and more electronic clinical information is becoming easier to access for secondary uses such as clinical research, approaches that enable faster and more collaborative research while protecting patient privacy and confidentiality are becoming more important. Clinical text de-identification offers such advantages but is typically a tedious manual process. Automated Natural Language Processing (NLP) methods can alleviate this process, but their impact on subsequent uses of the automatically de-identified clinical narratives has only barely been investigated. In the context of a larger project to develop and investigate automated text de-identification for Veterans Health Administration (VHA) clinical notes, we studied the impact of automated text de-identification on clinical information in a stepwise manner. Our approach started with a high-level assessment of clinical notes informativeness and formatting, and ended with a detailed study of the overlap of select clinical information types and Protected Health Information (PHI). To investigate the informativeness (i.e., document type information, select clinical data types, and interpretation or conclusion) of VHA clinical notes, we used five different existing text de-identification systems. The informativeness was only minimally altered by these systems while formatting was only modified by one system. To examine the impact of de-identification on clinical information extraction, we compared counts of SNOMED-CT concepts found by an open source information extraction application in the original (i.e., not de-identified) version of a corpus of VHA clinical notes, and in the same corpus after de-identification. Only about 1.2-3% less SNOMED-CT concepts were found in de-identified versions of our corpus, and many of these concepts were PHI that was erroneously identified as clinical information. To study this impact in more details and assess how generalizable our findings were, we examined the overlap between

  5. Free Flowing Content

    DEFF Research Database (Denmark)

    Cass, Andrew Knox; Kravchenko, Mariia

    2017-01-01

    Higher education institutions are moving to exploit information and communication technologies by increasing the use of videos both online and in class. This is led, by definition, by 'early adopters' and most of the research into this process reflects this. Increasingly, institutions are making...... strategic decisions to move courses online; however, some teachers involved are not well equipped to transition. The barriers are reported to be time constraints and a lack of familiarity with the technology to make video. Also, there is a fear of the 'presented self' where teachers may initially resent the idea...... do not secure the integrity of the learning. This paper sets out the methods used to assist teachers take the maximum benefit of their existing content as presentation style lectures and utilize them for video recording suitable for both flipped and online classes. A central theme is removing the fear...

  6. Maximum likelihood approach to “informed” Sound Source Localization for Hearing Aid applications

    DEFF Research Database (Denmark)

    Farmani, Mojtaba; Pedersen, Michael Syskind; Tan, Zheng-Hua

    2015-01-01

    Most state-of-the-art Sound Source Localization (SSL) algorithms have been proposed for applications which are "uninformed" about the target sound content; however, utilizing a wireless microphone worn by a target talker enables recent Hearing Aid Systems (HASs) to access an almost noise-free sound signal of the target talker at the HAS via the wireless connection. Therefore, in this paper, we propose a maximum likelihood (ML) approach, which we call MLSSL, to estimate the Direction of Arrival (DoA) of the target signal given access to the target signal content. Compared with other "informed...
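
    The sketch below is not the MLSSL estimator itself, but a minimal "informed" localization example in the same spirit: given a clean reference of the target signal, the delay to each microphone is estimated by cross-correlation and converted to a direction of arrival; the signal, microphone spacing and sample rate are invented.

```python
# Minimal "informed" DoA sketch (not the MLSSL algorithm from the record):
# estimate the time difference of arrival by cross-correlating each microphone
# signal with the clean reference, then convert the delay to an angle.
import numpy as np

fs = 16000.0          # sample rate [Hz], assumed
c = 343.0             # speed of sound [m/s]
d = 0.15              # assumed spacing between the two hearing-aid devices [m]
true_delay = 3        # samples, used only to synthesize the toy data

rng = np.random.default_rng(0)
reference = rng.standard_normal(4000)                       # clean target signal (wireless mic)
mic_left = reference + 0.1 * rng.standard_normal(4000)
mic_right = np.roll(reference, true_delay) + 0.1 * rng.standard_normal(4000)

def delay_samples(x, ref, max_lag=20):
    lags = np.arange(-max_lag, max_lag + 1)
    corr = [np.dot(np.roll(ref, k), x) for k in lags]       # informed cross-correlation
    return lags[int(np.argmax(corr))]

tdoa = (delay_samples(mic_right, reference) - delay_samples(mic_left, reference)) / fs
angle = np.degrees(np.arcsin(np.clip(c * tdoa / d, -1.0, 1.0)))
print(f"estimated DoA: {angle:.1f} degrees")
```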

  7. Comparing Information Assurance Awareness Training for End-Users: A Content Analysis Examination of Air Force and Defense Information Systems Agency User Training Modules

    National Research Council Canada - National Science Library

    Fruge, John W

    2008-01-01

    Today, the threats to information security and assurance are great. While there are many avenues for IT professionals to safeguard against these threats, many times these defenses prove useless against typical system users...

  8. Prostate Cancer Information Available in Health-Care Provider Offices: An Analysis of Content, Readability, and Cultural Sensitivity.

    Science.gov (United States)

    Choi, Seul Ki; Seel, Jessica S; Yelton, Brooks; Steck, Susan E; McCormick, Douglas P; Payne, Johnny; Minter, Anthony; Deutchki, Elizabeth K; Hébert, James R; Friedman, Daniela B

    2018-07-01

    Prostate cancer (PrCA) is the most common cancer affecting men in the United States, and African American men have the highest incidence among men in the United States. Little is known about the PrCA-related educational materials being provided to patients in health-care settings. Content, readability, and cultural sensitivity of materials available in providers' practices in South Carolina were examined. A total of 44 educational materials about PrCA and associated sexual dysfunction was collected from 16 general and specialty practices. The content of the materials was coded, and cultural sensitivity was assessed using the Cultural Sensitivity Assessment Tool. Flesch Reading Ease, Flesch-Kincaid Grade Level, and the Simple Measure of Gobbledygook were used to assess readability. Communication with health-care providers (52.3%), side effects of PrCA treatment (40.9%), sexual dysfunction and its treatment (38.6%), and treatment options (34.1%) were frequently presented. All materials had acceptable cultural sensitivity scores; however, 2.3% and 15.9% of materials demonstrated unacceptable cultural sensitivity regarding format and visual messages, respectively. Readability of the materials varied. More than half of the materials were written above a high-school reading level. PrCA-related materials available in health-care practices may not meet patients' needs regarding content, cultural sensitivity, and readability. A wide range of educational materials that address various aspects of PrCA, including treatment options and side effects, should be presented in plain language and be culturally sensitive.
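
    The readability indices named above follow published formulas; a rough sketch (with a naive vowel-group syllable counter, so the scores are only approximate) is:

```python
# Rough sketch of two readability formulas named above (Flesch Reading Ease and
# Flesch-Kincaid Grade Level). The syllable counter is a crude vowel-group
# heuristic, so treat the output as approximate.
import re

def count_syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences            # words per sentence
    spw = syllables / len(words)            # syllables per word
    flesch_ease = 206.835 - 1.015 * wps - 84.6 * spw
    fk_grade = 0.39 * wps + 11.8 * spw - 15.59
    return flesch_ease, fk_grade

sample = ("Prostate cancer is the most common cancer affecting men in the United States. "
          "Ask your doctor about screening options and possible side effects of treatment.")
ease, grade = readability(sample)
print(f"Flesch Reading Ease: {ease:.1f}, Flesch-Kincaid Grade Level: {grade:.1f}")
```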

  9. Informal roles within eSport teams: a content analysis of the game 'Counter-Strike: Global Offensive'

    OpenAIRE

    Drenthe, Rolf

    2016-01-01

    Informal roles are roles that are not formally prescribed by a group or organization and are established through the interaction that takes place among group members. Previous literature has identified twelve roles within traditional sport; however, to date limited research has been done on role development within competitive computer gaming (eSports). The purpose of the present study was to explore the informal roles within the eSport setting and i...

  10. The relevance of financial information and contents of the new audit report for lending decisions of commercial banks

    Directory of Open Access Journals (Sweden)

    Marina Trpeska

    2017-12-01

    This research study examines the importance of financial information and information contained in the ISA’s New Audit Report effective from 2016 for lenders as capital providers. We base our findings on a survey conducted in September of 2016 with corporate loan officers for medium-large corporate clients in Macedonia. The results show that the annual report of the company, except for the management report and notes to the financial statements, has consistently high importance and usability for respondents’ decision making. Various accounting ratios related to liquidity, financial stability and profitability of the company are considered very important and regularly used in credit analysis. However, respondents were not consistent in their shared perceptions regarding the high importance of projected profits and historical information on operating cash flows. All loan officers gave high importance to the information found in the existing auditor’s report format regardless of the form of expressed opinion. Also, information on key audit matters, additional information on going concern and related auditor’s judgement, and procedures related to fraud risk were considered of high importance. Lenders rated as less important the disclosure of the name of the engagement partner, the auditor’s statement on independence and compliance with ethical requirements, and the level of materiality used in the audit.

  11. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  12. Density estimation by maximum quantum entropy

    International Nuclear Information System (INIS)

    Silver, R.N.; Wallstrom, T.; Martz, H.F.

    1993-01-01

    A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets

  13. Retrieval of ice cloud properties using an optimal estimation algorithm and MODIS infrared observations. Part I: Forward model, error analysis, and information content

    Science.gov (United States)

    Wang, Chenxi; Platnick, Steven; Zhang, Zhibo; Meyer, Kerry; Yang, Ping

    2018-01-01

    An optimal estimation (OE) retrieval method is developed to infer three ice cloud properties simultaneously: optical thickness (τ), effective radius (reff), and cloud-top height (h). This method is based on a fast radiative transfer (RT) model and infrared (IR) observations from the MODerate resolution Imaging Spectroradiometer (MODIS). This study conducts thorough error and information content analyses to understand the error propagation and performance of retrievals from various MODIS band combinations under different cloud/atmosphere states. Specifically, the algorithm takes into account four error sources: measurement uncertainty, fast RT model uncertainty, uncertainties in ancillary datasets (e.g., atmospheric state), and assumed ice crystal habit uncertainties. It is found that the ancillary and ice crystal habit error sources dominate the MODIS IR retrieval uncertainty and cannot be ignored. The information content analysis shows that, for a given ice cloud, the use of four MODIS IR observations is sufficient to retrieve the three cloud properties. However, the selection of MODIS IR bands that provide the most information and their order of importance varies with both the ice cloud properties and the ambient atmospheric and the surface states. As a result, this study suggests the inclusion of all MODIS IR bands in practice since little a priori information is available. PMID:29707470
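
    In the optimal estimation framework used here, the information content of a channel set is commonly summarized by the averaging kernel, its trace (degrees of freedom for signal) and the Shannon information content; the generic numpy sketch below uses a made-up Jacobian and covariances, not the study's forward model.

```python
# Generic optimal-estimation information content sketch (not the study's actual
# forward model): 3 retrieved cloud properties, 4 IR channels, with made-up
# Jacobian K, measurement-error covariance S_e and prior covariance S_a.
import numpy as np

rng = np.random.default_rng(1)
K = rng.normal(size=(4, 3))            # Jacobian d(radiance)/d(state), invented values
S_e = np.diag([0.2, 0.2, 0.3, 0.3])    # measurement + forward-model error covariance
S_a = np.diag([4.0, 1.0, 2.0])         # a priori covariance of (tau, r_eff, h)

S_e_inv = np.linalg.inv(S_e)
S_hat = np.linalg.inv(K.T @ S_e_inv @ K + np.linalg.inv(S_a))   # posterior covariance
A = S_hat @ K.T @ S_e_inv @ K                                   # averaging kernel

dfs = np.trace(A)                                               # degrees of freedom for signal
shannon = 0.5 * np.log(np.linalg.det(S_a) / np.linalg.det(S_hat))  # in nats
print(f"DFS = {dfs:.2f} (out of 3), Shannon information content = {shannon:.2f} nats")
```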

  14. Retrieval of Ice Cloud Properties Using an Optimal Estimation Algorithm and MODIS Infrared Observations. Part I: Forward Model, Error Analysis, and Information Content

    Science.gov (United States)

    Wang, Chenxi; Platnick, Steven; Zhang, Zhibo; Meyer, Kerry; Yang, Ping

    2016-01-01

    An optimal estimation (OE) retrieval method is developed to infer three ice cloud properties simultaneously: optical thickness (tau), effective radius (r(sub eff)), and cloud top height (h). This method is based on a fast radiative transfer (RT) model and infrared (IR) observations from the MODerate resolution Imaging Spectroradiometer (MODIS). This study conducts thorough error and information content analyses to understand the error propagation and performance of retrievals from various MODIS band combinations under different cloud/atmosphere states. Specifically, the algorithm takes into account four error sources: measurement uncertainty, fast RT model uncertainty, uncertainties in ancillary data sets (e.g., atmospheric state), and assumed ice crystal habit uncertainties. It is found that the ancillary and ice crystal habit error sources dominate the MODIS IR retrieval uncertainty and cannot be ignored. The information content analysis shows that for a given ice cloud, the use of four MODIS IR observations is sufficient to retrieve the three cloud properties. However, the selection of MODIS IR bands that provide the most information and their order of importance varies with both the ice cloud properties and the ambient atmospheric and the surface states. As a result, this study suggests the inclusion of all MODIS IR bands in practice since little a priori information is available.

  15. Improvement of Soil Moisture Retrieval from Hyperspectral VNIR-SWIR Data Using Clay Content Information: From Laboratory to Field Experiments

    Directory of Open Access Journals (Sweden)

    Rosa Oltra-Carrió

    2015-03-01

    The aim of this work is to study the constraints and performance of SMC retrieval methodologies in the VNIR (Visible-Near InfraRed) and SWIR (ShortWave InfraRed) regions (from 0.4 to 2.5 µm) when passing from controlled laboratory conditions to field conditions. Five different signal-processing approaches found in the literature were considered. Four local criteria are spectral indices (WISOIL, NSMI, NINSOL and NINSON). These indices are the ratios between the spectral reflectances acquired at two specific wavelengths to characterize moisture content in soil. The last criterion is based on the convex hull concept and is a global method, based on the analysis of the full spectral signature of the soil. The database was composed of 464 and 9 spectra, respectively, measured over bare soils in the laboratory and in situ. For each measurement, SMC and texture were well known and the database was divided into two parts dedicated to the calibration and validation steps. The calibration part was used to define the empirical relation between SMC and the SMC retrieval approaches, with coefficients of determination (R2) between 0.72 and 0.92. A clay content (CC) dependence was detected for the NINSOL and NINSON indices. Consequently, two new criteria were proposed taking into account the CC contribution (NINSOLCC and NINSONCC). The well-marked regression between SMC and the global/local indices, and the interest of using the CC, were confirmed during the validation step using laboratory data (R² above 0.76 and root mean square errors below 8.3% m3∙m−3 in all cases) and using in-situ data, where the WISOIL, NINSOLCC and NINSONCC criteria stand out among the NSMI and CH.

  16. "Wow! Look at That!": Discourse as a Means to Improve Teachers' Science Content Learning in Informal Science Institutions

    Science.gov (United States)

    Holliday, Gary M.; Lederman, Judith S.; Lederman, Norman G.

    2014-01-01

    Currently, it is not clear whether professional development staff at Informal Science Institutions (ISIs) are considering the way exhibits contribute to the social aspects of learning as described by the contextual model of learning (CML) (Falk & Dierking in "The museum experience." Whalesback, Washington, 1992; "Learning from…

  17. Promoting health (implicitly)? A longitudinal content analysis of implicit health information in cigarette advertising, 1954-2003.

    Science.gov (United States)

    Paek, Hye-Jin; Reid, Leonard N; Choi, Hojoon; Jeong, Hyun Ju

    2010-10-01

    Tobacco studies indicate that health-related information in cigarette advertising leads consumers to underestimate the detrimental health effects of smoking and contributes to their smoking-related perceptions, beliefs, and attitudes. This study examined the frequencies and kinds of implicit health information in cigarette advertising across five distinct smoking eras covering the years 1954-2003. Analysis of 1,135 cigarette advertisements collected through multistage probability sampling of three popular consumer magazines found that the level of implicit health information (i.e., "light" cigarette, cigarette pack color, verbal and visual health cues, cigarette portrayals, and human model-cigarette interaction) in post-Master Settlement Agreement [MSA] era ads is similar to the level in ads from early smoking eras. Specifically, "light" cigarettes were frequently promoted, and presence of light colors in cigarette packs seemed dominant after the probroadcast ban era. Impressionistic verbal health cues (e.g., soft, mild, and refreshing) appeared more frequently in post-MSA era ads than in pre-MSA era ads. Most notably, a majority of the cigarette ads portrayed models smoking, lighting, or offering a cigarette to others. The potential impact of implicit health information is discussed in the contexts of social cognition and Social Cognitive Theory. Policy implications regarding our findings are also detailed.

  18. Technical-Oriented Enterprise Resource Planning (ERP) Body of Knowledge for Information Systems Programs: Content and Implementation

    Science.gov (United States)

    Boyle, Todd A.

    2007-01-01

    In this article, the author proposes a body of knowledge that the educators can use to incorporate the technical aspects of enterprise resource planning (ERP) into an information systems (IS) program, encapsulated as the ERP technical knowledge framework. To illustrate the application of this framework, the author discusses a course sequence that…

  19. Cloud information content analysis of multi-angular measurements in the oxygen A-band: application to 3MI and MSPI

    Science.gov (United States)

    Merlin, G.; Riedi, J.; Labonnote, L. C.; Cornet, C.; Davis, A. B.; Dubuisson, P.; Desmons, M.; Ferlay, N.; Parol, F.

    2015-12-01

    The vertical distribution of cloud cover has a significant impact on a large number of meteorological and climatic processes. Cloud top altitude and cloud geometrical thickness are therefore essential parameters. Previous studies established the possibility of retrieving those parameters from multi-angular oxygen A-band measurements. Here we study and compare the performance of two future instruments. The 3MI (Multi-angle, Multi-channel and Multi-polarization Imager) instrument developed by EUMETSAT, which is an extension of the POLDER/PARASOL instrument, and MSPI (Multi-angles Spectro-Polarimetric Imager) developed by NASA's Jet Propulsion Laboratory will measure total and polarized light reflected by the Earth's atmosphere-surface system in several spectral bands (from UV to SWIR) and several viewing geometries. Those instruments should provide opportunities to observe the links between the cloud structures and the anisotropy of the reflected solar radiation into space. Specific algorithms will need to be developed in order to take advantage of the new capabilities of these instruments. However, prior to this effort, we need to understand, through a theoretical Shannon information content analysis, the limits and advantages of these new instruments for retrieving liquid and ice cloud properties, and especially, in this study, the amount of information coming from the A-Band channel on the cloud top altitude (CTOP) and geometrical thickness (CGT). We compare the information content of 3MI A-Band in two configurations and that of MSPI. Quantitative information content estimates show that the retrieval of CTOP with a high accuracy is possible in almost all cases investigated. The retrieval of CGT seems less easy but possible for optically thick clouds above a black surface, at least when CGT > 1-2 km.

  20. The problem of the content of the recognition of the testimony of the informers towards the concept of just cause for the regular exercise of the criminal action

    Directory of Open Access Journals (Sweden)

    Walter Barbosa Bittar

    2017-03-01

    This article analyzes the conceptual problems regarding the legal nature of plea bargaining and of just cause, seeking to establish limits on the recognition of the content of the version presented by informers, with particular attention to its probative value in the Brazilian legal system, and asking whether that content can, or cannot, be understood as an accepted just cause, i.e., as a legitimate requirement for bringing the criminal action. To fulfill this objective, we sought to establish existence and validity requirements for the start of a valid criminal prosecution, highlighting the criminal policy aspects that end up influencing the conclusions about the contours inherent to the object of analysis of the present study.

  1. Understanding of Technical Terms and Contents of Informed Consent Forms for Sedative Gastrointestinal Endoscopy Procedures

    Directory of Open Access Journals (Sweden)

    Ihnsook Jeong, RN, PhD

    2013-03-01

    Conclusion: The understanding of the terms and knowledge about the procedures were disappointing. Therefore, sufficient explanations should be provided to the patients. While informed consent is obtained by doctors, the level of understanding should be monitored by nurses. In particular, subjects who did not have any previous experience with endoscopy procedures showed a relatively lower level of understanding. We recommend that medical terms be replaced with more common, nontechnical words in consent forms.

  2. Digital Content Strategies

    OpenAIRE

    Halbheer, Daniel; Stahl, Florian; Koenigsberg, Oded; Lehmann, Donald R

    2013-01-01

    This paper studies content strategies for online publishers of digital information goods. It examines sampling strategies and compares their performance to paid content and free content strategies. A sampling strategy, where some of the content is offered for free and consumers are charged for access to the rest, is known as a "metered model" in the newspaper industry. We analyze optimal decisions concerning the size of the sample and the price of the paid content when sampling serves the dua...

  3. An algorithm for hyperspectral remote sensing of aerosols: 2. Information content analysis for aerosol parameters and principal components of surface spectra

    Science.gov (United States)

    Hou, Weizhen; Wang, Jun; Xu, Xiaoguang; Reid, Jeffrey S.

    2017-05-01

    This paper describes the second part of a series of investigations to develop algorithms for simultaneous retrieval of aerosol parameters and surface reflectance from future hyperspectral and geostationary satellite sensors such as Tropospheric Emissions: Monitoring of POllution (TEMPO). The information content in these hyperspectral measurements is analyzed for 6 principal components (PCs) of surface spectra and a total of 14 aerosol parameters that describe the columnar aerosol volume Vtotal, fine-mode aerosol volume fraction, and the size distribution and wavelength-dependent index of refraction in both coarse and fine mode aerosols. Forward simulations of atmospheric radiative transfer are conducted for 5 surface types (green vegetation, bare soil, rangeland, concrete and mixed surface case) and a wide range of aerosol mixtures. It is shown that the PCs of surface spectra in the atmospheric window channel could be derived from the top-of-the-atmosphere reflectance under conditions of low aerosol optical depth (AOD ≤ 0.2 at 550 nm), with a relative error of 1%. With degrees-of-freedom-for-signal analysis and the sequential forward selection method, the common bands for different aerosol mixture types and surface types can be selected for aerosol retrieval. The first 20% of our selected bands accounts for more than 90% of the information content for aerosols, and only 4 PCs are needed to reconstruct surface reflectance. However, the information content in these common bands from each individual TEMPO observation is insufficient for the simultaneous retrieval of the surface's PC weight coefficients and multiple aerosol parameters (other than Vtotal). In contrast, with multiple observations for the same location from TEMPO on multiple consecutive days, 1-3 additional aerosol parameters could be retrieved. Consequently, a self-adjustable aerosol retrieval algorithm to account for surface types, AOD conditions, and multiple-consecutive observations is recommended to derive
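
    A minimal sketch of sequential forward band selection driven by degrees of freedom for signal, in the spirit of the analysis described above, follows; the Jacobian, noise variances and prior are random stand-ins rather than TEMPO values.

```python
# Sketch of sequential forward band selection driven by degrees of freedom for
# signal (DFS). The Jacobian and covariances are random stand-ins, not TEMPO data.
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_params = 60, 14
K = rng.normal(size=(n_bands, n_params))          # per-band Jacobian, invented
noise_var = np.full(n_bands, 0.25)                # per-band measurement error variance
S_a_inv = np.diag(np.full(n_params, 0.25))        # prior precision matrix, invented

def dfs(band_idx):
    Ks = K[band_idx, :]
    Se_inv = np.diag(1.0 / noise_var[band_idx])
    S_hat = np.linalg.inv(Ks.T @ Se_inv @ Ks + S_a_inv)
    return np.trace(S_hat @ Ks.T @ Se_inv @ Ks)   # degrees of freedom for signal

selected, remaining = [], list(range(n_bands))
for _ in range(10):                               # greedily pick the 10 most informative bands
    best = max(remaining, key=lambda b: dfs(selected + [b]))
    selected.append(best)
    remaining.remove(best)
    print(f"band {best:2d} selected, cumulative DFS = {dfs(selected):.2f}")
```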

  4. Do Health Promotion Messages Integrate Unintended Pregnancy and STI Prevention? A Content Analysis of Online Information for Adolescents and Young Adults.

    Science.gov (United States)

    Steiner, Riley J; Rasberry, Catherine N; Sales, Jessica M; Gaydos, Laura M; Pazol, Karen; Kramer, Michael; Swartzendruber, Andrea

    2018-04-20

    Recently there have been calls to strengthen integration of unintended pregnancy and sexually transmitted infection (STI) prevention messages, spurred by increasing use of long-acting reversible contraception. To assess the extent to which public health/clinical messages about unintended pregnancy prevention also address STI prevention, we conducted a content analysis of web-based health promotion information for young people. Websites identified through a systematic Google search were eligible for inclusion if they were operated by a United States-based organization with a mission related to public health/clinical services and the URL included: 1) original content; 2) about sexual and reproductive health; 3) explicitly for adolescents and/or young adults. Using defined protocols, URLs were screened and content was selected and analyzed thematically. Many of the 32 eligible websites presented information about pregnancy and STI prevention separately. Concurrent discussion of the two topics was often limited to statements about (1) strategies that can prevent both outcomes (abstinence, condoms only, condoms plus moderate or highly effective contraceptive methods) and (2) contraceptive methods that confer no STI protection. We also identified framing of condom use with moderate or highly effective contraceptive method for back-up pregnancy prevention but not STI prevention. STI prevention methods in addition to condoms, such as STI/HIV testing, vaccination, or pre-exposure or post-exposure prophylaxis, were typically not addressed with pregnancy prevention information. There may be missed opportunities for promoting STI prevention online in the context of increasing awareness of and access to a full range of contraceptive methods. Strengthening messages that integrate pregnancy and STI prevention may include: describing STI prevention strategies when noting that birth control methods do not prevent STIs; promoting a full complement of STI prevention strategies; and

  5. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    Science.gov (United States)

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze their collective frequency differences between cases and controls shift the current variant-by-variant analysis paradigm for GWAS of common variants to the collective test of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistics for testing association of the entire allele frequency spectrum of genomic variation with the diseases. To evaluate the performance of the proposed statistics, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T(2), collapsing method, multivariate and collapsing (CMC) method, individual χ(2) test, weighted-sum statistic, and variable threshold statistic. Finally, we apply the seven statistics to published resequencing dataset from ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.
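
    For reference, the simplest of the group tests being compared (the collapsing method) pools rare variants into a single carrier indicator and tests a 2x2 table; the sketch below uses simulated genotypes, not the 1000 Genomes or Dallas Heart Study data.

```python
# Toy sketch of the simple collapsing group test mentioned above (carrier of any
# rare variant vs. none, cases vs. controls). Genotypes are simulated.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(42)
n_cases, n_controls, n_rare_variants = 500, 500, 30
maf_controls = 0.005                      # rare-variant allele frequency in controls
maf_cases = 0.012                         # enriched in cases (simulated signal)

cases = rng.binomial(2, maf_cases, size=(n_cases, n_rare_variants))
controls = rng.binomial(2, maf_controls, size=(n_controls, n_rare_variants))

carrier_cases = int(np.sum(cases.sum(axis=1) > 0))        # carries >= 1 rare allele
carrier_controls = int(np.sum(controls.sum(axis=1) > 0))

table = [[carrier_cases, n_cases - carrier_cases],
         [carrier_controls, n_controls - carrier_controls]]
chi2, p, _, _ = chi2_contingency(table)
print(f"collapsing test: chi2 = {chi2:.2f}, p = {p:.3g}")
```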

  6. Last Glacial Maximum Salinity Reconstruction

    Science.gov (United States)

    Homola, K.; Spivack, A. J.

    2016-12-01

    It has been previously demonstrated that salinity can be reconstructed from sediment porewater. The goal of our study is to reconstruct high precision salinity during the Last Glacial Maximum (LGM). Salinity is usually determined at high precision via conductivity, which requires a larger volume of water than can be extracted from a sediment core, or via chloride titration, which yields lower than ideal precision. It has been demonstrated for water column samples that high precision density measurements can be used to determine salinity at the precision of a conductivity measurement using the equation of state of seawater. However, water column seawater has a relatively constant composition, in contrast to porewater, where variations from standard seawater composition occur. These deviations, which affect the equation of state, must be corrected for through precise measurements of each ion's concentration and knowledge of apparent partial molar density in seawater. We have developed a density-based method for determining porewater salinity that requires only 5 mL of sample, achieving density precisions of 10^-6 g/mL. We have applied this method to porewater samples extracted from long cores collected along a N-S transect across the western North Atlantic (R/V Knorr cruise KN223). Density was determined to a precision of 2.3 × 10^-6 g/mL, which translates to a salinity uncertainty of 0.002 g/kg if the effect of differences in composition is well constrained. Concentrations of anions (Cl^- and SO4^2-) and cations (Na^+, Mg^2+, Ca^2+, and K^+) were measured. To correct salinities at the precision required to unravel LGM Meridional Overturning Circulation, our ion precisions must be better than 0.1% for SO4^2-/Cl^- and Mg^2+/Na^+, and 0.4% for Ca^2+/Na^+ and K^+/Na^+. Alkalinity, pH and Dissolved Inorganic Carbon of the porewater were determined to precisions better than 4% when ratioed to Cl^-, and used to calculate HCO3^- and CO3^2-. Apparent partial molar densities in seawater were
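
    A bare-bones version of the density-to-salinity step can be sketched by inverting the TEOS-10 equation of state with the gsw package and a root finder; this ignores the porewater composition corrections discussed above, and the temperature, pressure and measured density are invented example values.

```python
# Sketch of recovering salinity from a high-precision density measurement by
# inverting the TEOS-10 equation of state (gsw package). This ignores the
# porewater composition corrections discussed in the record; temperature,
# pressure and the measured density are invented example values.
import gsw
from scipy.optimize import brentq

temperature_C = 20.0      # Conservative Temperature of the sample [deg C], assumed
pressure_dbar = 0.0       # measurement at atmospheric pressure
rho_measured = 1024.7630  # measured density [kg m^-3], example value

def density_misfit(abs_salinity_g_kg):
    # gsw.rho expects Absolute Salinity [g/kg], Conservative Temperature [deg C], pressure [dbar]
    return gsw.rho(abs_salinity_g_kg, temperature_C, pressure_dbar) - rho_measured

salinity = brentq(density_misfit, 0.0, 45.0)   # bracket spans fresh water to hypersaline
print(f"inferred Absolute Salinity: {salinity:.4f} g/kg")
```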

  7. Information content in spectral dependencies of optical unit volume parameters under action of He-Ne laser on blood

    Science.gov (United States)

    Khairullina, Alphiya Y.; Oleinik, Tatiana V.

    1995-01-01

    Our previous work on methods for studying blood and the action of low-intensity laser radiation on blood and erythrocyte suspensions showed that light-scattering methods give a large body of information on the medium studied, owing to the methodological relationship between the irradiation processes and the investigation techniques. Detailed analysis of the spectral diffuse reflectivities and transmissivities of optically thick blood layers, and of the spectral absorptivities calculated from them over 600 - 900 nm using different approximations, for a pathological state owing to hypoxia, testifies to the optical significance not only of hemoglobin derivatives but also of products of hemoglobin decomposition. Laser action on blood is specific and related to the initial absorption state of the blood, owing to the differing composition of chromoproteids. This work gives the interpretation of the spectral observations. Analysis of the spectral dependencies of the extinction coefficient ε, the mean cosine μ of the phase function, and the parameter Q = ε(1 − μ)H/λ (H is the hematocrit) testifies to a decrease in the relative refractive index of erythrocytes and to morphological changes during laser action under pathology owing to hypoxia. The possibility of obtaining physical and chemical information on the state of blood under laser action in vivo is shown to be based on the method proposed by us for calculating multilayered structures modeling human organs and on the technical implementation of this method.

  8. The Coordination Dynamics of Observational Learning: Relative Motion Direction and Relative Phase as Informational Content Linking Action-Perception to Action-Production.

    Science.gov (United States)

    Buchanan, John J

    2016-01-01

    The primary goal of this chapter is to merge together the visual perception perspective of observational learning and the coordination dynamics theory of pattern formation in perception and action. Emphasis is placed on identifying movement features that constrain and inform action-perception and action-production processes. Two sources of visual information are examined, relative motion direction and relative phase. The visual perception perspective states that the topological features of relative motion between limbs and joints remains invariant across an actor's motion and therefore are available for pickup by an observer. Relative phase has been put forth as an informational variable that links perception to action within the coordination dynamics theory. A primary assumption of the coordination dynamics approach is that environmental information is meaningful only in terms of the behavior it modifies. Across a series of single limb tasks and bimanual tasks it is shown that the relative motion and relative phase between limbs and joints is picked up through visual processes and supports observational learning of motor skills. Moreover, internal estimations of motor skill proficiency and competency are linked to the informational content found in relative motion and relative phase. Thus, the chapter links action to perception and vice versa and also links cognitive evaluations to the coordination dynamics that support action-perception and action-production processes.

  9. Energy Contents of Frequently Ordered Restaurant Meals and Comparison with Human Energy Requirements and US Department of Agriculture Database Information: A Multisite Randomized Study

    Science.gov (United States)

    Urban, Lorien E.; Weber, Judith L.; Heyman, Melvin B.; Schichtl, Rachel L.; Verstraete, Sofia; Lowery, Nina S.; Das, Sai Krupa; Schleicher, Molly M.; Rogers, Gail; Economos, Christina; Masters, William A.; Roberts, Susan B.

    2017-01-01

    Background: Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ~50% of US restaurants are individual or small-chain (non–chain) establishments that do not provide nutrition information. Objective: To measure the energy content of frequently ordered meals in non–chain restaurants in three US locations, and compare with the energy content of meals from large-chain restaurants, energy requirements, and food database information. Design: A multisite random-sampling protocol was used to measure the energy contents of the most frequently ordered meals from the most popular cuisines in non–chain restaurants, together with equivalent meals from large-chain restaurants. Setting: Meals were obtained from restaurants in San Francisco, CA; Boston, MA; and Little Rock, AR, between 2011 and 2014. Main outcome measures: Meal energy content determined by bomb calorimetry. Statistical analysis performed: Regional and cuisine differences were assessed using a mixed model with restaurant nested within region×cuisine as the random factor. Paired t tests were used to evaluate differences between non–chain and chain meals, human energy requirements, and food database values. Results: Meals from non–chain restaurants contained 1,205±465 kcal/meal, amounts that were not significantly different from equivalent meals from large-chain restaurants (+5.1%; P=0.41). There was a significant effect of cuisine on non–chain meal energy, and three of the four most popular cuisines (American, Italian, and Chinese) had the highest mean energy (1,495 kcal/meal). Ninety-two percent of meals exceeded typical energy requirements for a single eating occasion. Conclusions: Non–chain restaurants lacking nutrition information serve amounts of energy that are typically far in excess of human energy requirements for single eating occasions, and are equivalent to amounts served by the large-chain restaurants that have previously been criticized
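
    The paired comparison reported above can be sketched with scipy's paired t test; the matched meal energies below are invented illustration values, not the study's measurements.

```python
# Sketch of the paired comparison reported above: matched non-chain vs. chain
# meals of the same cuisine, tested with a paired t test. Energies (kcal/meal)
# are invented illustration values, not the study's measurements.
import numpy as np
from scipy.stats import ttest_rel

non_chain = np.array([1310, 980, 1550, 1195, 1420, 870, 1605, 1240])
chain     = np.array([1255, 1040, 1480, 1210, 1390, 905, 1530, 1285])

t_stat, p_value = ttest_rel(non_chain, chain)
diff_pct = 100 * (non_chain.mean() - chain.mean()) / chain.mean()
print(f"mean difference = {diff_pct:+.1f}%, t = {t_stat:.2f}, p = {p_value:.2f}")
```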

  10. Energy Contents of Frequently Ordered Restaurant Meals and Comparison with Human Energy Requirements and U.S. Department of Agriculture Database Information: A Multisite Randomized Study.

    Science.gov (United States)

    Urban, Lorien E; Weber, Judith L; Heyman, Melvin B; Schichtl, Rachel L; Verstraete, Sofia; Lowery, Nina S; Das, Sai Krupa; Schleicher, Molly M; Rogers, Gail; Economos, Christina; Masters, William A; Roberts, Susan B

    2016-04-01

    Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ∼50% of US restaurants are individual or small-chain (non-chain) establishments that do not provide nutrition information. To measure the energy content of frequently ordered meals in non-chain restaurants in three US locations, and compare with the energy content of meals from large-chain restaurants, energy requirements, and food database information. A multisite random-sampling protocol was used to measure the energy contents of the most frequently ordered meals from the most popular cuisines in non-chain restaurants, together with equivalent meals from large-chain restaurants. Meals were obtained from restaurants in San Francisco, CA; Boston, MA; and Little Rock, AR, between 2011 and 2014. Meal energy content determined by bomb calorimetry. Regional and cuisine differences were assessed using a mixed model with restaurant nested within region×cuisine as the random factor. Paired t tests were used to evaluate differences between non-chain and chain meals, human energy requirements, and food database values. Meals from non-chain restaurants contained 1,205±465 kcal/meal, amounts that were not significantly different from equivalent meals from large-chain restaurants (+5.1%; P=0.41). There was a significant effect of cuisine on non-chain meal energy, and three of the four most popular cuisines (American, Italian, and Chinese) had the highest mean energy (1,495 kcal/meal). Ninety-two percent of meals exceeded typical energy requirements for a single eating occasion. Non-chain restaurants lacking nutrition information serve amounts of energy that are typically far in excess of human energy requirements for single eating occasions, and are equivalent to amounts served by the large-chain restaurants that have previously been criticized for providing excess energy. Restaurants in general, rather than specific categories of restaurant, expose patrons to

  11. Maximum stellar iron core mass

    Indian Academy of Sciences (India)

    Vol. 60, No. 3, March 2003, pp. 415–422. F. W. Giacobbe, Chicago Research Center/American Air Liquide. Only fragments of the abstract survived extraction: they concern iron core compression due to the weight of non-ferrous matter overlying the iron cores within large ..., and note that thermal equilibrium velocities will tend to be non-relativistic.

  12. Maximum entropy beam diagnostic tomography

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1985-01-01

    This paper reviews the formalism of maximum entropy beam diagnostic tomography as applied to the Fusion Materials Irradiation Test (FMIT) prototype accelerator. The same formalism has also been used with streak camera data to produce an ultrahigh speed movie of the beam profile of the Experimental Test Accelerator (ETA) at Livermore. 11 refs., 4 figs

  13. Maximum entropy beam diagnostic tomography

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1985-01-01

    This paper reviews the formalism of maximum entropy beam diagnostic tomography as applied to the Fusion Materials Irradiation Test (FMIT) prototype accelerator. The same formalism has also been used with streak camera data to produce an ultrahigh speed movie of the beam profile of the Experimental Test Accelerator (ETA) at Livermore

  14. A portable storage maximum thermometer

    International Nuclear Information System (INIS)

    Fayart, Gerard.

    1976-01-01

    A clinical thermometer storing the voltage corresponding to the maximum temperature in an analog memory is described. The end of the measurement is indicated by a lamp switching off. The measurement time is shortened by means of a low thermal inertia platinum probe. This portable thermometer is fitted with a cell-test and calibration system [fr

  15. [Conception and Content Validation of a Questionnaire Relating to the Potential Need for Information of Visually Impaired Persons with Regard to Services and Contact Persons].

    Science.gov (United States)

    Hahn, U; Hechler, T; Witt, U; Krummenauer, F

    2015-12-01

    A questionnaire was drafted to identify the needs of visually impaired persons and to optimize their access to non-medical support and services. Subjects had to rate a list of 15 everyday activities that are typically affected by visual impairment (for example, being able to orient themselves in the home environment), by indicating the degree to which they perceive each activity to be affected, using a four-stage scale. They had to evaluate these aspects by means of a relevance assessment. The needs profile derived from this is then correlated with individualized information for assistance and support. The questionnaire shall be made available for use by subjects through advisers in some ophthalmic practices and via the internet. The validity of the content of the proposed tool was evaluated on the basis of a survey of 59 experts in the fields of medical, optical and psychological care and of persons involved in training initiatives. The experts were asked to rate the activities by relevance and clarity of the wording and to propose methods to further develop and optimize the content. The validity of the content was quantified according to a process adopted in the literature, based on the parameters Interrater Agreement (IRA) and Content Validity Index (CVI). The results of all responses (n = 19) and the sub-group analysis suggest that the questionnaire adequately reflects the potential needs profile of visually impaired persons. Overall, there was at least 80% agreement among the 19 experts for 93% of the proposed parameterisation of the activities relating to the relevance and clarity of the wording. Individual proposals for optimization of the design of the questionnaire were adopted.
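
    The Content Validity Index and interrater agreement used here are conventionally computed from expert relevance ratings on a four-point scale; a small sketch with an invented rating matrix (not the 19 expert responses) is:

```python
# Sketch of item-level Content Validity Index (I-CVI) and percent interrater
# agreement from expert ratings on a 4-point relevance scale (1-2 = not relevant,
# 3-4 = relevant). The rating matrix is invented, not the study's expert data.
import numpy as np

# rows = experts, columns = questionnaire items (activities)
ratings = np.array([
    [4, 3, 4, 2, 4],
    [3, 4, 4, 3, 3],
    [4, 4, 3, 2, 4],
    [4, 3, 4, 3, 4],
    [3, 4, 4, 2, 3],
])

relevant = ratings >= 3
i_cvi = relevant.mean(axis=0)              # proportion of experts rating each item relevant
s_cvi_ave = i_cvi.mean()                   # scale-level CVI, averaging method
share_agreed = (i_cvi >= 0.8).mean()       # share of items reaching 80% agreement
print("I-CVI per item:", np.round(i_cvi, 2))
print(f"S-CVI/Ave = {s_cvi_ave:.2f}, items with >=80% agreement: {share_agreed:.0%}")
```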

  16. SNPs selected by information content outperform randomly selected microsatellite loci for delineating genetic identification and introgression in the endangered dark European honeybee (Apis mellifera mellifera).

    Science.gov (United States)

    Muñoz, Irene; Henriques, Dora; Jara, Laura; Johnston, J Spencer; Chávez-Galarza, Julio; De La Rúa, Pilar; Pinto, M Alice

    2017-07-01

    The honeybee (Apis mellifera) has been threatened by multiple factors including pests and pathogens, pesticides and loss of locally adapted gene complexes due to replacement and introgression. In western Europe, the genetic integrity of the native A. m. mellifera (M-lineage) is endangered due to trading and intensive queen breeding with commercial subspecies of eastern European ancestry (C-lineage). Effective conservation actions require reliable molecular tools to identify pure-bred A. m. mellifera colonies. Microsatellites have been preferred for identification of A. m. mellifera stocks across conservation centres. However, owing to high throughput, easy transferability between laboratories and low genotyping error, SNPs promise to become popular. Here, we compared the resolving power of a widely utilized microsatellite set to detect structure and introgression with that of different sets that combine a variable number of SNPs selected for their information content and genomic proximity to the microsatellite loci. Contrary to every SNP data set, microsatellites did not discriminate between the two lineages in the PCA space. Mean introgression proportions were identical across the two marker types, although at the individual level, microsatellites' performance was relatively poor at the upper range of Q-values, a result reflected by their lower precision. Our results suggest that SNPs are more accurate and powerful than microsatellites for identification of A. m. mellifera colonies, especially when they are selected by information content.
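
    One common way to select SNPs "by information content" for two reference pools is Rosenberg's informativeness for assignment (I_n); the sketch below ranks a handful of biallelic SNPs this way, with invented allele frequencies, and the paper's exact criterion may differ.

```python
# Sketch of ranking biallelic SNPs by information content for distinguishing two
# lineages, using Rosenberg's informativeness for assignment (I_n) as one common
# choice (the paper's exact criterion may differ). Allele frequencies are invented.
import numpy as np

def informativeness(p_m, p_c):
    """I_n for a biallelic SNP given allele frequencies in M- and C-lineage pools."""
    eps = 1e-12
    total = 0.0
    for q_m, q_c in [(p_m, p_c), (1 - p_m, 1 - p_c)]:      # both alleles
        q_bar = (q_m + q_c) / 2.0
        total += (-q_bar * np.log(q_bar + eps)
                  + 0.5 * (q_m * np.log(q_m + eps) + q_c * np.log(q_c + eps)))
    return total

# hypothetical A. m. mellifera (M) vs. C-lineage reference frequencies for 5 SNPs
freq_m = np.array([0.95, 0.50, 0.10, 0.80, 0.30])
freq_c = np.array([0.05, 0.55, 0.90, 0.35, 0.33])

scores = np.array([informativeness(m, c) for m, c in zip(freq_m, freq_c)])
ranking = np.argsort(scores)[::-1]
print("SNPs ranked by I_n (most informative first):", ranking, np.round(scores[ranking], 3))
```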

  17. Neutron spectra unfolding with maximum entropy and maximum likelihood

    International Nuclear Information System (INIS)

    Itoh, Shikoh; Tsunoda, Toshiharu

    1989-01-01

    A new unfolding theory has been established on the basis of the maximum entropy principle and the maximum likelihood method. This theory correctly embodies the Poisson statistics of neutron detection, and always yields a positive solution over the whole energy range. Moreover, the theory unifies the overdetermined and underdetermined problems. For the latter, the ambiguity in assigning a prior probability, i.e. the initial guess in the Bayesian sense, is removed by virtue of the principle. An approximate expression of the covariance matrix for the resultant spectra is also presented. An efficient algorithm to solve the nonlinear system which appears in the present study has been established. Results of computer simulation showed the effectiveness of the present theory. (author)
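
    Not the specific theory established in this record, but a generic maximum-likelihood EM unfolding sketch for Poisson-distributed counts shows why multiplicative updates keep the solution positive over the whole energy range; the response matrix and spectrum are invented.

```python
# Generic maximum-likelihood EM unfolding sketch for Poisson-distributed counts
# (not the specific theory of the record). The multiplicative update keeps the
# spectrum non-negative. Response matrix and true spectrum are invented.
import numpy as np

rng = np.random.default_rng(7)
n_channels, n_bins = 12, 8
R = np.abs(rng.normal(1.0, 0.3, size=(n_channels, n_bins)))   # detector response, invented
true_spectrum = np.linspace(5.0, 1.0, n_bins)                 # arbitrary decreasing spectrum
counts = rng.poisson(R @ true_spectrum)                       # simulated measured counts

phi = np.ones(n_bins)                       # positive initial guess (the Bayesian "prior")
for _ in range(500):                        # multiplicative EM updates keep phi >= 0
    predicted = R @ phi
    phi *= (R.T @ (counts / np.maximum(predicted, 1e-12))) / R.sum(axis=0)

print("unfolded spectrum:", np.round(phi, 2))
print("true spectrum:    ", np.round(true_spectrum, 2))
```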

  18. Testosterone replacement therapy and the internet: an assessment of providers' health-related web site information content.

    Science.gov (United States)

    Oberlin, Daniel T; Masson, Puneet; Brannigan, Robert E

    2015-04-01

    To compare how providers of testosterone replacement therapy (TRT) in large metropolitan cities promote androgen replacement on their patient-oriented Web sites. TRT provider Web sites were identified using Google search and the terms "Testosterone replacement" and the name of the 5 most populous US cities. These Web sites were assessed for (1) type or specialty of medical provider, (2) discussion of the benefits and risks of TRT, and (3) industry affiliations. In total, 75 Web sites were evaluated. Twenty-seven of the 75 clinics (36%) were directed by nonphysicians, 35 (47%) were overseen by nonurology or nonendocrine physicians, and only 13 (17%) were specialist managed. Fourteen of 75 (18.6%) Web sites disclosed industry relationships. Ninety-five percent of Web sites promoted the benefits of TRT including improved sex drive, cognitive improvement, increased muscle strength, and/or improved energy. Only 20 of 75 Web sites (26.6%) described any side effect of TRT. Web sites directed by specialists were twice as likely to discuss risks of TRT compared with nonspecialist providers (41% vs 20%; odds ratio = 2.77; P <.01). Nine of 75 (12%) of all Web sites actually refuted that TRT was associated with significant side effects. Urologists and endocrinologists are in the minority of providers promoting TRT on the Internet. Specialists are more likely to discuss risks associated with TRT although the majority of surveyed Web sites that promote TRT do not mention treatment risks. There is substantial variability in quality and quantity of information on provider Web sites, which may contribute to misinformation regarding this prevalent health issue.

  19. Facilitating medical information search using Google Glass connected to a content-based medical image retrieval system.

    Science.gov (United States)

    Widmer, Antoine; Schaer, Roger; Markonis, Dimitrios; Muller, Henning

    2014-01-01

    Wearable computing devices are starting to change the way users interact with computers and the Internet. Among them, Google Glass includes a small screen located in front of the right eye, a camera filming in front of the user and a small computing unit. Google Glass has the advantage of providing online services while leaving the user's hands free to perform tasks. These augmented glasses enable many useful applications, including in the medical domain. For example, Google Glass can easily provide a video conference between medical doctors to discuss a live case. Using these glasses can also facilitate medical information search by allowing access to a large number of annotated medical cases during a consultation, in a fashion that is non-disruptive for medical staff. In this paper, we developed a Google Glass application able to take a photo and send it to a medical image retrieval system along with keywords in order to retrieve similar cases. As a preliminary assessment of the usability of the application, we tested the application under three conditions (images of the skin; printed CT scans and MRI images; and CT and MRI images acquired directly from an LCD screen) to explore whether using Google Glass affects the accuracy of the results returned by the medical image retrieval system. The preliminary results show that despite minor problems due to the relative stability of the Google Glass, images can be sent to and processed by the medical image retrieval system and similar images are returned to the user, potentially helping in the decision making process.

  20. Communications between volunteers and health researchers during recruitment and informed consent: qualitative content analysis of email interactions.

    Science.gov (United States)

    Townsend, Anne; Amarsi, Zubin; Backman, Catherine L; Cox, Susan M; Li, Linda C

    2011-10-13

    While use of the Internet is increasingly widespread in research, little is known about the role of routine electronic mail (email) correspondence during recruitment and early volunteer-researcher interactions. To gain insight into the standpoint of volunteers we analyzed email communications in an early rheumatoid arthritis qualitative interview study. The objectives of our study were (1) to understand the perspectives and motivations of individuals who volunteered for an interview study about the experiences of early rheumatoid arthritis, and (2) to investigate the role of emails in volunteer-researcher interactions during recruitment. Between December 2007 and December 2008 we recruited 38 individuals with early rheumatoid arthritis through rheumatologist and family physician offices, arthritis Internet sites, and the Arthritis Research Centre of Canada for a (face-to-face) qualitative interview study. Interested individuals were invited to contact us via email or telephone. In this paper, we report on email communications from 12 of 29 volunteers who used email as their primary communication mode. Emails offered insights into the perspective of study volunteers. They provided evidence prospectively about recruitment and informed consent in the context of early rheumatoid arthritis. First, some individuals anticipated that participating would have mutual benefits, for themselves and the research, suggesting a reciprocal quality to volunteering. Second, volunteering for the study was strongly motivated by a need to access health services and was both a help-seeking and self-managing strategy. Third, volunteers expressed ambivalence around participation, such as how far participating would benefit them, versus more general benefits for research. Fourth, practical difficulties of negotiating symptom impact, medical appointments, and research tasks were revealed. We also reflect on how emails documented volunteer-researcher interactions, illustrating typically

  1. Using qualitative methods to inform the trade-off between content validity and consistency in utility assessment: the example of type 2 diabetes and Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Gargon Elizabeth

    2010-02-01

    Background: Key stakeholders regard generic utility instruments as suitable tools to inform health technology assessment decision-making regarding allocation of resources across competing interventions. These instruments require a 'descriptor', a 'valuation' and a 'perspective' of the economic evaluation. There are various approaches that can be taken for each of these, creating a potential lack of consistency between instruments (a basic requirement for comparisons across diseases). The 'reference method' has been proposed as a way to address the limitations of the Quality-Adjusted Life Year (QALY). However, the degree to which generic measures can assess patients' specific experiences with their disease would remain unresolved. This has been neglected in the discussions on methods development, and its impact on the QALY values obtained and the resulting cost per QALY estimate underestimated. This study explored the content of utility instruments relevant to type 2 diabetes and Alzheimer's disease (AD) as examples, and the role of qualitative research in informing the trade-off between content coverage and consistency. Method: A literature review was performed to identify qualitative and quantitative studies regarding patients' experiences with type 2 diabetes or AD, and associated treatments. Conceptual models for each indication were developed. Generic and disease-specific instruments were mapped to the conceptual models. Results: Findings showed that published descriptions of relevant concepts important to patients with type 2 diabetes or AD are available for consideration in deciding on the most comprehensive approach to utility assessment. While the 15-dimensional health related quality of life measure (15D) seemed the most comprehensive measure for both diseases, the Health Utilities Index 3 (HUI 3) seemed to have the least coverage for type 2 diabetes and the EuroQol-5 Dimensions (EQ-5D) for AD. Furthermore, some of the utility instruments

  2. On Maximum Entropy and Inference

    Directory of Open Access Journals (Sweden)

    Luigi Gresele

    2017-11-01

    Maximum entropy is a powerful concept that entails a sharp separation between relevant and irrelevant variables. It is typically invoked in inference, once an assumption is made on what the relevant variables are, in order to estimate a model from data that affords predictions on all other (dependent) variables. Conversely, maximum entropy can be invoked to retrieve the relevant variables (sufficient statistics) directly from the data, once a model is identified by Bayesian model selection. We explore this approach in the case of spin models with interactions of arbitrary order, and we discuss how relevant interactions can be inferred. In this perspective, the dimensionality of the inference problem is not set by the number of parameters in the model, but by the frequency distribution of the data. We illustrate the method by showing its ability to recover the correct model in a few prototype cases and discuss its application to a real dataset.

  3. Maximum Water Hammer Sensitivity Analysis

    OpenAIRE

    Jalil Emadi; Abbas Solemani

    2011-01-01

    Pressure waves and water hammer occur in a pumping system when valves are closed or opened suddenly, or in the case of sudden pump failure. Determining the maximum water hammer is one of the most important technical and economic issues that engineers and designers of pumping stations and conveyance pipelines must address. Hammer Software is a recent application used to simulate water hammer. The present study focuses on determining the significance of ...

  4. Maximum Gene-Support Tree

    Directory of Open Access Journals (Sweden)

    Yunfeng Shan

    2008-01-01

    Full Text Available Genomes and genes diversify during evolution; however, it is unclear to what extent genes still retain the relationship among species. Model species for molecular phylogenetic studies include yeasts and viruses whose genomes were sequenced as well as plants that have fossil-supported true phylogenetic trees available. In this study, we generated single gene trees of seven yeast species as well as single gene trees of nine baculovirus species using all the orthologous genes among the species compared. Homologous genes among seven known plants were used for validation of the finding. Four algorithms—maximum parsimony (MP), minimum evolution (ME), maximum likelihood (ML), and neighbor-joining (NJ)—were used. Trees were reconstructed before and after weighting the DNA and protein sequence lengths among genes. Rarely can a single gene generate the “true tree” under all four algorithms. However, the most frequent gene tree, termed the “maximum gene-support tree” (MGS tree, or WMGS tree for the weighted one), in yeasts, baculoviruses, or plants was consistently found to be the “true tree” among the species. The results provide insights into the overall degree of divergence of orthologous genes of the genomes analyzed and suggest the following: (1) the true tree relationship among the species studied is still maintained by the largest group of orthologous genes; (2) there are usually more orthologous genes with higher similarities between genetically closer species than between genetically more distant ones; and (3) the maximum gene-support tree reflects the phylogenetic relationship among the species compared.
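
    The core counting step behind a maximum gene-support tree is simple to sketch: tally the topology produced for each gene and keep the most frequent one. The gene trees below are invented placeholders, and real use would compare topologies with a tree-aware equality test (e.g., zero Robinson-Foulds distance) rather than raw string comparison.

      from collections import Counter

      # Hypothetical single-gene tree topologies (one Newick string per orthologous
      # gene, e.g., as produced by MP, ME, ML or NJ reconstructions).
      gene_trees = [
          "((A,B),(C,D));",
          "((A,C),(B,D));",
          "((A,B),(C,D));",
          "((A,B),(C,D));",
          "((A,D),(B,C));",
      ]

      counts = Counter(gene_trees)
      mgs_tree, support = counts.most_common(1)[0]
      print(f"maximum gene-support tree: {mgs_tree} "
            f"(supported by {support} of {len(gene_trees)} genes)")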

  5. Maximum organic carbon limits at different melter feed rates (U)

    International Nuclear Information System (INIS)

    Choi, A.S.

    1995-01-01

    This report documents the results of a study to assess the impact of varying melter feed rates on the maximum total organic carbon (TOC) limits allowable in the DWPF melter feed. Topics discussed include: carbon content; feed rate; feed composition; melter vapor space temperature; combustion and dilution air; off-gas surges; earlier work on maximum TOC; overview of models; and the results of the work completed

  6. LCLS Maximum Credible Beam Power

    International Nuclear Information System (INIS)

    Clendenin, J.

    2005-01-01

    The maximum credible beam power is defined as the highest credible average beam power that the accelerator can deliver to the point in question, given the laws of physics, the beam line design, and assuming all protection devices have failed. For a new accelerator project, the official maximum credible beam power is determined by project staff in consultation with the Radiation Physics Department, after examining the arguments and evidence presented by the appropriate accelerator physicist(s) and beam line engineers. The definitive parameter becomes part of the project's safety envelope. This technical note will first review the studies that were done for the Gun Test Facility (GTF) at SSRL, where a photoinjector similar to the one proposed for the LCLS is being tested. In Section 3 the maximum charge out of the gun for a single rf pulse is calculated. In Section 4, PARMELA simulations are used to track the beam from the gun to the end of the photoinjector. Finally in Section 5 the beam through the matching section and injected into Linac-1 is discussed

  7. A qualitative content analysis of global health engagements in Peacekeeping and Stability Operations Institute's stability operations lessons learned and information management system.

    Science.gov (United States)

    Nang, Roberto N; Monahan, Felicia; Diehl, Glendon B; French, Daniel

    2015-04-01

    Many institutions collect reports in databases to make important lessons-learned available to their members. The Uniformed Services University of the Health Sciences collaborated with the Peacekeeping and Stability Operations Institute to conduct a descriptive and qualitative analysis of global health engagements (GHEs) contained in the Stability Operations Lessons Learned and Information Management System (SOLLIMS). This study used a summative qualitative content analysis approach involving six steps: (1) a comprehensive search; (2) two-stage reading and screening process to identify first-hand, health-related records; (3) qualitative and quantitative data analysis using MAXQDA, a software program; (4) a word cloud to illustrate word frequencies and interrelationships; (5) coding of individual themes and validation of the coding scheme; and (6) identification of relationships in the data and overarching lessons-learned. The individual codes with the most number of text segments coded included: planning, personnel, interorganizational coordination, communication/information sharing, and resources/supplies. When compared to the Department of Defense's (DoD's) evolving GHE principles and capabilities, the SOLLIMS coding scheme appeared to align well with the list of GHE capabilities developed by the Department of Defense Global Health Working Group. The results of this study will inform practitioners of global health and encourage additional qualitative analysis of other lessons-learned databases. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  8. Two Studies on Twitter Networks and Tweet Content in Relation to Amyotrophic Lateral Sclerosis (ALS): Conversation, Information, and 'Diary of a Daily Life'.

    Science.gov (United States)

    Hemsley, Bronwyn; Palmer, Stuart

    2016-01-01

    To date, there is no research examining how adults with Amyotrophic Lateral Sclerosis (ALS) or Motor Neurone Disease (MND) and severe communication disability use Twitter, nor the use of Twitter in relation to ALS/MND beyond its use for fundraising and raising awareness. In this paper we (a) outline a rationale for the use of Twitter as a method of communication and information exchange for adults with ALS/MND, (b) detail multiple qualitative and quantitative methods used to analyse Twitter networks and tweet content in our studies, and (c) present the results of two studies designed to provide insights on the use of Twitter by an adult with ALS/MND and by #ALS and #MND hashtag communities in Twitter. We also discuss findings across the studies, implications for health service providers in Twitter, and directions for future Twitter research in relation to ALS/MND.

  9. Generic maximum likely scale selection

    DEFF Research Database (Denmark)

    Pedersen, Kim Steenstrup; Loog, Marco; Markussen, Bo

    2007-01-01

    The fundamental problem of local scale selection is addressed by means of a novel principle, which is based on maximum likelihood estimation. The principle is generally applicable to a broad variety of image models and descriptors, and provides a generic scale estimation methodology. The focus in this work is on applying this selection principle under a Brownian image model. This image model provides a simple scale-invariant prior for natural images, and we provide illustrative examples of the behavior of our scale estimation on such images. In these illustrative examples, estimation is based...

  10. Maximum entropy decomposition of quadrupole mass spectra

    International Nuclear Information System (INIS)

    Toussaint, U. von; Dose, V.; Golan, A.

    2004-01-01

    We present an information-theoretic method called generalized maximum entropy (GME) for decomposing mass spectra of gas mixtures from noisy measurements. In this GME approach to the noisy, underdetermined inverse problem, the joint entropies of concentration, cracking, and noise probabilities are maximized subject to the measured data. This provides a robust estimation for the unknown cracking patterns and the concentrations of the contributing molecules. The method is applied to mass spectroscopic data of hydrocarbons, and the estimates are compared with those received from a Bayesian approach. We show that the GME method is efficient and is computationally fast
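
    A heavily simplified, entropy-regularized version of such a decomposition can be sketched as below; this is only an illustrative stand-in for GME (which maximizes the joint entropies of concentration, cracking and noise probabilities), and the cracking-pattern matrix, noise level and regularization weight are assumed values.

      import numpy as np
      from scipy.optimize import minimize

      # Assumed cracking-pattern matrix: columns = molecules, rows = mass channels.
      A = np.array([[0.7, 0.1],
                    [0.2, 0.6],
                    [0.1, 0.3]])
      true_c = np.array([0.6, 0.4])
      rng = np.random.default_rng(1)
      b = A @ true_c + rng.normal(0.0, 0.01, size=3)     # noisy measured spectrum

      def objective(c, alpha=1e-3):
          misfit = 0.5 * np.sum((A @ c - b) ** 2)        # fidelity to the measured data
          entropy = -np.sum(c * np.log(c))               # Shannon entropy of concentrations
          return misfit - alpha * entropy                # prefer high-entropy solutions

      res = minimize(objective, x0=np.full(2, 0.5), bounds=[(1e-9, None)] * 2)
      print("estimated concentrations:", np.round(res.x, 3))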

  11. System for memorizing maximum values

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1992-08-01

    The invention discloses a system capable of memorizing maximum sensed values. The system includes conditioning circuitry which receives the analog output signal from a sensor transducer. The conditioning circuitry rectifies and filters the analog signal and provides an input signal to a digital driver, which may be either linear or logarithmic. The driver converts the analog signal to discrete digital values, which in turn triggers an output signal on one of a plurality of driver output lines n. The particular output lines selected is dependent on the converted digital value. A microfuse memory device connects across the driver output lines, with n segments. Each segment is associated with one driver output line, and includes a microfuse that is blown when a signal appears on the associated driver output line.
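
    A software analogue of the peak-memorizing behaviour (purely illustrative; the record describes analog conditioning circuitry and a microfuse memory, not code) is a rectify-and-hold loop over incoming sensor readings:

      class PeakHold:
          """Retain the largest rectified sensor reading seen so far."""

          def __init__(self):
              self.peak = 0.0

          def update(self, reading: float) -> float:
              self.peak = max(self.peak, abs(reading))   # rectify, then hold the maximum
              return self.peak

      ph = PeakHold()
      for sample in [0.2, -1.7, 0.9, 1.1]:
          ph.update(sample)
      print(ph.peak)   # 1.7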

  12. Remarks on the maximum luminosity

    Science.gov (United States)

    Cardoso, Vitor; Ikeda, Taishi; Moore, Christopher J.; Yoo, Chul-Moon

    2018-04-01

    The quest for fundamental limitations on physical processes is old and venerable. Here, we investigate the maximum possible power, or luminosity, that any event can produce. We show, via full nonlinear simulations of Einstein's equations, that there exist initial conditions which give rise to arbitrarily large luminosities. However, the requirement that there is no past horizon in the spacetime seems to limit the luminosity to below the Planck value, LP=c5/G . Numerical relativity simulations of critical collapse yield the largest luminosities observed to date, ≈ 0.2 LP . We also present an analytic solution to the Einstein equations which seems to give an unboundedly large luminosity; this will guide future numerical efforts to investigate super-Planckian luminosities.
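
    The Planck value quoted here is straightforward to evaluate numerically; a quick check using CODATA constants (values taken from scipy.constants) gives roughly 3.6 x 10^52 W:

      from scipy.constants import c, G

      L_P = c**5 / G                        # Planck luminosity
      print(f"L_P = c^5/G = {L_P:.2e} W")   # ~3.63e+52 W
      print(f"0.2 L_P = {0.2 * L_P:.1e} W   (largest luminosity seen in critical-collapse runs)")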

  13. Scintillation counter, maximum gamma aspect

    International Nuclear Information System (INIS)

    Thumim, A.D.

    1975-01-01

    A scintillation counter, particularly for counting gamma ray photons, includes a massive lead radiation shield surrounding a sample-receiving zone. The shield is disassembleable into a plurality of segments to allow facile installation and removal of a photomultiplier tube assembly, the segments being so constructed as to prevent straight-line access of external radiation through the shield into radiation-responsive areas. Provisions are made for accurately aligning the photomultiplier tube with respect to one or more sample-transmitting bores extending through the shield to the sample receiving zone. A sample elevator, used in transporting samples into the zone, is designed to provide a maximum gamma-receiving aspect to maximize the gamma detecting efficiency. (U.S.)

  14. Information

    International Nuclear Information System (INIS)

    Boyard, Pierre.

    1981-01-01

    The fear of nuclear energy, and more particularly of radioactive waste, is analyzed in its sociological context. Everybody agrees on the need for information; information is available, but there is a problem with its dissemination. Reactions of the public are analyzed, and journalists, scientists and teachers have a role to play [fr]

  15. Anti-nutrient components of guinea grass ( Panicum maximum ...

    African Journals Online (AJOL)

    Yomi

    2012-01-31

    Jan 31, 2012 ... A true measure of forage quality is animal ... The anti-nutritional contents of a pasture could be ... nutrient factors in P. maximum; (2) assess the effect of nitrogen ..... 3. http://www.clemson.edu/Fairfield/local/news/quality.

  16. The LIS Blogosphere Contains Tags that Can Be Categorized and It Disseminates Professional Content. A Review of: Aharony, N. (2009). Librarians and information scientists in the blogosphere: An exploratory analysis. Library & Information Science Research, 31(3), 174-181.

    Directory of Open Access Journals (Sweden)

    Virginia Wilson

    2010-03-01

    Full Text Available Objective – This study analyzes library and information studies (LIS) oriented blogs to determine the content, and looks at tags and folksonomies of these blogs to determine whether they form a consistent, coherent scheme or whether they are lacking in internal logic. Design – A qualitative content analysis of tags assigned to 30 LIS blogs. Setting – The research took place on the internet from May to July, 2008. Subjects – Thirty LIS blogs were examined, each of which was written by a librarian or an information scientist. Methods – The researcher reviewed 100 blogs that were found by browsing the Top 25 Librarian Bloggers as published by the Online Education Database in 2007 and by searching Technorati, one of the main search engines for blogs, using the term "library and information science." Thirty blogs were chosen for analysis based on two criteria: the blog had to be written by a librarian or an information scientist, and the blog had to be active during the period studied (May-July, 2008). A content analysis was undertaken on the tags assigned to the 30 blogs by categorizing the tags that appeared as tag clouds (visual representations of user-generated tags in which the tags used more frequently are depicted in a larger, bolder font) in Technorati. In order to validate the Technorati tags, the researcher's coders read and analyzed all the blog posts over the given time period. The categorization consists of five major categories, each with several subcategories. The categories were developed using a clustering approach, with new categories coming into being when a tag did not fit into an already established category. Main Results – The tag categorization resulted in five broad categories, each with several sub-categories (a few of which are listed here): 1. General (Nouns, Disciplines, Place Names); 2. Library-related (Web 2.0, Librarians' Activities, Catalogues); 3. Technology-related (Products, Technology Types, People); 4...

  17. Maximum entropy and Bayesian methods

    International Nuclear Information System (INIS)

    Smith, C.R.; Erickson, G.J.; Neudorfer, P.O.

    1992-01-01

    Bayesian probability theory and Maximum Entropy methods are at the core of a new view of scientific inference. These 'new' ideas, along with the revolution in computational methods afforded by modern computers allow astronomers, electrical engineers, image processors of any type, NMR chemists and physicists, and anyone at all who has to deal with incomplete and noisy data, to take advantage of methods that, in the past, have been applied only in some areas of theoretical physics. The title workshops have been the focus of a group of researchers from many different fields, and this diversity is evident in this book. There are tutorial and theoretical papers, and applications in a very wide variety of fields. Almost any instance of dealing with incomplete and noisy data can be usefully treated by these methods, and many areas of theoretical research are being enhanced by the thoughtful application of Bayes' theorem. Contributions contained in this volume present a state-of-the-art overview that will be influential and useful for many years to come

  18. Naturalising Representational Content

    Science.gov (United States)

    Shea, Nicholas

    2014-01-01

    This paper sets out a view about the explanatory role of representational content and advocates one approach to naturalising content – to giving a naturalistic account of what makes an entity a representation and in virtue of what it has the content it does. It argues for pluralism about the metaphysics of content and suggests that a good strategy is to ask the content question with respect to a variety of predictively successful information processing models in experimental psychology and cognitive neuroscience; and hence that data from psychology and cognitive neuroscience should play a greater role in theorising about the nature of content. Finally, the contours of the view are illustrated by drawing out and defending a surprising consequence: that individuation of vehicles of content is partly externalist. PMID:24563661

  19. Print advertising: vivid content

    NARCIS (Netherlands)

    Fennis, B.M.; Das, E.; Fransen, M.L.

    2012-01-01

    The present research examines the effects of vivid ad content in two types of appeal in print ads as a function of individual differences in chronically experienced vividness of visual imagery. For informational ads for a functional product, vivid ad content strongly affected individuals high in

  1. Informe

    Directory of Open Access Journals (Sweden)

    Egon Lichetenberger

    1950-10-01

    Full Text Available Report by Dr. Egon Lichetenberger to the Board of Directors of the Faculty on the specialization course in Pathological Anatomy sponsored by the Kellogg Foundation (Department of Pathology)

  2. Rumor Identification with Maximum Entropy in MicroNet

    Directory of Open Access Journals (Sweden)

    Suisheng Yu

    2017-01-01

    Full Text Available The widely used applications of Microblog, WeChat, and other social networking platforms (that we call MicroNet) shorten the period of information dissemination and expand the range of information dissemination, which allows rumors to cause greater harm and have more influence. A hot topic in the information dissemination field is how to identify and block rumors. Based on the maximum entropy model, this paper constructs the recognition mechanism of rumor information in the micronetwork environment. First, based on the information entropy theory, we obtained the characteristics of rumor information using the maximum entropy model. Next, we optimized the original classifier training set and the feature function to divide the information into rumors and nonrumors. Finally, the experimental simulation results show that the rumor identification results using this method are better than the original classifier and other related classification methods.
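
    A minimal sketch of a maximum-entropy rumor classifier in this spirit: multinomial logistic regression over bag-of-words features is mathematically a maximum-entropy classifier. The toy posts, labels and feature choices below are assumptions for illustration, not the paper's feature functions or data.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Hypothetical labelled microblog posts (1 = rumor, 0 = non-rumor).
      posts = [
          "shocking!! forward this before it gets deleted",
          "official statement released by the city government",
          "unverified source claims the bridge has collapsed",
          "weather bureau issues routine rainfall forecast",
      ]
      labels = [1, 0, 1, 0]

      # Logistic regression over word/bigram counts = a maximum-entropy classifier.
      clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                          LogisticRegression(max_iter=1000))
      clf.fit(posts, labels)
      print(clf.predict_proba(["forward this unverified claim now"])[0])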

  3. Maximum Parsimony on Phylogenetic networks

    Science.gov (United States)

    2012-01-01

    Background Phylogenetic networks are generalizations of phylogenetic trees, that are used to model evolutionary events in various contexts. Several different methods and criteria have been introduced for reconstructing phylogenetic trees. Maximum Parsimony is a character-based approach that infers a phylogenetic tree by minimizing the total number of evolutionary steps required to explain a given set of data assigned on the leaves. Exact solutions for optimizing parsimony scores on phylogenetic trees have been introduced in the past. Results In this paper, we define the parsimony score on networks as the sum of the substitution costs along all the edges of the network; and show that certain well-known algorithms that calculate the optimum parsimony score on trees, such as Sankoff and Fitch algorithms extend naturally for networks, barring conflicting assignments at the reticulate vertices. We provide heuristics for finding the optimum parsimony scores on networks. Our algorithms can be applied for any cost matrix that may contain unequal substitution costs of transforming between different characters along different edges of the network. We analyzed this for experimental data on 10 leaves or fewer with at most 2 reticulations and found that for almost all networks, the bounds returned by the heuristics matched with the exhaustively determined optimum parsimony scores. Conclusion The parsimony score we define here does not directly reflect the cost of the best tree in the network that displays the evolution of the character. However, when searching for the most parsimonious network that describes a collection of characters, it becomes necessary to add additional cost considerations to prefer simpler structures, such as trees over networks. The parsimony score on a network that we describe here takes into account the substitution costs along the additional edges incident on each reticulate vertex, in addition to the substitution costs along the other edges which are
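
    For the tree case handled by the Fitch algorithm mentioned above, the parsimony count for a single character can be sketched compactly (uniform substitution costs; the network extension with reticulate vertices is not shown):

      # Fitch's algorithm on a rooted binary tree for one character.
      def fitch(node, states):
          """node: leaf name (str) or (left, right) tuple; states: dict leaf -> state.
          Returns (candidate state set, substitution count)."""
          if isinstance(node, str):
              return {states[node]}, 0
          left, right = node
          s1, c1 = fitch(left, states)
          s2, c2 = fitch(right, states)
          common = s1 & s2
          if common:
              return common, c1 + c2
          return s1 | s2, c1 + c2 + 1        # empty intersection forces one substitution

      tree = (("A", "B"), ("C", "D"))
      states = {"A": "G", "B": "G", "C": "T", "D": "G"}
      print(fitch(tree, states))             # ({'G'}, 1)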

  4. The constraint rule of the maximum entropy principle

    NARCIS (Netherlands)

    Uffink, J.

    1995-01-01

    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability
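
    A standard worked example of such a constraint is the "Brandeis dice" problem: given only the mean of a die roll, the maximum-entropy distribution is exponential in the constrained quantity, with the Lagrange multiplier chosen to reproduce that mean. The target mean of 4.5 below is the usual textbook choice, assumed here for illustration.

      import numpy as np
      from scipy.optimize import brentq

      faces = np.arange(1, 7)
      target_mean = 4.5                  # the only information available

      def mean_given(lam):
          w = np.exp(lam * faces)
          p = w / w.sum()                # maxent form: p_i proportional to exp(lam * x_i)
          return p @ faces

      lam = brentq(lambda l: mean_given(l) - target_mean, -10.0, 10.0)
      p = np.exp(lam * faces)
      p /= p.sum()
      print(np.round(p, 4), p @ faces)   # maximum-entropy distribution with mean 4.5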

  5. METHODS OF CONTENTS CURATOR

    Directory of Open Access Journals (Sweden)

    V. Kukharenko

    2013-03-01

    Full Text Available Content curation is a new activity (started in 2008) in which qualified network users process large amounts of information and present it to other social network users. To prepare content curators, a 7-week distance course was developed which examines the functions, methods and tools of the curator. The course showed that learning success depends significantly on the availability of an advanced personal learning environment and on the ability to process and analyze information.

  6. Cosmic shear measurement with maximum likelihood and maximum a posteriori inference

    Science.gov (United States)

    Hall, Alex; Taylor, Andy

    2017-06-01

    We investigate the problem of noise bias in maximum likelihood and maximum a posteriori estimators for cosmic shear. We derive the leading and next-to-leading order biases and compute them in the context of galaxy ellipticity measurements, extending previous work on maximum likelihood inference for weak lensing. We show that a large part of the bias on these point estimators can be removed using information already contained in the likelihood when a galaxy model is specified, without the need for external calibration. We test these bias-corrected estimators on simulated galaxy images similar to those expected from planned space-based weak lensing surveys, with promising results. We find that the introduction of an intrinsic shape prior can help with mitigation of noise bias, such that the maximum a posteriori estimate can be made less biased than the maximum likelihood estimate. Second-order terms offer a check on the convergence of the estimators, but are largely subdominant. We show how biases propagate to shear estimates, demonstrating in our simple set-up that shear biases can be reduced by orders of magnitude and potentially to within the requirements of planned space-based surveys at mild signal-to-noise ratio. We find that second-order terms can exhibit significant cancellations at low signal-to-noise ratio when Gaussian noise is assumed, which has implications for inferring the performance of shear-measurement algorithms from simplified simulations. We discuss the viability of our point estimators as tools for lensing inference, arguing that they allow for the robust measurement of ellipticity and shear.

  7. Maximum utilization of women's potentials.

    Science.gov (United States)

    1998-01-01

    Balayan's Municipal Center for Women was created to recognize women's role in the family and community in nation-building; to support the dignity and integrity of all people, especially women, and fight against rape, incest, wife beating, sexual harassment, and sexual discrimination; to empower women through education; to use women as equal partners in achieving progress; to end gender bias and discrimination, and improve women's status; and to enact progressive legal and moral change in favor of women and women's rights. The organization's functions in the following areas are described: education and information dissemination, community organizing, the provision of economic and livelihood assistance, women's counseling, health assistance, legislative advocacy and research, legal assistance, women's networking, and monitoring and evaluation.

  8. Informed, advance refusals of treatment by people with severe mental illness in a randomised controlled trial of joint crisis plans: demand, content and correlates.

    Science.gov (United States)

    Henderson, Claire; Farrelly, Simone; Flach, Clare; Borschmann, Rohan; Birchwood, Max; Thornicroft, Graham; Waheed, Waquas; Szmukler, George

    2017-11-24

    In the UK, crisis planning for mental health care should acknowledge the right to make an informed advance treatment refusal under the Mental Capacity Act 2005. Our aims were to estimate the demand for such treatment refusals within a sample of service users who had had a recent hospital admission for psychosis or bipolar disorder, and to examine the relationship between refusals, and service user characteristics. To identify refusals we conducted content analysis of Joint Crisis Plans, which are plans formulated by service users and their clinical team with involvement from an external facilitator, and routine care plans in sub-samples from a multi-centre randomised controlled trial of Joint Crisis Plans (plus routine mental health care) versus routine care alone (CRIMSON) in England. Factors hypothesised to be associated with refusals were identified using the trial data collected through baseline interviews of service users and clinicians and collection of routine clinical data. Ninety-nine of 221 (45%) of the Joint Crisis Plans contained a treatment refusal compared to 10 of 424 (2.4%) baseline routine care plans. No Joint Crisis Plans recorded disagreement with refusals on the part of clinicians. Among those with completed Joint Crisis Plans, adjusted analyses indicated a significant association between treatment refusals and perceived coercion at baseline (odds ratio = 1.21, 95% CI 1.02-1.43), but not with baseline working alliance or a past history of involuntary admission. We demonstrated significant demand for written treatment refusals in line with the Mental Capacity Act 2005, which had not previously been elicited by the process of treatment planning. Future treatment/crisis plans should incorporate the opportunity for service users to record a treatment refusal during the drafting of such plans. ISRCTN11501328 Registered 13th March 2008.

  9. Compiling standardized information from clinical practice: using content analysis and ICF Linking Rules in a goal-oriented youth rehabilitation program.

    Science.gov (United States)

    Lustenberger, Nadia A; Prodinger, Birgit; Dorjbal, Delgerjargal; Rubinelli, Sara; Schmitt, Klaus; Scheel-Sailer, Anke

    2017-09-23

    To illustrate how routinely written narrative admission and discharge reports of a rehabilitation program for eight youths with chronic neurological health conditions can be transformed to the International Classification of Functioning, Disability and Health. First, a qualitative content analysis was conducted by building meaningful units from text segments of the reports assigned to the five elements of the Rehab-Cycle®: goal; assessment; assignment; intervention; evaluation. Second, the meaningful units were then linked to the ICF using the refined ICF Linking Rules. With the first step of transformation, the emphasis of the narrative reports changed to a process-oriented interdisciplinary layout, revealing three thematic blocks of goals: mobility, self-care, and mental and social functions. The 95 unique linked ICF codes could be grouped into clinically meaningful, goal-centered sets of ICF codes. Between the two independent linkers, the agreement rate improved after complementing the rules with additional agreements. The ICF Linking Rules can be used to compile standardized health information from narrative reports, provided the reports are structured beforehand. The process requires time and expertise. To implement the ICF into common practice, the findings provide a starting point for reporting on rehabilitation that builds upon existing practice and adheres to international standards. Implications for Rehabilitation This study provides evidence that routinely collected health information from rehabilitation practice can be transformed to the International Classification of Functioning, Disability and Health by using the "ICF Linking Rules"; however, this requires time and expertise. The Rehab-Cycle®, including assessments, assignments, goal setting, interventions and goal evaluation, serves as a feasible framework for structuring this rehabilitation program and ensures that the complexity of local practice is appropriately reflected. The refined "ICF Linking Rules" lead to a standardized

  10. Genomes: At the edge of chaos with maximum information capacity

    Science.gov (United States)

    Kong, Sing-Guan; Chen, Hong-Da; Torda, Andrew; Lee, H. C.

    2016-12-01

    We propose an order index, ϕ, which quantifies the notion of “life at the edge of chaos” when applied to genome sequences. It maps genomes to a number from 0 (random and of infinite length) to 1 (fully ordered) and applies regardless of sequence length and base composition. The 786 complete genomic sequences in GenBank were found to have ϕ values in a very narrow range, 0.037 ± 0.027. We show this implies that genomes are halfway towards being completely random, namely, at the edge of chaos. We argue that this narrow range represents the neighborhood of a fixed-point in the space of sequences, and genomes are driven there by the dynamics of a robust, predominantly neutral evolution process.
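
    The precise definition of the order index ϕ is given in the paper; as a loose, illustrative stand-in only (not the authors' ϕ), one can compare a sequence's k-mer entropy with its maximum possible value, which likewise maps random sequences near 0 and highly repetitive ones toward 1:

      import math
      from collections import Counter

      def toy_order_index(seq, k=2):
          """1 minus the normalized Shannon entropy of k-mer frequencies (toy analogue only)."""
          kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
          counts = Counter(kmers)
          n = len(kmers)
          H = -sum(c / n * math.log2(c / n) for c in counts.values())
          return 1.0 - H / (2 * k)       # 2k bits is the maximum entropy for DNA k-mers

      print(toy_order_index("ACGTACGTACGTACGT"))   # repetitive, so well above a random sequence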

  11. Two-dimensional maximum entropy image restoration

    International Nuclear Information System (INIS)

    Brolley, J.E.; Lazarus, R.B.; Suydam, B.R.; Trussell, H.J.

    1977-07-01

    An optical check problem was constructed to test P log P maximum entropy restoration of an extremely distorted image. Useful recovery of the original image was obtained. Comparison with maximum a posteriori restoration is made. 7 figures

  12. Local Content

    CSIR Research Space (South Africa)

    Gibberd, Jeremy

    2016-10-01

    Full Text Available Local content refers to materials and products made in a country as opposed to those that are imported. There is an increasing interest in the concept of local content as a means of supporting local economies and providing jobs (Belderbos & Sleuwaegen...

  13. Receiver function estimated by maximum entropy deconvolution

    Institute of Scientific and Technical Information of China (English)

    吴庆举; 田小波; 张乃铃; 李卫平; 曾融生

    2003-01-01

    Maximum entropy deconvolution is presented to estimate receiver function, with the maximum entropy as the rule to determine auto-correlation and cross-correlation functions. The Toeplitz equation and Levinson algorithm are used to calculate the iterative formula of error-predicting filter, and receiver function is then estimated. During extrapolation, reflective coefficient is always less than 1, which keeps maximum entropy deconvolution stable. The maximum entropy of the data outside window increases the resolution of receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective method to measure receiver function in time-domain.
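
    The Levinson recursion referred to here, which solves the Toeplitz normal equations for a prediction-error filter from an autocorrelation sequence, can be sketched as follows (illustrative only, not the authors' deconvolution code); the reflection coefficient staying below 1 in magnitude is what keeps the recursion stable, as the abstract points out.

      import numpy as np

      def levinson_durbin(r, order):
          """Solve the Toeplitz system for linear-prediction coefficients a and the
          prediction-error power e, given autocorrelations r[0..order]."""
          a = np.zeros(order)
          e = r[0]
          for m in range(order):
              k = (r[m + 1] - np.dot(a[:m], r[m:0:-1])) / e   # reflection coefficient
              a[:m] = a[:m] - k * a[:m][::-1]
              a[m] = k
              e *= (1.0 - k * k)                              # |k| < 1 keeps e positive
          return a, e

      # Toy usage: autocorrelation of a short synthetic signal.
      x = np.sin(0.3 * np.arange(200)) + 0.1 * np.random.default_rng(0).normal(size=200)
      r = np.array([np.dot(x[:len(x) - lag], x[lag:]) for lag in range(4)]) / len(x)
      print(levinson_durbin(r, 3))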

  14. Determination of the maximum-depth to potential field sources by a maximum structural index method

    Science.gov (United States)

    Fedi, M.; Florio, G.

    2013-01-01

    A simple and fast determination of the limiting depth to the sources may represent a significant help to the data interpretation. To this end we explore the possibility of determining those source parameters shared by all the classes of models fitting the data. One approach is to determine the maximum depth-to-source compatible with the measured data, by using for example the well-known Bott-Smith rules. These rules involve only the knowledge of the field and its horizontal gradient maxima, and are independent from the density contrast. Thanks to the direct relationship between structural index and depth to sources we work out a simple and fast strategy to obtain the maximum depth by using the semi-automated methods, such as Euler deconvolution or depth-from-extreme-points method (DEXP). The proposed method consists in estimating the maximum depth as the one obtained for the highest allowable value of the structural index (Nmax). Nmax may be easily determined, since it depends only on the dimensionality of the problem (2D/3D) and on the nature of the analyzed field (e.g., gravity field or magnetic field). We tested our approach on synthetic models against the results obtained by the classical Bott-Smith formulas and the results are in fact very similar, confirming the validity of this method. However, while Bott-Smith formulas are restricted to the gravity field only, our method is applicable also to the magnetic field and to any derivative of the gravity and magnetic field. Our method yields a useful criterion to assess the source model based on the (∂f/∂x)max/fmax ratio. The usefulness of the method in real cases is demonstrated for a salt wall in the Mississippi basin, where the estimation of the maximum depth agrees with the seismic information.

  15. The discrete maximum principle for Galerkin solutions of elliptic problems

    Czech Academy of Sciences Publication Activity Database

    Vejchodský, Tomáš

    2012-01-01

    Roč. 10, č. 1 (2012), s. 25-43 ISSN 1895-1074 R&D Projects: GA AV ČR IAA100760702 Institutional research plan: CEZ:AV0Z10190503 Keywords : discrete maximum principle * monotone methods * Galerkin solution Subject RIV: BA - General Mathematics Impact factor: 0.405, year: 2012 http://www.springerlink.com/content/x73624wm23x4wj26

  16. On an Objective Basis for the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    David J. Miller

    2015-01-01

    Full Text Available In this letter, we elaborate on some of the issues raised by a recent paper by Neapolitan and Jiang concerning the maximum entropy (ME) principle and alternative principles for estimating probabilities consistent with known, measured constraint information. We argue that the ME solution for the “problematic” example introduced by Neapolitan and Jiang has a stronger objective basis, rooted in results from information theory, than their alternative proposed solution. We also raise some technical concerns about the Bayesian analysis in their work, which was used to independently support their alternative to the ME solution. The letter concludes by noting some open problems involving maximum entropy statistical inference.

  17. Maximum Power from a Solar Panel

    Directory of Open Access Journals (Sweden)

    Michael Miller

    2010-01-01

    Full Text Available Solar energy has become a promising alternative to conventional fossil fuel sources. Solar panels are used to collect solar radiation and convert it into electricity. One of the techniques used to maximize the effectiveness of this energy alternative is to maximize the power output of the solar collector. In this project the maximum power is calculated by determining the voltage and the current of maximum power. These quantities are determined by finding the maximum value for the equation for power using differentiation. After the maximum values are found for each time of day, each individual quantity, voltage of maximum power, current of maximum power, and maximum power is plotted as a function of the time of day.
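
    A numerical sketch of the same idea, locating the point where dP/dV = 0 on a single-diode cell model; all parameter values below are assumptions chosen for illustration, not data from the project.

      import numpy as np

      # Illustrative single-diode model: I = I_L - I_0 * (exp(V / V_T) - 1).
      I_L, I_0, V_T = 5.0, 1e-9, 0.0257 * 72    # photocurrent [A], saturation current [A], thermal voltage x cells [V]

      V = np.linspace(0.0, 45.0, 10000)
      I = I_L - I_0 * (np.exp(V / V_T) - 1.0)
      P = V * I

      k = np.argmax(P)                          # same optimum as setting dP/dV = 0
      print(f"V_mp = {V[k]:.2f} V, I_mp = {I[k]:.2f} A, P_max = {P[k]:.1f} W")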

  18. Probabilistic maximum-value wind prediction for offshore environments

    DEFF Research Database (Denmark)

    Staid, Andrea; Pinson, Pierre; Guikema, Seth D.

    2015-01-01

    Statistical models are developed to predict the full distribution of the maximum-value wind speeds in a 3 h interval. We take a detailed look at the performance of linear models, generalized additive models and multivariate adaptive regression splines models using meteorological covariates such as gust speed, wind speed, convective available potential energy, Charnock, mean sea-level pressure and temperature, as given by the European Center for Medium-Range Weather Forecasts forecasts. The models are trained to predict the mean value of maximum wind speed, and the residuals from training the models are used to develop the full probabilistic distribution of maximum wind speed. Knowledge of the maximum wind speed for an offshore location within a given period can inform decision-making regarding turbine operations, planned maintenance operations and power grid scheduling in order to improve safety and reliability...

  19. Fast crawling methods of exploring content distributed over large graphs

    KAUST Repository

    Wang, Pinghui

    2018-03-15

    Despite recent effort to estimate topology characteristics of large graphs (e.g., online social networks and peer-to-peer networks), little attention has been given to developing a formal crawling methodology to characterize the vast amount of content distributed over these networks. Due to the large-scale nature of these networks and a limited query rate imposed by network service providers, exhaustively crawling and enumerating content maintained by each vertex is computationally prohibitive. In this paper, we show how one can obtain content properties by crawling only a small fraction of vertices and collecting their content. We first show that when sampling is naively applied, this can produce a huge bias in content statistics (i.e., average number of content replicas). To remove this bias, one may use maximum likelihood estimation to estimate content characteristics. However, our experimental results show that this straightforward method requires sampling most vertices to obtain accurate estimates. To address this challenge, we propose two efficient estimators: special copy estimator (SCE) and weighted copy estimator (WCE) to estimate content characteristics using available information in sampled content. SCE uses the special content copy indicator to compute the estimate, while WCE derives the estimate based on meta-information in sampled vertices. We conduct experiments on a variety of real-world and synthetic datasets, and the results show that WCE and SCE are cost effective and also “asymptotically unbiased”. Our methodology provides a new tool for researchers to efficiently query content distributed in large-scale networks.

  20. Maximum permissible voltage of YBCO coated conductors

    Energy Technology Data Exchange (ETDEWEB)

    Wen, J.; Lin, B.; Sheng, J.; Xu, J.; Jin, Z. [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Hong, Z., E-mail: zhiyong.hong@sjtu.edu.cn [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Wang, D.; Zhou, H.; Shen, X.; Shen, C. [Qingpu Power Supply Company, State Grid Shanghai Municipal Electric Power Company, Shanghai (China)

    2014-06-15

    Highlights: • We examine the maximum permissible voltage of three kinds of tapes. • We examine the relationship between quenching duration and maximum permissible voltage. • Continuous I_c degradation occurs under repetitive quenching when tapes reach the maximum permissible voltage. • The relationship between maximum permissible voltage, resistance and temperature is examined. - Abstract: A superconducting fault current limiter (SFCL) can reduce short-circuit currents in an electrical power system. One of the most important things in developing an SFCL is to find the maximum permissible voltage of each limiting element. The maximum permissible voltage is defined as the maximum voltage per unit length at which the YBCO coated conductors (CC) do not suffer from critical current (I_c) degradation or burnout. In this research, the duration of the quenching process is varied and the voltage is raised until I_c degradation or burnout happens. The YBCO coated conductors tested in the experiment are from American Superconductor (AMSC) and Shanghai Jiao Tong University (SJTU). As the quenching duration increases, the maximum permissible voltage of the CC decreases. When the quenching duration is 100 ms, the maximum permissible voltages of the SJTU CC, the 12 mm AMSC CC and the 4 mm AMSC CC are 0.72 V/cm, 0.52 V/cm and 1.2 V/cm, respectively. Based on the results for these samples, the whole length of the CCs used in the design of an SFCL can be determined.

  1. Paving the road to maximum productivity.

    Science.gov (United States)

    Holland, C

    1998-01-01

    "Job security" is an oxymoron in today's environment of downsizing, mergers, and acquisitions. Workers find themselves living by new rules in the workplace that they may not understand. How do we cope? It is the leader's charge to take advantage of this chaos and create conditions under which his or her people can understand the need for change and come together with a shared purpose to effect that change. The clinical laboratory at Arkansas Children's Hospital has taken advantage of this chaos to down-size and to redesign how the work gets done to pave the road to maximum productivity. After initial hourly cutbacks, the workers accepted the cold, hard fact that they would never get their old world back. They set goals to proactively shape their new world through reorganizing, flexing staff with workload, creating a rapid response laboratory, exploiting information technology, and outsourcing. Today the laboratory is a lean, productive machine that accepts change as a way of life. We have learned to adapt, trust, and support each other as we have journeyed together over the rough roads. We are looking forward to paving a new fork in the road to the future.

  2. New shower maximum trigger for electrons and photons at CDF

    International Nuclear Information System (INIS)

    Amidei, D.; Burkett, K.; Gerdes, D.; Miao, C.; Wolinski, D.

    1994-01-01

    For the 1994 Tevatron collider run, CDF has upgraded the electron and photon trigger hardware to make use of shower position and size information from the central shower maximum detector. For electrons, the upgrade has resulted in a 50% reduction in backgrounds while retaining approximately 90% of the signal. The new trigger also eliminates the background to photon triggers from single-phototube spikes

  3. New shower maximum trigger for electrons and photons at CDF

    International Nuclear Information System (INIS)

    Gerdes, D.

    1994-08-01

    For the 1994 Tevatron collider run, CDF has upgraded the electron and photon trigger hardware to make use of shower position and size information from the central shower maximum detector. For electrons, the upgrade has resulted in a 50% reduction in backgrounds while retaining approximately 90% of the signal. The new trigger also eliminates the background to photon triggers from single-phototube discharge

  4. Current opinion about maximum entropy methods in Moessbauer spectroscopy

    International Nuclear Information System (INIS)

    Szymanski, K

    2009-01-01

    Current opinion about Maximum Entropy Methods in Moessbauer Spectroscopy is presented. The most important advantage offered by the method is correct data processing under circumstances of incomplete information. A disadvantage is the sophisticated algorithm and the need to adapt it to specific problems.

  5. Multilevel maximum likelihood estimation with application to covariance matrices

    Czech Academy of Sciences Publication Activity Database

    Turčičová, Marie; Mandel, J.; Eben, Kryštof

    Published online: 23 January ( 2018 ) ISSN 0361-0926 R&D Projects: GA ČR GA13-34856S Institutional support: RVO:67985807 Keywords : Fisher information * High dimension * Hierarchical maximum likelihood * Nested parameter spaces * Spectral diagonal covariance model * Sparse inverse covariance model Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.311, year: 2016

  6. The Effect of Web Assisted Learning with Emotional Intelligence Content on Students' Information about Energy Saving, Attitudes towards Environment and Emotional Intelligence

    Science.gov (United States)

    Ercan, Orhan; Ural, Evrim; Köse, Sinan

    2017-01-01

    For a sustainable world, it is very important for students to develop positive environmental attitudes and to have awareness of energy use. The study aims to investigate the effect of web assisted instruction with emotional intelligence content on 8th grade students' emotional intelligence, attitudes towards environment and energy saving, academic…

  7. Precise charge density studies by maximum entropy method

    CERN Document Server

    Takata, M

    2003-01-01

    For the production research and development of nanomaterials, their structural information is indispensable. Recently, a sophisticated analytical method, which is based on information theory, the Maximum Entropy Method (MEM) using synchrotron radiation powder data, has been successfully applied to determine precise charge densities of metallofullerenes and nanochannel microporous compounds. The results revealed various endohedral natures of metallofullerenes and one-dimensional array formation of adsorbed gas molecules in nanochannel microporous compounds. The concept of MEM analysis was also described briefly. (author)

  8. Missing data methods for dealing with missing items in quality of life questionnaires. A comparison by simulation of personal mean score, full information maximum likelihood, multiple imputation, and hot deck techniques applied to the SF-36 in the French 2003 decennial health survey.

    Science.gov (United States)

    Peyre, Hugo; Leplège, Alain; Coste, Joël

    2011-03-01

    Missing items are common in quality of life (QoL) questionnaires and present a challenge for research in this field. It remains unclear which of the various methods proposed to deal with missing data performs best in this context. We compared personal mean score, full information maximum likelihood, multiple imputation, and hot deck techniques using various realistic simulation scenarios of item missingness in QoL questionnaires constructed within the framework of classical test theory. Samples of 300 and 1,000 subjects were randomly drawn from the 2003 INSEE Decennial Health Survey (of 23,018 subjects representative of the French population and having completed the SF-36) and various patterns of missing data were generated according to three different item non-response rates (3, 6, and 9%) and three types of missing data (Little and Rubin's "missing completely at random," "missing at random," and "missing not at random"). The missing data methods were evaluated in terms of accuracy and precision for the analysis of one descriptive and one association parameter for three different scales of the SF-36. For all item non-response rates and types of missing data, multiple imputation and full information maximum likelihood appeared superior to the personal mean score and especially to hot deck in terms of accuracy and precision; however, the use of the personal mean score was associated with only negligible bias, and it appears nonetheless appropriate for dealing with items missing from completed SF-36 questionnaires in most situations of routine use. These results can reasonably be extended to other questionnaires constructed according to classical test theory.
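
    For reference, the personal mean score technique itself is easy to sketch: each missing item is replaced by the respondent's own mean over the items he or she did answer, subject to a minimum-completion rule. The toy data and the threshold of three answered items below are assumptions for illustration.

      import numpy as np

      # Hypothetical item responses for one SF-36-style scale (rows = respondents,
      # columns = items); np.nan marks a missing item.
      items = np.array([
          [3.0, 4.0, np.nan, 5.0, 4.0],
          [2.0, np.nan, np.nan, 3.0, 2.0],
          [5.0, 5.0, 4.0, np.nan, 5.0],
      ])

      def personal_mean_score(x, min_answered=3):
          """Impute each missing item with the respondent's mean over answered items,
          provided at least `min_answered` items were completed (assumed rule)."""
          x = x.copy()
          for row in x:
              answered = ~np.isnan(row)
              if answered.sum() >= min_answered:
                  row[~answered] = row[answered].mean()
          return x

      print(personal_mean_score(items))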

  9. Concrete workability and fibre content

    OpenAIRE

    Vikan, Hedda

    2007-01-01

    Research report Parameters influencing the workability of fibre concrete and maximum fibre content are given in this state of the art report along with the range of fibre types available on today’s market. The study reveales that new placing techniques and production methods are crucial in order to increase fibre content and concrete strength. Achieving the same mechanical properties as traditionally reinforced concrete will probably also demand changes of the matrix. Finally, reco...

  10. Revealing the Maximum Strength in Nanotwinned Copper

    DEFF Research Database (Denmark)

    Lu, L.; Chen, X.; Huang, Xiaoxu

    2009-01-01

    We investigated the maximum strength of nanotwinned copper samples with different twin thicknesses, in which strengthening is governed by twin boundary–related processes. We found that the strength increases with decreasing twin thickness, reaching a maximum at 15 nanometers, followed by a softening at smaller values that is accompanied by enhanced...

  11. Modelling maximum canopy conductance and transpiration in ...

    African Journals Online (AJOL)

    There is much current interest in predicting the maximum amount of water that can be transpired by Eucalyptus trees. It is possible that industrial waste water may be applied as irrigation water to eucalypts and it is important to predict the maximum transpiration rates of these plantations in an attempt to dispose of this ...

  12. A discussion about maximum uranium concentration in digestion solution of U3O8 type uranium ore concentrate

    International Nuclear Information System (INIS)

    Xia Dechang; Liu Chao

    2012-01-01

    On the basis of discussing the influence of each single factor on the maximum uranium concentration in the digestion solution, the degree of influence of factors such as U content, H2O content and the mass ratio of P to U was compared and analyzed. The results indicate that the maximum uranium concentration in the digestion solution is directly proportional to the U content: when the U content increases by 1%, the maximum uranium concentration increases by 4.8%-5.7%. It is inversely related to the H2O content: the maximum uranium concentration decreases by 46.1-55.2 g/L when the H2O content increases by 1%. It is also inversely related to the mass ratio of P to U: the maximum uranium concentration decreases by 116.0-181.0 g/L when the mass ratio of P to U increases by 0.1%. When the U content equals 62.5% and the influence of the mass ratio of P to U is not considered, the maximum uranium concentration in the digestion solution equals 1 578 g/L; when the mass ratio of P to U equals 0.35%, the maximum uranium concentration decreases to 716 g/L, a decrease of 54.6%. The mass ratio of P to U in U3O8 type uranium ore concentrate is therefore the main controlling factor. (authors)

  13. Personalized professional content recommendation

    Science.gov (United States)

    Xu, Songhua

    2015-10-27

    A personalized content recommendation system includes a client interface configured to automatically monitor a user's information data stream transmitted on the Internet. A hybrid contextual behavioral and collaborative personal interest inference engine resident to a non-transient media generates automatic predictions about the interests of individual users of the system. A database server retains the user's personal interest profile based on a plurality of monitored information. The system also includes a server programmed to filter items in an incoming information stream with the personal interest profile and is further programmed to identify only those items of the incoming information stream that substantially match the personal interest profile.
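
    A minimal sketch of the filtering step such a system performs, matching incoming items against a stored interest profile; this TF-IDF/cosine stand-in is purely illustrative and is not the patented hybrid contextual-behavioral and collaborative inference engine.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      # Hypothetical interest profile built from items the user previously engaged with.
      profile_docs = ["deep learning for genomics", "graph neural networks tutorial"]
      incoming = ["new transformer model for protein structure",
                  "city council approves parking budget",
                  "survey of graph neural networks"]

      vec = TfidfVectorizer().fit(profile_docs + incoming)
      profile = vec.transform([" ".join(profile_docs)])          # the stored interest profile
      scores = cosine_similarity(vec.transform(incoming), profile).ravel()
      for doc, s in sorted(zip(incoming, scores), key=lambda t: -t[1]):
          print(f"{s:.2f}  {doc}")                               # higher = better match to the profile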

  14. Discontinuity of maximum entropy inference and quantum phase transitions

    International Nuclear Information System (INIS)

    Chen, Jianxin; Ji, Zhengfeng; Yu, Nengkun; Zeng, Bei; Li, Chi-Kwong; Poon, Yiu-Tung; Shen, Yi; Zhou, Duanlu

    2015-01-01

    In this paper, we discuss the connection between two genuinely quantum phenomena—the discontinuity of quantum maximum entropy inference and quantum phase transitions at zero temperature. It is shown that the discontinuity of the maximum entropy inference of local observable measurements signals the non-local type of transitions, where local density matrices of the ground state change smoothly at the transition point. We then propose to use the quantum conditional mutual information of the ground state as an indicator to detect the discontinuity and the non-local type of quantum phase transitions in the thermodynamic limit. (paper)

  15. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
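
    A toy analogue of what such an identifier does, maximizing a likelihood over an unknown parameter of a dynamic system given noisy measurements; the first-order decay model, noise level and known x0 and sigma are assumptions, not MXLKID's actual problem setup.

      import numpy as np
      from scipy.optimize import minimize_scalar

      # Hypothetical noisy measurements of a first-order system x(t) = x0 * exp(-k t).
      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 5.0, 50)
      x0, k_true, sigma = 2.0, 0.8, 0.05
      y = x0 * np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)

      def neg_log_likelihood(k):
          resid = y - x0 * np.exp(-k * t)
          return 0.5 * np.sum(resid ** 2) / sigma ** 2   # Gaussian noise, known sigma and x0

      res = minimize_scalar(neg_log_likelihood, bounds=(0.01, 5.0), method="bounded")
      print(f"maximum-likelihood estimate of k: {res.x:.3f} (true value {k_true})")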

  16. Learning Content Management Systems

    Directory of Open Access Journals (Sweden)

    Tache JURUBESCU

    2008-01-01

    Full Text Available The paper explains the evolution of e-Learning and related concepts and tools and its connection with other concepts such as Knowledge Management, Human Resources Management, Enterprise Resource Planning, and Information Technology. The paper also distinguishes Learning Content Management Systems from Learning Management Systems and from Content Management Systems used for general web-based content. The newest Learning Content Management Systems, although expensive and as yet little implemented, are among the best tools that help us cope with the realities of the 21st century as far as learning is concerned. The debate over how beneficial one or another system is for an organization can be driven by the costs involved, the efficiency envisaged, and the availability of the product on the market.

  18. Maximum neutron flux in thermal reactors

    International Nuclear Information System (INIS)

    Strugar, P.V.

    1968-12-01

    The direct approach to the problem is to calculate the spatial distribution of fuel concentration in the reactor core using the condition of maximum neutron flux while complying with thermal limitations. This paper proves that the problem can be solved by applying variational calculus, i.e. by using Pontryagin's maximum principle. The mathematical model of the reactor core is based on two-group neutron diffusion theory with some simplifications which make it appropriate for treatment by the maximum principle. The solution for the optimum distribution of fuel concentration in the reactor core is obtained in explicit analytical form. The reactor critical dimensions are roots of a system of nonlinear equations, and verification of the optimum conditions can be done only for specific examples

  19. Maximum allowable load on wheeled mobile manipulators

    International Nuclear Information System (INIS)

    Habibnejad Korayem, M.; Ghariblu, H.

    2003-01-01

    This paper develops a computational technique for finding the maximum allowable load of a mobile manipulator during a given trajectory. The maximum allowable loads which can be achieved by a mobile manipulator during a given trajectory are limited by a number of factors; the dynamic properties of the mobile base and the mounted manipulator, their actuator limitations and the additional constraints applied to resolve the redundancy are probably the most important factors. To resolve the extra degrees of freedom introduced by the base mobility, additional constraint functions are proposed directly in the task space of the mobile manipulator. Finally, in two numerical examples involving a two-link planar manipulator mounted on a differentially driven mobile base, application of the method to determining the maximum allowable load is verified. The simulation results demonstrate that the maximum allowable load on a desired trajectory does not have a unique value and depends directly on the additional constraint functions which are applied to resolve the motion redundancy

  20. Maximum phytoplankton concentrations in the sea

    DEFF Research Database (Denmark)

    Jackson, G.A.; Kiørboe, Thomas

    2008-01-01

    A simplification of plankton dynamics using coagulation theory provides predictions of the maximum algal concentration sustainable in aquatic systems. These predictions have previously been tested successfully against results from iron fertilization experiments. We extend the test to data collect...

  1. Maximum total organic carbon limit for DWPF melter feed

    International Nuclear Information System (INIS)

    Choi, A.S.

    1995-01-01

    DWPF recently decided to control the potential flammability of melter off-gas by limiting the total carbon content in the melter feed and maintaining adequate conditions for combustion in the melter plenum. With this new strategy, all the LFL (lower flammability limit) analyzers and associated interlocks and alarms were removed from both the primary and backup melter off-gas systems. Subsequently, D. Iverson of DWPF-T&E requested that SRTC determine the maximum allowable total organic carbon (TOC) content in the melter feed which can be implemented as part of the Process Requirements for melter feed preparation (PR-S04). The maximum TOC limit thus determined in this study was about 24,000 ppm on an aqueous slurry basis. At TOC levels below this, the peak concentration of combustible components in the quenched off-gas will not exceed 60 percent of the LFL during off-gas surges of magnitudes up to three times nominal, provided that the melter plenum temperature and the air purge rate to the BUFC are monitored and controlled above 650 degrees C and 220 lb/hr, respectively. Appropriate interlocks should discontinue the feeding when one or both of these conditions are not met. Both the magnitude and duration of an off-gas surge have a major impact on the maximum TOC limit, since they directly affect the melter plenum temperature and combustion. Although the data obtained during recent DWPF melter startup tests showed that the peak magnitude of a surge can be greater than three times nominal, the observed duration was considerably shorter, on the order of several seconds. The long surge duration assumed in this study has a greater impact on the plenum temperature than the peak magnitude, thus making the maximum TOC estimate conservative. Two models were used to make the necessary calculations to determine the TOC limit

  2. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, structures of which depend only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures, complexity of which would depend on N.

  3. A decade of research on health content in the media: the focus on health challenges and sociocultural context and attendant informational and ideological problems.

    Science.gov (United States)

    Kline, Kimberly N

    2006-01-01

    There is a burgeoning interest in the health and illness content of popular media in the domains of advertising, journalism, and entertainment. This article reviews the past 10 years of this research, describing the relationship between the health topics addressed in the research, the shifting focus of concerns about the media, and, ultimately, the variation in problems for health promotion. I suggest that research attending to topics related to bodily health challenges focused on whether popular media accurately or appropriately represented health challenges. The implication was that there is some consensus about more right or wrong, complete or incomplete ways of representing an issue; the problem was that the media are generally wrong. Alternatively, research addressing topics related to sociocultural context issues focused on how certain interests are privileged in the media. The implication was that competing groups are making claims on the system, but the problem was that popular media marginalizes certain interests. In short, popular media is not likely to facilitate understandings helpful to individuals coping with health challenges and is likely to perpetuate social and political power differentials with regard to health-related issues. I conclude by offering some possibilities for future health media content research.

  4. Development of a Rating Tool for Mobile Cancer Apps: Information Analysis and Formal and Content-Related Evaluation of Selected Cancer Apps.

    Science.gov (United States)

    Böhme, Cathleen; von Osthoff, Marc Baron; Frey, Katrin; Hübner, Jutta

    2017-08-17

    Mobile apps are offered in large numbers and vary in quality. The aim of this article was to develop a rating tool based on formal and content-related criteria for the assessment of cancer apps and to test its applicability. After a thorough analysis of the literature, we developed a specific rating tool for cancer apps based on the MARS (Mobile App Rating Scale) and a rating tool for cancer websites. This instrument was applied to apps freely available in stores and focusing on a cancer topic. Ten apps were rated on the basis of 22 criteria. Sixty percent of the apps (6/10) were rated poor and insufficient. The ratings by different scientists were homogeneous. The good apps had reliable sources, were regularly updated, and stated a concrete intent/purpose in their app description. In contrast, the apps that were rated poor made no distinction between scientific content and advertisement. In some cases, there was no imprint to identify the provider. As apps of poor quality can give misinformation and lead to wrong treatment decisions, efforts have to be made to increase the usage of high-quality apps. Certification would help cancer patients to identify reliable apps, yet acceptance of a certification system must be backed up.

  5. Ionospheric/protonospheric electron content studies using ATS-6

    International Nuclear Information System (INIS)

    Hajeb-Hosseinieh, H.; Kersley, L.; Edwards, K.J.

    1978-01-01

    Measurements of ionospheric and protonospheric contents obtained at Aberystwyth from observations of the ATS-6 satellite radio beacon are reported. The monthly median diurnal behavior shows protonospheric contributions of approximately 15 to 20% to the total content along the ray path by day, rising to a predawn maximum of 35% in summer and more than 40% in winter. The protonospheric results are shown to be typical of those expected from other European stations and differences from earlier American measurements are explained in terms of ionospheric interactions in the conjugate hemisphere. The temporal gradients of protonospheric content provide information on the net integrated ionospheric/protonospheric plasma fluxes and the results obtained indicate the importance of plasma exchange with both local and conjugate ionospheres

  6. Maximum-Entropy Inference with a Programmable Annealer

    Science.gov (United States)

    Chancellor, Nicholas; Szoke, Szilard; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A.

    2016-03-01

    Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this maximises the likelihood that the solution is correct. The maximum entropy solution on the other hand takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem which we simulate as a random Ising model in a field. We show experimentally that finite temperature maximum entropy decoding can give slightly better bit-error-rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition.
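
    As a rough, brute-force illustration of the comparison described above (not the annealer experiment itself), the sketch below contrasts ground-state (maximum-likelihood) decoding with finite-temperature Boltzmann-marginal decoding on a tiny random-field Ising chain. The chain length, coupling strength, noise level and temperature are invented for illustration.

```python
# Brute-force contrast of ground-state (maximum-likelihood) decoding with
# finite-temperature marginal decoding for a tiny Ising chain.
# The chain, couplings and noisy fields below are made up for illustration.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 10
truth = rng.choice([-1, 1], size=n)          # planted bit string
J = 0.5                                      # assumed ferromagnetic coupling strength
h = truth + 0.8 * rng.normal(size=n)         # noisy local evidence for each bit

def energy(s):
    """Ising energy: couplings along the chain plus local fields."""
    return -J * np.sum(s[:-1] * s[1:]) - np.sum(h * s)

states = np.array(list(itertools.product([-1, 1], repeat=n)))
energies = np.array([energy(s) for s in states])

# Maximum-likelihood decoding: the single lowest-energy configuration.
ml_decode = states[np.argmin(energies)]

# Finite-temperature decoding: Boltzmann-weighted bit marginals, then sign.
T = 1.0
weights = np.exp(-(energies - energies.min()) / T)
marginals = weights @ states / weights.sum()
maxent_decode = np.sign(marginals)

print("bit errors, ML      :", np.sum(ml_decode != truth))
print("bit errors, finite-T:", np.sum(maxent_decode != truth))
```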

  7. Reduced oxygen at high altitude limits maximum size.

    Science.gov (United States)

    Peck, L S; Chapelle, G

    2003-11-07

    The trend towards large size in marine animals with latitude, and the existence of giant marine species in polar regions, have long been recognized but remained enigmatic until a recent study showed it to be an effect of increased oxygen availability in sea water at low temperature. The effect was apparent in data from 12 sites worldwide because of variations in water oxygen content controlled by differences in temperature and salinity. Another major physical factor affecting oxygen content in aquatic environments is reduced pressure at high altitude. Suitable data from high-altitude sites are very scarce. However, an exceptionally rich crustacean collection, which remains largely undescribed, was obtained by the British 1937 expedition from Lake Titicaca on the border between Peru and Bolivia in the Andes at an altitude of 3809 m. We show that in Lake Titicaca the maximum length of amphipods is 2-4 times smaller than at other low-salinity sites (the Caspian Sea and Lake Baikal).

  8. What motivates consumers to re-tweet brand content? The impact of information, emotion, and traceability on pass-along behavior

    NARCIS (Netherlands)

    Araujo, T.; Neijens, P.; Vliegenthart, R.

    2015-01-01

    How do certain cues influence pass-along behavior (re-Tweeting) of brand messages on Twitter? Analyzing 19,343 global brand messages over a three-year period, the authors of this article found that informational cues were predictors of higher levels of re-Tweeting, particularly product details and

  9. PNNL: A Supervised Maximum Entropy Approach to Word Sense Disambiguation

    Energy Technology Data Exchange (ETDEWEB)

    Tratz, Stephen C.; Sanfilippo, Antonio P.; Gregory, Michelle L.; Chappell, Alan R.; Posse, Christian; Whitney, Paul D.

    2007-06-23

    In this paper, we describe the PNNL Word Sense Disambiguation system as applied to the English All-Words task in SemEval 2007. We use a supervised learning approach, employing a large number of features and using Information Gain for dimension reduction. Our Maximum Entropy approach combined with a rich set of features produced results that are significantly better than baseline and constitute the highest F-score for the fine-grained English All-Words subtask.
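
    The general recipe named here (information-gain style feature selection followed by a maximum entropy classifier) can be sketched in a few lines. The snippet below is a generic illustration with placeholder data and a mutual-information scorer standing in for information gain; it is not the PNNL system.

```python
# Sketch: information-gain style feature selection + maximum entropy
# (multinomial logistic regression) classifier. Data are placeholders.
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 50))          # 200 training instances, 50 candidate features
y = rng.integers(0, 4, size=200)   # 4 hypothetical word senses

model = make_pipeline(
    SelectKBest(mutual_info_classif, k=10),   # keep the 10 most informative features
    LogisticRegression(max_iter=1000),        # maximum entropy classifier
)
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```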

  10. Maximum gravitational redshift of white dwarfs

    International Nuclear Information System (INIS)

    Shapiro, S.L.; Teukolsky, S.A.

    1976-01-01

    The stability of uniformly rotating, cold white dwarfs is examined in the framework of the Parametrized Post-Newtonian (PPN) formalism of Will and Nordtvedt. The maximum central density and gravitational redshift of a white dwarf are determined as functions of five of the nine PPN parameters (γ, β, ζ2, ζ3, and ζ4), the total angular momentum J, and the composition of the star. General relativity predicts that the maximum redshift is 571 km s^-1 for nonrotating carbon and helium dwarfs, but is lower for stars composed of heavier nuclei. Uniform rotation can increase the maximum redshift to 647 km s^-1 for carbon stars (the neutronization limit) and to 893 km s^-1 for helium stars (the uniform rotation limit). The redshift distribution of a larger sample of white dwarfs may help determine the composition of their cores

  11. Maximum entropy analysis of EGRET data

    DEFF Research Database (Denmark)

    Pohl, M.; Strong, A.W.

    1997-01-01

    EGRET data are usually analysed on the basis of the Maximum-Likelihood method [ma96] in a search for point sources in excess of a model for the background radiation (e.g. [hu97]). This method depends strongly on the quality of the background model, and thus may have high systematic uncertainties in regions of strong and uncertain background like the Galactic Center region. Here we show images of such regions obtained by the quantified Maximum-Entropy method. We also discuss a possible further use of MEM in the analysis of problematic regions of the sky.

  12. The Maximum Resource Bin Packing Problem

    DEFF Research Database (Denmark)

    Boyar, J.; Epstein, L.; Favrholdt, L.M.

    2006-01-01

    Usually, for bin packing problems, we try to minimize the number of bins used or in the case of the dual bin packing problem, maximize the number or total size of accepted items. This paper presents results for the opposite problems, where we would like to maximize the number of bins used...... algorithms, First-Fit-Increasing and First-Fit-Decreasing for the maximum resource variant of classical bin packing. For the on-line variant, we define maximum resource variants of classical and dual bin packing. For dual bin packing, no on-line algorithm is competitive. For classical bin packing, we find...

  13. Shower maximum detector for SDC calorimetry

    International Nuclear Information System (INIS)

    Ernwein, J.

    1994-01-01

    A prototype for the SDC end-cap (EM) calorimeter complete with a pre-shower and a shower maximum detector was tested in beams of electrons and pions at CERN by an SDC subsystem group. The prototype was manufactured from scintillator tiles and strips read out with 1 mm diameter wavelength-shifting fibers. The design and construction of the shower maximum detector are described, and results of laboratory tests on light yield and performance of the scintillator-fiber system are given. Preliminary results on energy and position measurements with the shower maximum detector in the test beam are shown. (authors). 4 refs., 5 figs

  14. The effectiveness of information and communication technology-based psychological interventions for paediatric chronic pain: protocol for a systematic review, meta-analysis and intervention content analysis.

    Science.gov (United States)

    Traynor, Angeline; Morrissey, Eimear; Egan, Jonathan; McGuire, Brian E

    2016-10-18

    Resource and geographic barriers are the commonly cited constraints preventing the uptake of psychological treatment for chronic pain management. For adults, there is some evidence to support the use of information and communication technology (ICT) as a mode of treatment delivery. However, mixed findings have been reported for the effectiveness and acceptability of psychological interventions delivered using information and communication technology for children and adolescents. This is a protocol for a review that aims to (i) evaluate the effectiveness of psychological interventions delivered using information and communication technology for children and adolescents with chronic pain and (ii) identify the intervention components and usability factors in technology-based treatments associated with behaviour change. We will conduct a systematic review to evaluate the effectiveness of psychological interventions for paediatric chronic pain delivered using ICT. We plan to directly compare ICT-based, psychological interventions with active control, treatment as usual or waiting list control conditions. This systematic review will be reported in line with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidance. Published and unpublished randomised controlled trials will be included and the literature search will comprise Ovid MEDLINE, Ovid Embase, PsycINFO and the Cochrane Library on Wiley, including CENTRAL and Cochrane Database of Systematic Reviews. Grey literature including theses, dissertations, technical and research reports will also be examined. Two review authors will independently conduct study selection, relevant data extraction and assessment of methodological quality. Risk of bias in included studies will be assessed using the Cochrane Collaboration risk of bias tool criteria. Two qualified coders will independently code behaviour change techniques according to the behaviour change taxonomy (v1) of 93 hierarchically

  15. Affinity-based, biophysical methods to detect and analyze ligand binding to recombinant proteins: matching high information content with high throughput.

    Science.gov (United States)

    Holdgate, Geoff A; Anderson, Malcolm; Edfeldt, Fredrik; Geschwindner, Stefan

    2010-10-01

    Affinity-based technologies have become impactful tools to detect, monitor and characterize molecular interactions using recombinant target proteins. This can aid the understanding of biological function by revealing mechanistic details, and even more importantly, enables the identification of new improved ligands that can modulate the biological activity of those targets in a desired fashion. The selection of the appropriate technology is a key step in that process, as each one of the currently available technologies offers a characteristic type of biophysical information about the ligand-binding event. Alongside the indisputable advantages of each of those technologies they naturally display diverse restrictions that are quite frequently related to the target system to be studied but also to the affinity, solubility and molecular size of the ligands. This paper discusses some of the theoretical and experimental aspects of the most common affinity-based methods, what type of information can be gained from each one of those approaches, and what requirements as well as limitations are expected from working with recombinant proteins on those platforms and how those can be optimally addressed.

  16. Nonsymmetric entropy and maximum nonsymmetric entropy principle

    International Nuclear Information System (INIS)

    Liu Chengshi

    2009-01-01

    Within the framework of a statistical model, the concept of nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is defined. A maximum nonsymmetric entropy principle is proved. Some important distribution laws, such as the power law, can be derived from this principle naturally. In particular, nonsymmetric entropy is more convenient than other entropies, such as Tsallis's entropy, for deriving power laws.

  17. Maximum speed of dewetting on a fiber

    NARCIS (Netherlands)

    Chan, Tak Shing; Gueudre, Thomas; Snoeijer, Jacobus Hendrikus

    2011-01-01

    A solid object can be coated by a nonwetting liquid since a receding contact line cannot exceed a critical speed. We theoretically investigate this forced wetting transition for axisymmetric menisci on fibers of varying radii. First, we use a matched asymptotic expansion and derive the maximum speed

  18. Maximum potential preventive effect of hip protectors

    NARCIS (Netherlands)

    van Schoor, N.M.; Smit, J.H.; Bouter, L.M.; Veenings, B.; Asma, G.B.; Lips, P.T.A.M.

    2007-01-01

    OBJECTIVES: To estimate the maximum potential preventive effect of hip protectors in older persons living in the community or homes for the elderly. DESIGN: Observational cohort study. SETTING: Emergency departments in the Netherlands. PARTICIPANTS: Hip fracture patients aged 70 and older who

  19. Maximum gain of Yagi-Uda arrays

    DEFF Research Database (Denmark)

    Bojsen, J.H.; Schjær-Jacobsen, Hans; Nilsson, E.

    1971-01-01

    Numerical optimisation techniques have been used to find the maximum gain of some specific parasitic arrays. The gain of an array of infinitely thin, equispaced dipoles loaded with arbitrary reactances has been optimised. The results show that standard travelling-wave design methods are not optimum. Yagi–Uda arrays with equal and unequal spacing have also been optimised with experimental verification.

  20. Correlation between maximum dry density and cohesion

    African Journals Online (AJOL)

    HOD

    represents maximum dry density, signifies plastic limit and is liquid limit. Researchers [6, 7] estimate compaction parameters. Aside from the correlation existing between compaction parameters and other physical quantities there are some other correlations that have been investigated by other researchers. The well-known.

  1. Weak scale from the maximum entropy principle

    Science.gov (United States)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2015-03-01

    The theory of the multiverse and wormholes suggests that the parameters of the Standard Model (SM) are fixed in such a way that the radiation of the S^3 universe at the final stage, S_rad, becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the SM we can check whether S_rad actually becomes maximum at the observed values. In this paper, we regard S_rad at the final stage as a function of the weak scale (the Higgs expectation value) v_h, and show that it becomes maximum around v_h = O(300 GeV) when the dimensionless couplings in the SM, i.e., the Higgs self-coupling, the gauge couplings, and the Yukawa couplings are fixed. Roughly speaking, we find that the weak scale is given by v_h ~ T_BBN^2 / (M_pl y_e^5), where y_e is the Yukawa coupling of the electron, T_BBN is the temperature at which Big Bang nucleosynthesis starts, and M_pl is the Planck mass.

  2. The maximum-entropy method in superspace

    Czech Academy of Sciences Publication Activity Database

    van Smaalen, S.; Palatinus, Lukáš; Schneider, M.

    2003-01-01

    Vol. 59 (2003), pp. 459-469. ISSN 0108-7673. Grant support: DFG (DE). Institutional research plan: CEZ:AV0Z1010914. Keywords: maximum-entropy method; aperiodic crystals; electron density. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 1.558 (2003)

  3. Achieving maximum sustainable yield in mixed fisheries

    NARCIS (Netherlands)

    Ulrich, Clara; Vermard, Youen; Dolder, Paul J.; Brunel, Thomas; Jardim, Ernesto; Holmes, Steven J.; Kempf, Alexander; Mortensen, Lars O.; Poos, Jan Jaap; Rindorf, Anna

    2017-01-01

    Achieving single species maximum sustainable yield (MSY) in complex and dynamic fisheries targeting multiple species (mixed fisheries) is challenging because achieving the objective for one species may mean missing the objective for another. The North Sea mixed fisheries are a representative example

  4. 5 CFR 534.203 - Maximum stipends.

    Science.gov (United States)

    2010-01-01

    ... maximum stipend established under this section. (e) A trainee at a non-Federal hospital, clinic, or medical or dental laboratory who is assigned to a Federal hospital, clinic, or medical or dental... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY UNDER OTHER SYSTEMS Student...

  5. Minimal length, Friedmann equations and maximum density

    Energy Technology Data Exchange (ETDEWEB)

    Awad, Adel [Center for Theoretical Physics, British University of Egypt,Sherouk City 11837, P.O. Box 43 (Egypt); Department of Physics, Faculty of Science, Ain Shams University,Cairo, 11566 (Egypt); Ali, Ahmed Farag [Centre for Fundamental Physics, Zewail City of Science and Technology,Sheikh Zayed, 12588, Giza (Egypt); Department of Physics, Faculty of Science, Benha University,Benha, 13518 (Egypt)

    2014-06-16

    Inspired by Jacobson's thermodynamic approach, Cai et al. have shown the emergence of Friedmann equations from the first law of thermodynamics. We extend the Akbar-Cai derivation (http://dx.doi.org/10.1103/PhysRevD.75.084003) of the Friedmann equations to accommodate a general entropy-area law. Studying the resulting Friedmann equations using a specific entropy-area law, which is motivated by the generalized uncertainty principle (GUP), reveals the existence of a maximum energy density close to the Planck density. Allowing for a general continuous pressure p(ρ,a) leads to bounded curvature invariants and a general nonsingular evolution. In this case, the maximum energy density is reached in a finite time and there is no cosmological evolution beyond this point, which leaves the big bang singularity inaccessible from a spacetime perspective. The existence of a maximum energy density and a general nonsingular evolution is independent of the equation of state and the spatial curvature k. As an example we study the evolution of the equation of state p=ωρ through its phase-space diagram to show the existence of a maximum energy density which is reachable in a finite time.

  6. LCS Content Document Application

    Science.gov (United States)

    Hochstadt, Jake

    2011-01-01

    My project at KSC during my spring 2011 internship was to develop a Ruby on Rails application to manage Content Documents. A Content Document is a collection of documents and information that describes what software is installed on a Launch Control System computer. It's important for us to make sure the tools we use every day are secure, up-to-date, and properly licensed. Previously, keeping track of this information was done with Excel and Word files passed between different personnel. The goal of the new application is to be able to manage and access the Content Documents through a single database-backed web application. Our LCS team will benefit greatly from this app. Admins will be able to log in securely to keep track of and update the software installed on each computer in a timely manner. We also included exportability, such as attaching additional documents that can be downloaded from the web application. The finished application will ease the process of managing Content Documents while streamlining the procedure. Ruby on Rails is a very powerful web framework, and I am grateful to have the opportunity to build this application.

  7. Comparison of information content of temporal response of chemoresistive gas sensor under three different temperature modulation regimes for gas detection of different feature reduction methods

    Science.gov (United States)

    Hosseini-Golgoo, S. M.; Salimi, F.; Saberkari, A.; Rahbarpour, S.

    2017-12-01

    In the present work, the features extracted from the transient response of a resistive gas sensor under temperature cycling, temperature transient, and temperature combination methods were compared. The heater was driven by three waveforms: pulse (cycling), ramp (transient) and staircase (combination). The period or duration of each waveform was 40 s. Methanol, ethanol, 1-propanol, 1-butanol, toluene and acetone, each at 11 different concentration levels in the range of 100 to 2000 ppm, were used as the target gases. The sensor used was a TGS-813 manufactured by Figaro. The recorded responses were studied and heuristic features such as peak value, rise time, slope and curvature were extracted for each heater waveform. The results showed that although applying this feature extraction method to all waveforms allowed the gases to be identified, the best results were achieved with the staircase waveform. The combination waveform carried enough information to separate all of the examined target gases.

  8. Examining the information content of time-lapse crosshole GPR data collected under different infiltration conditions to estimate unsaturated soil hydraulic properties

    DEFF Research Database (Denmark)

    Scholer, M.; Irving, J.; Zibar, Majken Caroline Looms

    2013-01-01

    Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data. Here we examine the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten–Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural...

  9. A Web Portal for Medical Information through a Web Content Management System

    Directory of Open Access Journals (Sweden)

    Yaisel Lorenzo Rodríguez

    2011-06-01

    Full Text Available This paper describes the work done in response to the computerization initiatives of the Medical School in Equatorial Guinea. It presents the elements that conditioned the creation of the Medical Information Network of Equatorial Guinea (Red de Información Médica en Guinea Ecuatorial, GUIMED) through an online content management system that takes advantage of currently available information and communication technologies and collaborative environments. GUIMED, expressed in its Web portal, is supported by: a simple and user-friendly interface, an information repository based on the client-server model, search capabilities, a virtual classroom for teaching, simple and intuitive administration, and control of access to the information.

  10. Satellite Ocean Heat Content Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This collection contains an operational Satellite Ocean Heat Content Suite (SOHCS) product generated by NOAA National Environmental Satellite, Data, and Information...

  11. Seeking the epoch of maximum luminosity for dusty quasars

    International Nuclear Information System (INIS)

    Vardanyan, Valeri; Weedman, Daniel; Sargsyan, Lusine

    2014-01-01

    Infrared luminosities νLν(7.8 μm) arising from dust reradiation are determined for Sloan Digital Sky Survey (SDSS) quasars with 1.4 < z < 5. These luminosities do not reach a maximum at any redshift z < 5, reaching a plateau for z ≳ 3 with maximum luminosity νLν(7.8 μm) ≳ 10^47 erg s^-1; luminosity functions show one quasar Gpc^-3 having νLν(7.8 μm) > 10^46.6 erg s^-1 for all 2 < z < 5. The epoch of maximum luminosity has not yet been identified at any redshift below 5. The most ultraviolet-luminous quasars, defined by rest-frame νLν(0.25 μm), have the largest values of the ratio νLν(0.25 μm)/νLν(7.8 μm), with a maximum ratio at z = 2.9. From these results, we conclude that the quasars most luminous in the ultraviolet have the smallest dust content and appear luminous primarily because of lessened extinction. Observed ultraviolet/infrared luminosity ratios are used to define 'obscured' quasars as those having >5 mag of ultraviolet extinction. We present a new summary of obscured quasars discovered with the Spitzer Infrared Spectrograph and determine the infrared luminosity function of these obscured quasars at z ∼ 2.1. This is compared with infrared luminosity functions of optically discovered, unobscured quasars in the SDSS and in the AGN and Galaxy Evolution Survey. The comparison indicates comparable numbers of obscured and unobscured quasars at z ∼ 2.1 with a possible excess of obscured quasars at fainter luminosities.

  12. Maximum-confidence discrimination among symmetric qudit states

    International Nuclear Information System (INIS)

    Jimenez, O.; Solis-Prosser, M. A.; Delgado, A.; Neves, L.

    2011-01-01

    We study the maximum-confidence (MC) measurement strategy for discriminating among nonorthogonal symmetric qudit states. Restricting to linearly dependent and equally likely pure states, we find the optimal positive operator valued measure (POVM) that maximizes our confidence in identifying each state in the set and minimizes the probability of obtaining inconclusive results. The physical realization of this POVM is completely determined and it is shown that after an inconclusive outcome, the input states may be mapped into a new set of equiprobable symmetric states, restricted, however, to a subspace of the original qudit Hilbert space. By applying the MC measurement again onto this new set, we can still gain some information about the input states, although with less confidence than before. This leads us to introduce the concept of sequential maximum-confidence (SMC) measurements, where the optimized MC strategy is iterated in as many stages as allowed by the input set, until no further information can be extracted from an inconclusive result. Within each stage of this measurement our confidence in identifying the input states is the highest possible, although it decreases from one stage to the next. In addition, the more stages we accomplish within the maximum allowed, the higher will be the probability of correct identification. We will discuss an explicit example of the optimal SMC measurement applied in the discrimination among four symmetric qutrit states and propose an optical network to implement it.

  13. Radioactivity content of books

    International Nuclear Information System (INIS)

    Lalit, B.Y.; Shukla, V.K.; Ramachandran, T.V.

    1981-01-01

    The natural and fallout radioactivity was measured in a large number of books produced in various countries after 1955. Results of these measurements showed that the books contained radioactivity due to fallout 137Cs and to 226Ra, 228Th and 40K radioisotopes of primordial origin. Books printed in the U.S.A. had low radioactivity of 40K and 226Ra origin compared to books printed in the European subcontinent. Books printed during the period of high fallout rate (1962-64) or thereafter did not exhibit any significantly higher 137Cs levels. The maximum radiation dose to the eyes calculated for the radioactivity content of the books was 0.8 μR/hr and the minimum was 0.07 μR/hr; most of the books were in the range 0.3-0.5 μR/hr. (U.K.)

  14. Assimilating solar-induced chlorophyll fluorescence into the terrestrial biosphere model BETHY-SCOPE v1.0: model description and information content

    Science.gov (United States)

    Norton, Alexander J.; Rayner, Peter J.; Koffi, Ernest N.; Scholze, Marko

    2018-04-01

    The synthesis of model and observational information using data assimilation can improve our understanding of the terrestrial carbon cycle, a key component of the Earth's climate-carbon system. Here we provide a data assimilation framework for combining observations of solar-induced chlorophyll fluorescence (SIF) and a process-based model to improve estimates of terrestrial carbon uptake or gross primary production (GPP). We then quantify and assess the constraint SIF provides on the uncertainty in global GPP through model process parameters in an error propagation study. By incorporating 1 year of SIF observations from the GOSAT satellite, we find that the parametric uncertainty in global annual GPP is reduced by 73 % from ±19.0 to ±5.2 Pg C yr^-1. This improvement is achieved through strong constraint of leaf growth processes and weak to moderate constraint of physiological parameters. We also find that the inclusion of uncertainty in shortwave down-radiation forcing has a net-zero effect on uncertainty in GPP when incorporated into the SIF assimilation framework. This study demonstrates the powerful capacity of SIF to reduce uncertainties in process-based model estimates of GPP and the potential for improving our predictive capability of this uncertain carbon flux.
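
    As a rough illustration of the kind of parametric error propagation described here (not the BETHY-SCOPE code), the sketch below propagates a prior parameter covariance through a linearized observation operator and compares prior and posterior uncertainty of a scalar diagnostic standing in for global GPP. All matrix sizes and values are invented.

```python
# Toy Gaussian error propagation: how observations (e.g. SIF) shrink the
# parametric uncertainty of a scalar diagnostic (e.g. global GPP).
# All matrices below are invented placeholders, not BETHY-SCOPE quantities.
import numpy as np

rng = np.random.default_rng(0)
n_param, n_obs = 6, 40

C_prior = np.diag(rng.uniform(0.5, 2.0, n_param))      # prior parameter covariance
H = rng.normal(size=(n_obs, n_param))                   # Jacobian of obs w.r.t. parameters
R = 0.5 * np.eye(n_obs)                                 # observation error covariance
g = rng.normal(size=n_param)                            # gradient of GPP w.r.t. parameters

# Posterior covariance from the standard Gaussian update.
C_post = np.linalg.inv(np.linalg.inv(C_prior) + H.T @ np.linalg.inv(R) @ H)

sigma_prior = np.sqrt(g @ C_prior @ g)   # prior 1-sigma uncertainty of the diagnostic
sigma_post = np.sqrt(g @ C_post @ g)     # posterior 1-sigma uncertainty
print(f"uncertainty reduction: {100 * (1 - sigma_post / sigma_prior):.1f}%")
```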

  15. Maximum concentrations at work and maximum biologically tolerable concentration for working materials 1991

    International Nuclear Information System (INIS)

    1991-01-01

    The meaning of the term 'maximum concentration at work' in regard to various pollutants is discussed. Specifically, a number of dusts and smokes are dealt with. The valuation criteria for maximum biologically tolerable concentrations for working materials are indicated. The working materials in question are carcinogenic substances or substances liable to cause allergies or genome mutations. (VT) [de]

  16. Zipf's law, power laws and maximum entropy

    International Nuclear Information System (INIS)

    Visser, Matt

    2013-01-01

    Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified. (paper)
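
    For concreteness, here is the standard variational calculation behind that claim: maximize the Shannon entropy subject to normalization and a fixed mean logarithm. The notation is chosen here for illustration and is not copied from the paper.

```latex
% Maximize S[p] = -\int p(x)\ln p(x)\,dx subject to
% \int p(x)\,dx = 1 and \int p(x)\ln x\,dx = \chi.
\mathcal{L}[p] = -\int p \ln p \, dx
  - \alpha\!\left(\int p \, dx - 1\right)
  - \lambda\!\left(\int p \ln x \, dx - \chi\right)
% Setting the functional derivative to zero:
\frac{\delta \mathcal{L}}{\delta p} = -\ln p - 1 - \alpha - \lambda \ln x = 0
\quad\Longrightarrow\quad
p(x) \propto e^{-\lambda \ln x} = x^{-\lambda}
% i.e. a pure power law, with \lambda fixed by the constraint <ln x> = \chi.
```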

  17. Maximum-entropy description of animal movement.

    Science.gov (United States)

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

    We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall from this class of maximum-entropy distributions when the constraints are purely kinematic.

  18. Pareto versus lognormal: a maximum entropy test.

    Science.gov (United States)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.

  19. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works well.

  20. A Maximum Radius for Habitable Planets.

    Science.gov (United States)

    Alibert, Yann

    2015-09-01

    We compute the maximum radius a planet can have in order to fulfill two constraints that are likely necessary conditions for habitability: (1) a surface temperature and pressure compatible with the existence of liquid water, and (2) no ice layer at the bottom of a putative global ocean, which would prevent the geologic carbon cycle from operating. We demonstrate that, above a given radius, these two constraints cannot be met: in the Super-Earth mass range (1-12 Mearth), the maximum radius a planet can have varies between 1.8 and 2.3 Rearth. This radius is reduced when considering planets with higher Fe/Si ratios and when taking into account irradiation effects on the structure of the gas envelope.

  1. Maximum parsimony on subsets of taxa.

    Science.gov (United States)

    Fischer, Mareike; Thatte, Bhalchandra D

    2009-09-21

    In this paper we investigate mathematical questions concerning the reliability (reconstruction accuracy) of Fitch's maximum parsimony algorithm for reconstructing the ancestral state given a phylogenetic tree and a character. In particular, we consider the question whether the maximum parsimony method applied to a subset of taxa can reconstruct the ancestral state of the root more accurately than when applied to all taxa, and we give an example showing that this indeed is possible. A surprising feature of our example is that ignoring a taxon closer to the root improves the reliability of the method. On the other hand, in the case of the two-state symmetric substitution model, we answer affirmatively a conjecture of Li, Steel and Zhang which states that under a molecular clock the probability that the state at a single taxon is a correct guess of the ancestral state is a lower bound on the reconstruction accuracy of Fitch's method applied to all taxa.
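
    Fitch's algorithm itself is simple enough to sketch. The toy implementation below (binary tree as nested tuples, leaf states as single characters, example data invented) performs the bottom-up pass that counts state changes and returns the candidate ancestral states at the root.

```python
# Toy bottom-up pass of Fitch's maximum parsimony algorithm on a rooted
# binary tree. Trees are nested tuples; leaves are single-character states.
# The example tree is invented for illustration.
def fitch(node):
    """Return (candidate state set, number of changes) for the subtree."""
    if isinstance(node, str):            # leaf: observed character state
        return {node}, 0
    (left_set, left_cost), (right_set, right_cost) = fitch(node[0]), fitch(node[1])
    common = left_set & right_set
    if common:                           # children agree on at least one state
        return common, left_cost + right_cost
    return left_set | right_set, left_cost + right_cost + 1

tree = (("A", "A"), ("C", ("A", "C")))   # a five-taxon example
root_states, changes = fitch(tree)
print("candidate ancestral states at root:", root_states)
print("minimum number of changes:", changes)
```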

  2. Maximum entropy analysis of liquid diffraction data

    International Nuclear Information System (INIS)

    Root, J.H.; Egelstaff, P.A.; Nickel, B.G.

    1986-01-01

    A maximum entropy method for reducing truncation effects in the inverse Fourier transform of structure factor, S(q), to pair correlation function, g(r), is described. The advantages and limitations of the method are explored with the PY hard sphere structure factor as model input data. An example using real data on liquid chlorine, is then presented. It is seen that spurious structure is greatly reduced in comparison to traditional Fourier transform methods. (author)

  3. Maximum power point tracker for photovoltaic power plants

    Science.gov (United States)

    Arcidiacono, V.; Corsi, S.; Lambri, L.

    The paper describes two different closed-loop control criteria for the maximum power point tracking of the voltage-current characteristic of a photovoltaic generator. The two criteria are discussed and compared, inter alia, with regard to the setting-up problems that they pose. Although a detailed analysis is not embarked upon, the paper also provides some quantitative information on the energy advantages obtained by using electronic maximum power point tracking systems, as compared with the situation in which the point of operation of the photovoltaic generator is not controlled at all. Lastly, the paper presents two high-efficiency MPPT converters for experimental photovoltaic plants of the stand-alone and the grid-interconnected type.
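
    The two control criteria themselves are not spelled out in this summary; as a generic illustration of closed-loop maximum power point tracking, the sketch below implements a simple perturb-and-observe hill climb on a hypothetical photovoltaic current-voltage curve. The panel model and step size are assumptions, not the converters described in the paper.

```python
# Generic perturb-and-observe MPPT hill climb on a toy PV model.
# The I-V model and step size are illustrative assumptions.
import numpy as np

def pv_current(v, i_sc=8.0, v_oc=40.0, a=3.0):
    """Toy single-diode-like I-V curve of a photovoltaic generator."""
    return i_sc * (1.0 - np.exp((v - v_oc) / a))

def perturb_and_observe(v0=20.0, dv=0.2, n_steps=200):
    v, p_prev, direction = v0, 0.0, +1
    for _ in range(n_steps):
        p = v * pv_current(v)
        if p < p_prev:            # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
        v += direction * dv       # keep climbing toward the maximum power point
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"operating point after tracking: {v_mpp:.1f} V, {p_mpp:.1f} W")
```

    The design trade-off such a controller faces is the one the paper's energy comparison hints at: a larger perturbation step tracks changing irradiance faster but oscillates more around the maximum power point in steady state.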

  4. Algorithms of maximum likelihood data clustering with applications

    Science.gov (United States)

    Giada, Lorenzo; Marsili, Matteo

    2002-12-01

    We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share a common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson's coefficient of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures whereas the outcome of standard algorithms has a much wider variability.

  5. A Maximum Resonant Set of Polyomino Graphs

    Directory of Open Access Journals (Sweden)

    Zhang Heping

    2016-05-01

    Full Text Available A polyomino graph P is a connected finite subgraph of the infinite plane grid such that each finite face is surrounded by a regular square of side length one and each edge belongs to at least one square. A dimer covering of P corresponds to a perfect matching. Different dimer coverings can interact via an alternating cycle (or square with respect to them. A set of disjoint squares of P is a resonant set if P has a perfect matching M so that each one of those squares is M-alternating. In this paper, we show that if K is a maximum resonant set of P, then P − K has a unique perfect matching. We further prove that the maximum forcing number of a polyomino graph is equal to the cardinality of a maximum resonant set. This confirms a conjecture of Xu et al. [26]. We also show that if K is a maximal alternating set of P, then P − K has a unique perfect matching.

  6. Automatic maximum entropy spectral reconstruction in NMR

    International Nuclear Information System (INIS)

    Mobli, Mehdi; Maciejewski, Mark W.; Gryk, Michael R.; Hoch, Jeffrey C.

    2007-01-01

    Developments in superconducting magnets, cryogenic probes, isotope labeling strategies, and sophisticated pulse sequences together have enabled the application, in principle, of high-resolution NMR spectroscopy to biomolecular systems approaching 1 megadalton. In practice, however, conventional approaches to NMR that utilize the fast Fourier transform, which require data collected at uniform time intervals, result in prohibitively lengthy data collection times in order to achieve the full resolution afforded by high field magnets. A variety of approaches that involve nonuniform sampling have been proposed, each utilizing a non-Fourier method of spectrum analysis. A very general non-Fourier method that is capable of utilizing data collected using any of the proposed nonuniform sampling strategies is maximum entropy reconstruction. A limiting factor in the adoption of maximum entropy reconstruction in NMR has been the need to specify non-intuitive parameters. Here we describe a fully automated system for maximum entropy reconstruction that requires no user-specified parameters. A web-accessible script generator provides the user interface to the system

  7. Maximum neutron flux at thermal nuclear reactors

    International Nuclear Information System (INIS)

    Strugar, P.

    1968-10-01

    Since actual research reactors are technically complicated and expensive facilities, it is important to achieve savings through appropriate reactor lattice configurations. There are a number of papers, and practical examples of reactors with a central reflector, dealing with spatial distributions of fuel elements which would result in higher neutron flux. A common disadvantage of all these solutions is that the choice of the best solution starts from anticipated spatial distributions of fuel elements. The weakness of these approaches is the lack of defined optimization criteria. The direct approach is defined as follows: determine the spatial distribution of fuel concentration starting from the condition of maximum neutron flux while fulfilling the thermal constraints. Thus determining the maximum neutron flux amounts to solving a variational problem which is beyond the reach of classical variational calculus. This variational problem has been successfully solved by applying the maximum principle of Pontrjagin. The optimum distribution of fuel concentration was obtained in explicit analytical form. Thus, the spatial distribution of the neutron flux and the critical dimensions of a quite complex reactor system are calculated in a relatively simple way. In addition to the fact that the results are innovative, this approach is interesting because of the optimization procedure itself [sr]

  8. An intelligent content discovery technique for health portal content management.

    Science.gov (United States)

    De Silva, Daswin; Burstein, Frada

    2014-04-23

    Continuous content management of health information portals is a feature vital for its sustainability and widespread acceptance. Knowledge and experience of a domain expert is essential for content management in the health domain. The rate of generation of online health resources is exponential, and thereby manual examination for relevance to a specific topic and audience is a formidable challenge for domain experts. Intelligent content discovery for effective content management is a less researched topic. An existing expert-endorsed content repository can provide the necessary leverage to automatically identify relevant resources and evaluate qualitative metrics. This paper reports on the design research towards an intelligent technique for automated content discovery and ranking for health information portals. The proposed technique aims to improve the efficiency of the current, mostly manual process of portal content management by utilising an existing expert-endorsed content repository as a supporting base and a benchmark to evaluate the suitability of new content. A model for content management was established based on a field study of potential users. The proposed technique is integral to this content management model and executes in several phases (i.e., query construction, content search, text analytics and fuzzy multi-criteria ranking). The construction of multi-dimensional search queries with input from Wordnet, the use of multi-word and single-word terms as representative semantics for text analytics and the use of fuzzy multi-criteria ranking for subjective evaluation of quality metrics are original contributions reported in this paper. The feasibility of the proposed technique was examined with experiments conducted on an actual health information portal, the BCKOnline portal. Both intermediary and final results generated by the technique are presented in the paper and these help to establish benefits of the technique and its contribution towards effective
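
    The ranking stage is only named, not specified, in this summary; the sketch below shows one generic way fuzzy multi-criteria ranking is often set up, mapping each quality criterion to a membership value in [0, 1] and aggregating with weights. The criteria names, membership functions and weights are invented for illustration and are not taken from the paper.

```python
# Generic fuzzy multi-criteria ranking sketch: map each quality criterion to a
# fuzzy membership in [0, 1], then aggregate with weights. Criteria, membership
# functions and weights are invented for illustration.
def membership(value, low, high):
    """Simple ramp membership: 0 below `low`, 1 above `high`, linear between."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

weights = {"relevance": 0.5, "credibility": 0.3, "readability": 0.2}

def rank_score(resource):
    scores = {
        "relevance": membership(resource["similarity"], 0.2, 0.8),
        "credibility": membership(resource["source_score"], 1, 5),
        "readability": membership(resource["reading_ease"], 30, 70),
    }
    return sum(weights[c] * scores[c] for c in weights)

candidates = [
    {"name": "resource A", "similarity": 0.9, "source_score": 4, "reading_ease": 65},
    {"name": "resource B", "similarity": 0.5, "source_score": 2, "reading_ease": 80},
]
for r in sorted(candidates, key=rank_score, reverse=True):
    print(r["name"], round(rank_score(r), 3))
```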

  9. Alfalfa variety selection for maximum fiber content, protein and nitrogen fixation

    Energy Technology Data Exchange (ETDEWEB)

    Ruhland, Christopher T. [Minnesota State Univ., Mankato, MN (United States); Knox, John [Minnesota State Univ., Mankato, MN (United States); Ward, Susan [Minnesota State Univ., Mankato, MN (United States); Agarwal, V. J. [Minnesota State Univ., Mankato, MN (United States); Frey, John [Minnesota State Univ., Mankato, MN (United States); Carrow, Duane [Minnesota State Univ., Mankato, MN (United States); Jones, Bruce [Minnesota State Univ., Mankato, MN (United States); Rife, James [Minnesota State Univ., Mankato, MN (United States)

    2012-09-01

    The fertile soils of Southern Minnesota are highly productive from both a natural and agricultural standpoint. We chose to examine potential feedstock in both of these settings as a focus for our study. In part one of this report we detail our findings examining potential feedstock in prairie and wetlands in the Mankato, MN area in 2009. We were able to familiarize ourselves with techniques used to isolate and quantify concentrations of cellulose, hemicellulose and lignin in both a classroom and laboratory setting. These techniques were then used for our experiments that examined the effects of harvest regime, irrigation and salinity on eight potential-feedstock varieties of alfalfa (Medicago sativa).

  10. 20 CFR 10.806 - How are the maximum fees defined?

    Science.gov (United States)

    2010-04-01

    ... AMENDED Information for Medical Providers Medical Fee Schedule § 10.806 How are the maximum fees defined? For professional medical services, the Director shall maintain a schedule of maximum allowable fees.../Current Procedural Terminology (HCPCS/CPT) code which represents the relative skill, effort, risk and time...

  11. Information Manager

    African Journals Online (AJOL)

    USER

    Communication technologies (ICTs) to Library Operations and Routines in Selected Nigerian Federal University ... should use Open-source library information management software and DSpace content management ..... Marketing of library.

  12. Stimulus-dependent maximum entropy models of neural population codes.

    Directory of Open Access Journals (Sweden)

    Einat Granot-Atedgi

    Full Text Available Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model, a minimal extension of the canonical linear-nonlinear model of a single neuron to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population.
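
    For reference, a pairwise stimulus-dependent maximum entropy model of this kind is usually written in the following form, with the stimulus entering through single-cell fields while pairwise couplings stay stimulus-independent. The notation below is generic and not copied from the paper.

```latex
% Stimulus-dependent pairwise maximum entropy model for binary spike words
% \sigma = (\sigma_1,\dots,\sigma_N), \sigma_i \in \{0,1\}, at stimulus time t:
P(\sigma \mid t) \;=\; \frac{1}{Z(t)}
  \exp\!\Big( \sum_i h_i(t)\,\sigma_i \;+\; \sum_{i<j} J_{ij}\,\sigma_i\sigma_j \Big)
% h_i(t) carries the single-cell (linear-nonlinear) stimulus dependence,
% J_{ij} are stimulus-independent pairwise couplings, and Z(t) normalizes.
```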

  13. Content Documents Management

    Science.gov (United States)

    Muniz, R.; Hochstadt, J.; Boelke J.; Dalton, A.

    2011-01-01

    The Content Documents are created and managed under the System Software group within the Launch Control System (LCS) project. The System Software product group is led by the NASA Engineering Control and Data Systems branch (NEC3) at Kennedy Space Center. The team is working on creating Operating System Images (OSI) for different platforms (i.e. AIX, Linux, Solaris and Windows). Before the OSI can be created, the team must create a Content Document which provides the information for a workstation or server, with the list of all the software that is to be installed on it and also the set to which the hardware belongs. This can be, for example, the LDS, the ADS or the FR-l. The objective of this project is to create a user-interface Web application that can manage the information in the Content Documents, with all the correct validations and filters for administrator purposes. For this project we used one of the best tools for agile Web development, the Ruby on Rails framework with the Ruby programming language, which helps pragmatic programmers develop Web applications quickly. It is amazing to see how a student can learn about OOP features with the Ruby language, manage the user interface with HTML and CSS, create associations and queries with gems, manage databases and run a MySQL server, run shell commands from the command prompt and build Web applications with Rails. All of this in a real-world project and in just fifteen weeks!

  14. Information management - Assessing the demand for information

    Science.gov (United States)

    Rogers, William H.

    1991-01-01

    Information demand is defined in terms of both information content (what information is needed) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. The essence of this intelligence is comprehending and capturing the demand for information.

  15. Maximum power operation of interacting molecular motors

    DEFF Research Database (Denmark)

    Golubeva, Natalia; Imparato, Alberto

    2013-01-01

    We study the mechanical and thermodynamic properties of different traffic models for kinesin which are relevant in biological and experimental contexts. We find that motor-motor interactions play a fundamental role by enhancing the thermodynamic efficiency at maximum power of the motors, as compared to the non-interacting system, in a wide range of biologically compatible scenarios. We furthermore consider the case where the motor-motor interaction directly affects the internal chemical cycle and investigate the effect on the system dynamics and thermodynamics.
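
    As background (not specific to this paper), "efficiency at maximum power" is usually obtained by first maximizing the motor's output power over the external load and then evaluating the thermodynamic efficiency at that operating point; schematically, with symbols assumed here for illustration:

```latex
P(f) = f\,v(f), \qquad f^{*} = \arg\max_{f} P(f), \qquad
\eta^{*} = \frac{P(f^{*})}{\dot{Q}_{\mathrm{in}}(f^{*})}
```

    where $f$ is the external load force, $v(f)$ the mean motor velocity against that load, and $\dot{Q}_{\mathrm{in}}$ the rate at which chemical free energy is consumed.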

  16. Maximum entropy method in momentum density reconstruction

    International Nuclear Information System (INIS)

    Dobrzynski, L.; Holas, A.

    1997-01-01

    The Maximum Entropy Method (MEM) is applied to the reconstruction of the 3-dimensional electron momentum density distributions observed through the set of Compton profiles measured along various crystallographic directions. It is shown that the reconstruction of the electron momentum density may be reliably carried out with the aid of a simple iterative algorithm suggested originally by Collins. A number of distributions have been simulated in order to check the performance of MEM. It is shown that MEM can be recommended as a model-free approach. (author). 13 refs, 1 fig
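
    As a reminder of the general technique (a schematic formulation with assumed notation, not the paper's specific algorithm), maximum entropy reconstruction seeks the momentum density that maximizes an entropy functional subject to reproducing the measured Compton profiles, which within the impulse approximation are plane projections of the density:

```latex
\max_{\rho(\mathbf{p}) \,\ge\, 0}\; S[\rho] = -\int \rho(\mathbf{p})
\ln\!\frac{\rho(\mathbf{p})}{m(\mathbf{p})}\, d^{3}p
\quad \text{subject to} \quad
J_{k}(q) = \int \rho(\mathbf{p})\,
\delta(\mathbf{p}\cdot\hat{\mathbf{n}}_{k} - q)\, d^{3}p
```

    where $\rho$ is the 3-dimensional electron momentum density, $m$ a prior (default) model, and $J_k$ the Compton profile measured along the crystallographic direction $\hat{\mathbf{n}}_k$. In practice the constraints are typically enforced only to within the experimental uncertainties.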

  17. On the maximum drawdown during speculative bubbles

    Science.gov (United States)

    Rotundo, Giulia; Navarra, Mauro

    2007-08-01

    A taxonomy of large financial crashes proposed in the literature locates the burst of speculative bubbles due to endogenous causes in the framework of extreme stock market crashes, defined as falls of market prices that are outliers with respect to the bulk of the drawdown price movement distribution. This paper goes deeper into the analysis, providing a further characterization of the rising part of such selected bubbles through the examination of the drawdown and maximum drawdown movements of index prices. The analysis of drawdown duration is also performed and is the core of the risk measure estimated here.
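
    For readers unfamiliar with the quantity, the sketch below shows one common way to compute the maximum drawdown of a price series (plain Python, illustrative only; note that parts of the crash literature instead define a drawdown as a run of consecutive price decreases, so the paper's exact definition may differ):

```python
from typing import List, Tuple


def max_drawdown(prices: List[float]) -> Tuple[float, float]:
    """Return the maximum drawdown of a price series, both as an absolute
    drop from a running peak and as a fraction of that peak.

    This uses the running-maximum convention; it is not necessarily the
    definition used in the paper.
    """
    if not prices:
        raise ValueError("empty price series")

    peak = prices[0]
    worst_abs = 0.0   # largest absolute drop below a running peak
    worst_rel = 0.0   # same drop expressed as a fraction of that peak
    for p in prices:
        if p > peak:
            peak = p
        drop = peak - p
        if drop > worst_abs:
            worst_abs = drop
        if peak > 0 and drop / peak > worst_rel:
            worst_rel = drop / peak

    return worst_abs, worst_rel


if __name__ == "__main__":
    series = [100.0, 104.0, 101.0, 98.0, 103.0, 96.0, 107.0]
    abs_dd, rel_dd = max_drawdown(series)
    print(f"max drawdown: {abs_dd:.2f} ({rel_dd:.1%} of peak)")
```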

  18. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics. This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence ...
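
    To make the idea concrete, here is a simplified grid-search sketch of this kind of estimator: for each candidate fundamental frequency, each channel is fitted by least squares with a set of harmonically related sinusoids, and the per-channel fit energies are summed before picking the best candidate. This is an illustration under simplifying assumptions (white noise, equal noise variances across channels, a plain grid search), not the estimator derived in the paper; the function names are made up here.

```python
import numpy as np


def projection_energy(x: np.ndarray, fs: float, f0: float, n_harmonics: int) -> float:
    """Energy of the least-squares projection of frame x onto harmonically
    related sinusoids with fundamental f0 (per-channel goodness of fit
    under a white Gaussian noise assumption)."""
    n = x.size
    t = np.arange(n) / fs
    cols = []
    for k in range(1, n_harmonics + 1):
        if k * f0 >= fs / 2:
            break
        cols.append(np.cos(2 * np.pi * k * f0 * t))
        cols.append(np.sin(2 * np.pi * k * f0 * t))
    Z = np.column_stack(cols)
    # Least-squares amplitudes for this candidate f0, then the energy of
    # the reconstructed harmonic signal.
    coef, *_ = np.linalg.lstsq(Z, x, rcond=None)
    return float(np.sum((Z @ coef) ** 2))


def multichannel_pitch(channels, fs, f0_grid, n_harmonics=3):
    """Grid-search pitch estimate: channels share f0 but may differ in
    amplitudes, phases and noise, so per-channel scores are summed
    (equal noise variances are implicitly assumed here)."""
    scores = [
        sum(projection_energy(x, fs, f0, n_harmonics) for x in channels)
        for f0 in f0_grid
    ]
    return f0_grid[int(np.argmax(scores))]


if __name__ == "__main__":
    fs = 8000.0
    t = np.arange(0, 0.05, 1.0 / fs)
    rng = np.random.default_rng(0)
    true_f0 = 220.0
    # Two channels sharing the fundamental but differing in amplitude,
    # phase, harmonic content and noise level.
    ch1 = (np.sin(2 * np.pi * true_f0 * t)
           + 0.4 * np.sin(4 * np.pi * true_f0 * t)
           + 0.05 * rng.standard_normal(t.size))
    ch2 = 0.6 * np.sin(2 * np.pi * true_f0 * t + 1.0) + 0.2 * rng.standard_normal(t.size)
    grid = np.arange(100.0, 400.0, 0.5)
    print("estimated f0 (Hz):", multichannel_pitch([ch1, ch2], fs, grid))
```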

  19. Conductivity maximum in a charged colloidal suspension

    Energy Technology Data Exchange (ETDEWEB)

    Bastea, S

    2009-01-27

    Molecular dynamics simulations of a charged colloidal suspension in the salt-free regime show that the system exhibits an electrical conductivity maximum as a function of colloid charge. We attribute this behavior to two main competing effects: colloid effective charge saturation due to counterion 'condensation' and diffusion slowdown due to the relaxation effect. In agreement with previous observations, we also find that the effective transported charge is larger than the one determined by the Stern layer and suggest that it corresponds to the boundary fluid layer at the surface of the colloidal particles.

  20. Dynamical maximum entropy approach to flocking.

    Science.gov (United States)

    Cavagna, Andrea; Giardina, Irene; Ginelli, Francesco; Mora, Thierry; Piovani, Duccio; Tavarone, Raffaele; Walczak, Aleksandra M

    2014-04-01

    We derive a new method to infer from data the out-of-equilibrium alignment dynamics of collectively moving animal groups, by considering the maximum entropy model distribution consistent with temporal and spatial correlations of flight direction. When bird neighborhoods evolve rapidly, this dynamical inference correctly learns the parameters of the model, while a static one relying only on the spatial correlations fails. When neighbors change slowly and detailed balance is satisfied, we recover the static procedure. We demonstrate the validity of the method on simulated data. The approach is applicable to other systems of active matter.
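
    For context, the static maximum entropy model consistent with the measured spatial correlations of flight direction in a flock takes the schematic form below (notation assumed here for illustration; the dynamical approach described above builds on this family of models):

```latex
P(\{\vec{s}_i\}) \;=\; \frac{1}{Z}
\exp\!\Big( \frac{J}{2} \sum_{i} \sum_{j \in \partial i} \vec{s}_i \cdot \vec{s}_j \Big)
```

    where $\vec{s}_i$ is the unit flight-direction vector of bird $i$, $\partial i$ its set of interacting neighbors, and $J$ an alignment strength fitted so that the model reproduces the observed correlations. The dynamical variant additionally constrains temporal correlations, which is what allows the parameters to be recovered when neighborhoods reshuffle quickly.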